
Unix time oddity, or something else?

I'm writing an app in C# that calls methods on a third-party web service (though this question is language-independent). One of the methods I have to call sets a "start time." The method expects the date and time to be passed as a long in Unix time format (seconds since midnight UTC on 1/1/1970).

The code examples I received from their dev team use the Java getTime() function, which as far as I can tell does indeed return a long representing the Unix time. So, for example, if I want to set the start time to 2/28/11 at 5pm, I would pass 1298912400. However, this doesn't work: their service doesn't return an error, but if I check the web GUI to confirm, the start time is blank.

Now, if I use their web GUI to manually set the start time to 2/28/11 at 5pm, their log shows it as 1298930400000. First, this number is three digits too long, and second, even if I drop the extra zeros, the number equates to 2/28/11, but at 11pm.
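
Out of curiosity, I converted that logged value back to a date. A quick sketch, assuming the value is milliseconds since the Unix epoch:

// Decode the logged value as milliseconds since 1970-01-01 00:00:00 UTC
long logged = 1298930400000;
DateTime epoch = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);
Console.WriteLine(epoch.AddMilliseconds(logged)); // 2/28/2011 10:00:00 PM, i.e. 10pm UTC

That's 10pm UTC, five hours later than the 1298912400 (5pm UTC) I was passing.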

So my question is, are they doing something internally that I don't know about, or am I missing something that someone here can point out to me?

Updated to add code:

// Milliseconds from the Unix epoch to 2/28/2011 5:00 PM UTC
TimeSpan ts = (new DateTime(2011, 2, 28, 17, 0, 0, DateTimeKind.Utc) - new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc));
long goodtime = (long)ts.TotalMilliseconds;

That returns 1298912400000 for me, but they say it should be 1298930400000, so what am I doing wrong?
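
For what it's worth, comparing the two values directly shows the gap is a round number of hours (assuming both are millisecond timestamps):

// Difference between the value their log shows and the value my code produces
long theirs = 1298930400000;
long mine = 1298912400000;
Console.WriteLine(TimeSpan.FromMilliseconds(theirs - mine)); // 05:00:00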


They are using milliseconds, and you probably aren't compensating for the time zone. That 5-hour gap matches UTC-5 (US Eastern in late February, before DST starts), so the service is most likely interpreting the time you enter in its own local zone rather than in UTC.
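
Here is a minimal sketch of the fix under that assumption: treat the start time as wall-clock time in the service's zone (US Eastern is a guess; confirm the actual zone with the vendor), convert it to UTC, and send milliseconds rather than seconds:

// Interpret 2/28/2011 5:00 PM as wall-clock time in the (assumed) US Eastern zone,
// convert to UTC, then compute milliseconds since the Unix epoch.
DateTime wallClock = new DateTime(2011, 2, 28, 17, 0, 0, DateTimeKind.Unspecified);
TimeZoneInfo eastern = TimeZoneInfo.FindSystemTimeZoneById("Eastern Standard Time");
DateTime utc = TimeZoneInfo.ConvertTimeToUtc(wallClock, eastern);
long unixMillis = (long)(utc - new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc)).TotalMilliseconds;
Console.WriteLine(unixMillis); // 1298930400000, matching what their log shows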
