
How many bits are reserved for WebKit (Google Chrome) timestamps?

I know that Google Chrome uses an integer timestamp, properly called the WebKit timestamp, that is calculated as the number of microseconds since 01/01/1601 00:00:00 UTC. What I'm not sure of is whether this is a 64-bit signed integer (which would make the most sense) or a 56-bit integer.

Here's an example timestamp: 12883423549317375. This decodes as Sun, 05 April 2009 16:45:49 UTC. Any good reference out there for how this works? I searched the WebKit website and found no documentation of this timestamp.
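For reference, the arithmetic described above can be sketched in a few lines of Python (the helper name decode_webkit_timestamp is just for illustration, not anything Chrome provides):

from datetime import datetime, timedelta

# WebKit/Chrome timestamps count microseconds from 1601-01-01 00:00:00 UTC.
WEBKIT_EPOCH = datetime(1601, 1, 1)

def decode_webkit_timestamp(microseconds):
    # Add the microsecond count to the 1601 epoch to get a (naive) UTC datetime.
    return WEBKIT_EPOCH + timedelta(microseconds=microseconds)

print(decode_webkit_timestamp(12883423549317375))  # 2009-04-05 16:45:49.317375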


Time in Chromium is generally represented internally as an int64. Take a look at base::Time and the various platform-specific implementations for details about how the conversions take place.
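As a rough sanity check (my own back-of-the-envelope arithmetic, not something taken from the Chromium sources), a signed 64-bit count of microseconds covers roughly 292,000 years either side of the epoch, so the example value from the question fits comfortably:

MAX_INT64 = 2**63 - 1  # largest value a signed 64-bit integer can hold

# Convert the maximum microsecond count into approximate years.
years = MAX_INT64 / 1_000_000 / 86_400 / 365.25
print(round(years))                      # ~292271 years of positive range
print(12883423549317375 <= MAX_INT64)    # True: the example timestamp fits easily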


In addition, as these timestamps are often found in SQLite databases (in Chrome data), I often have to find a way to decode them on the fly. One of my most-visited bookmarks is at http://linuxsleuthing.blogspot.co.uk/2011/06/decoding-google-chrome-timestamps-in.html which tells you how to do this as part of an SQL query.

SELECT datetime((time/1000000)-11644473600, 'unixepoch', 'localtime') AS time FROM table;

Where time is the name of the column the WebKit timestamp is stored in, and table is the name of the table containing it.
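For example, plugging the timestamp from the question straight into that expression, SELECT datetime((12883423549317375/1000000)-11644473600, 'unixepoch') returns 2009-04-05 16:45:49; the 'localtime' modifier in the query above additionally shifts the result into the local time zone.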
