Why does this operation with dates (number of days between 2 dates) return this value?
Based on this question, I wrote "my code" (without Math.abs, which I don't need):
var oneDay = 24 * 60 * 60 * 1000; // hours*minutes*seconds*milliseconds
var firstDate = new Date("2011", "09", "28"); // 28 september 2011
var secondDate = new Date("2011", "09", "30"); // 30 september 2011
var notti = ((secondDate.getTime() - firstDate.getTime()) / (oneDay));
if (notti < 1)
notti = 1;
else
notti = Math.round(notti);
alert(notti);
and it prints 2 (correct).
Now, if I do this:
var oneDay = 24 * 60 * 60 * 1000; // hours*minutes*seconds*milliseconds
var firstDate = new Date("2011", "09", "28"); // 28 september 2011
var secondDate = new Date("2011", "10", "01"); // 01 october 2011
var notti = ((secondDate.getTime() - firstDate.getTime()) / (oneDay));
if (notti < 1)
notti = 1;
else
notti = Math.round(notti);
alert(notti);
it prints 4. Why 4? It should be 3... Do you know anything about this problem?
The month argument in the Date constructor (and other Date methods) runs from [0..11], not [1..12], so:
new Date("2011", "09", "28"); // 28 september 2011
is actually Fri Oct 28, not a date in September.
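If the goal really is 28 September to 1 October, the month index must be one lower (8 for September, 9 for October). A minimal sketch of the corrected second example, reusing the question's variable names:
var oneDay = 24 * 60 * 60 * 1000; // hours*minutes*seconds*milliseconds
var firstDate = new Date(2011, 8, 28);  // month 8 = September: 28 September 2011
var secondDate = new Date(2011, 9, 1);  // month 9 = October: 1 October 2011
// Math.round absorbs a possible one-hour DST shift between the two dates
var notti = Math.round((secondDate.getTime() - firstDate.getTime()) / oneDay);
alert(notti); // 3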
JavaScript months are zero-based, so month 9 is October, which has 31 days: your second snippet really runs from 28 October to 1 November, a span of 4 days.
new Date("2011", "9", "31"); // October 31st
Because...
new Date("2011", "09", "28").toString()
... returns:
Fri Oct 28 2011 00:00:00 GMT-0400 (EDT)
This is because the JavaScript Date object
is based on the Java Date object, which is a mess. See also "Puzzle 61: The Dating Game" in the book Java Puzzlers for an explanation.