javascript bigmath issue

I have the following piece of code to calculate how far in the past a given day lies:

    var totalPixels = this._endX - this._startX;
    var segmentPixels = this._nowX - this._startX;
    var startTime = this._nowDate.getTime() - Timeline._minToMs((segmentPixels/totalPixels*this._rangeInMinutes));
    //console.log(this._nowDate.getTime() + '\n' + Timeline._minToMs(segmentPixels/totalPixels*this._rangeInMinutes));
    this._startDate.setTime(startTime);
    console.log(this._rangeInMinutes + ', ' + this._startDate.toString());

Typical values are

endX: 1037

startX: 40

nowX: 134.625

rangeInMinutes ranges from 1 to the number of minutes in a year.

minToMs simply multiplies the input by 60*1000.

My output varies from my expected output by roughly a factor of ten. Am I experiencing a rounding error, or some sort of truncation in the process of converting minutes to milliseconds?
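
To make the numbers easier to check, here is the same arithmetic stripped out of the class, with the typical values above plugged in (standalone names used just for this check):

    // Stand-alone re-check of the calculation with the typical values above
    var startX = 40, endX = 1037, nowX = 134.625;
    var rangeInMinutes = 525600; // minutes in a 365-day year
    var minToMs = function (min) { return min * 60 * 1000; };

    var totalPixels = endX - startX;    // 997
    var segmentPixels = nowX - startX;  // 94.625
    var offsetMs = minToMs(segmentPixels / totalPixels * rangeInMinutes);

    console.log(offsetMs / (1000 * 60 * 60 * 24)); // offset in days, roughly 34.6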

P.S.: A fiddle of what I'm trying to do is http://jsfiddle.net/NRx9m/21/


I assume your output is startTime and I assume that _nowDate is as I'd expect, the date now.

So Timeline._minToMs(segmentPixels/totalPixels*this._rangeInMinutes) should be roughly the year's worth of time you are expecting to subtract from now?

This figure is going to change depending on rangeInMinutes. From the figures you've given, the value you are subtracting works out to roughly 0.095 * rangeInMinutes minutes, which is then converted to ms.

If you are getting a date about a month before now, I'd assume that your value of rangeInMinutes is somewhere around 450,000 to 500,000.

With a value of 500,000, the result you get out will be about 33 days before now.
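
A quick sanity check of those figures (plain variables, using the pixel values quoted in the question):

    // Rough check of the estimate above, using the question's typical pixel values
    var ratio = (134.625 - 40) / (1037 - 40);       // ≈ 0.095
    var rangeInMinutes = 500000;
    var minutesSubtracted = ratio * rangeInMinutes; // ≈ 47,455 minutes
    console.log(minutesSubtracted / (60 * 24));     // ≈ 33 days before now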

Could you describe the maths you used, and why you think your output should be a year before now (apparently regardless of the value of rangeInMinutes)?

Alternatively, hearing what your rangeInMinutes actually is might help too...

Edit:

OK. Looking at the code in the fiddle, playing around a bit, and seeing what it's up to, I think it does what I'd expect...

Looking at the core bits of this as follows:

I'm assuming that startX, endX and nowX represent a pixel position on a timeline chart.

    // Number of pixels representing the range from startdate to enddate
    var totalPixels = endX - startX;
    // Number of pixels representing the range from startdate to nowdate
    var segmentPixels = nowX - startX;

The following I have rearranged slightly so I can label it more intuitively (I hope)

    var startTime = nowTime
        - Math.floor(
            (rangeInMilliseconds / totalPixels) // time per pixel
            * segmentPixels // pixels between now and start
            );

You could also work out endTime if you wanted to:

    var endTime = nowTime
        + Math.floor(
            (rangeInMilliseconds / totalPixels) // time per pixel
            * (endX - nowX) // pixels between now and end
            );
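
Putting those pieces together, here is a minimal self-contained sketch of the whole calculation (variable names assumed, using the typical values from the question):

    // Minimal end-to-end sketch: map pixel positions to start/end times around "now"
    var startX = 40, endX = 1037, nowX = 134.625;
    var rangeInMinutes = 525600;                          // e.g. one 365-day year
    var rangeInMilliseconds = rangeInMinutes * 60 * 1000;

    var totalPixels = endX - startX;
    var segmentPixels = nowX - startX;
    var nowTime = Date.now();

    var msPerPixel = rangeInMilliseconds / totalPixels;   // time per pixel
    var startTime = nowTime - Math.floor(msPerPixel * segmentPixels);
    var endTime   = nowTime + Math.floor(msPerPixel * (endX - nowX));

    console.log(new Date(startTime).toString());          // about 35 days before now
    console.log(new Date(endTime).toString());            // about 330 days after now

The whole range from startTime to endTime then covers rangeInMinutes, split around now according to where nowX sits between startX and endX.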

Here's a fiddle I've done - http://jsfiddle.net/MWCbW/

As you can see, the start date is December and the end date is November. The reason it is not an exact year is that your year is 48 weeks long, due to the way you've worked out the seconds in a year (or in a month, more precisely).

So I reckon the year difference is there; it's just that you're perhaps not following where it is exactly. Obviously you need to use a scale of 100 to get this result, since otherwise your date range will be something less than a year.
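
For comparison, assuming the year in the fiddle is built from four-week months (which is what the 48-week figure suggests), the shortfall works out like this:

    // A year built from 4-week months is only 48 weeks (336 days) long
    var minutesInWeek  = 7 * 24 * 60;          // 10,080
    var minutesInMonth = 4 * minutesInWeek;    // 40,320 (a "4-week" month)
    var minutesInYear  = 12 * minutesInMonth;  // 483,840 minutes = 48 weeks

    console.log(minutesInYear / (24 * 60));    // 336 days
    console.log(365 * 24 * 60);                // 525,600 minutes in a real 365-day year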


Posting a working example for Joshua. I do not see what the problem could be, other than the decimal point in nowX.

    <script>
    var x = new Object();
    x._minToMs = function (min) { return min * 60000; };
    x._endX = 1037;
    x._startX = 40;
    x._nowX = 134625; // WITHOUT the decimal point, this gives months instead of weeks
    var totalPixels = x._endX - x._startX;
    var segmentPixels = x._nowX - x._startX;
    x._nowDate = new Date();
    x._startDate = new Date();
    x._rangeInMinutes = 1000;
    var startTime = parseInt(x._nowDate.getTime() - x._minToMs(segmentPixels / totalPixels * x._rangeInMinutes));
    x._startDate.setTime(startTime);
    alert(x._rangeInMinutes + ', ' + x._startDate.toString());
    </script>