
What's the minimum lag detectable by a human? [duplicate]

This question already has answers here (closed 11 years ago).

Possible duplicate:

What is the shortest perceivable application response delay?

I've been profiling some JavaScript UI code because it feels a little laggy. So far, I've found some bottlenecks and optimized them out, but I'd like to define a measurable requirement for this.

How quickly should a response occur in order for a human not to notice lag? For example, what's the minimum detectable delay between when a keyboard key is pressed and when a letter appears on the screen? At what point is further optimization not going to make any difference to a human?

A lot of monitors have a refresh rate in the 60-120 Hz range. Does that mean the magic number is around 8-16 ms?


Consider the key press and the letter appearing on the screen as two separate frames: if the user presses a key while looking at the screen, they will expect to see the letter immediately afterwards. In practice, "immediately afterwards" means responding within one frame at 60 Hz or faster.

For this reason, an 8-16 ms target is indeed worth aiming for, since it produces the same impression of continuity one gets from movies. In other words, at such values the user has no perception of delay.

However, keep in mind that the keyboard has a polling interval of its own, and that additional delays not connected with the script itself can add to the total time. For those reasons, aiming at rates above 60 Hz (that is, budgets well under 16 ms) gives you a bigger safety margin against those other sources of minor delay.
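If you want to check where your own UI actually lands, one rough way (a minimal sketch, assuming a text input with a hypothetical id of field) is to compare the keydown event's timestamp with the timestamp of the next animation frame, which approximates when the updated screen is painted:

    // Minimal sketch: measure keydown-to-next-paint delay.
    // Assumes an <input id="field"> exists on the page (hypothetical id).
    var field = document.getElementById('field');

    field.addEventListener('keydown', function (event) {
        // In modern browsers, event.timeStamp and the requestAnimationFrame
        // timestamp share the same high-resolution time origin.
        var pressed = event.timeStamp;

        // The rAF callback fires just before the next repaint, so its
        // timestamp approximates when the new letter becomes visible.
        requestAnimationFrame(function (paintTime) {
            console.log('keydown to paint: ' + (paintTime - pressed).toFixed(1) + ' ms');
        });
    });

This does not capture the keyboard's own polling delay or the display's scan-out, so treat the measured number as a lower bound on what the user actually experiences.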

Also note that in some applications a delay of 100 ms may seem unnoticeable, but it is in fact noticeable: it corresponds to 10 Hz, and if you played a movie at that frame rate you would most likely notice the gaps between frames. For this reason, 100 ms should not be treated as a general-purpose threshold.

The human eye's sensitivity varies with viewing conditions and with the portion of the image being observed, so consider higher refresh rates where necessary to accommodate this.

This link has further information about how screen characteristics and their changes are perceived by the human eye, and may give you an idea of which refresh rates to aim for in a given context, based on the visual impact of your script.


As a general rule, I find that anything quicker than 100ms tends to be perceived as "instant". Go much longer than that and the delay definitely becomes noticeable. Of course this will vary a bit from person to person, and also depending upon the context in which the delay is occurring.

You may find this example helpful: http://jsfiddle.net/QGmBy/
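The linked fiddle isn't reproduced here, but as a stand-in for that kind of demo (a sketch, not the fiddle's actual code, using hypothetical element ids), you can wire a button to respond after an adjustable delay and judge for yourself where "instant" stops feeling instant:

    // Sketch of a delayed-response demo (not the code behind the fiddle above).
    // Assumes <button id="btn"> and <span id="label"> exist (hypothetical ids).
    var delayMs = 100; // try 50, 100, 250... and note when the lag becomes obvious

    document.getElementById('btn').addEventListener('click', function () {
        var label = document.getElementById('label');
        setTimeout(function () {
            label.textContent = 'responded after ~' + delayMs + ' ms';
        }, delayMs);
    });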


I heard of a rule of thumb that 100 ms is fast enough. I'll try to find a link...

Edit: What is the shortest perceivable application response delay?


If the event occurs just once, then 100 ms should be the upper limit. If the event is part of a continuous movement, then something around 10-15 ms is the target, because a 100 ms delay in something like sliding content (one or more pixels at a time) becomes noticeable when such delays occur in a row, one after another.

It also depends somewhat on the context of what is being delayed. A key press, something sliding in, a realtime event happening on another machine: each of these has a different tolerance level :)
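For the continuous-movement case, a quick way to spot those back-to-back delays (a sketch, assuming an absolutely positioned element with a hypothetical id of slider) is to drive the animation with requestAnimationFrame and log any frame gap noticeably above the ~16.7 ms budget of a 60 Hz display:

    // Sketch: detect slow steps in a continuous sliding animation.
    // Assumes an absolutely positioned <div id="slider"> (hypothetical id).
    var el = document.getElementById('slider');
    var x = 0;
    var lastTime = null;

    function step(now) {
        if (lastTime !== null && now - lastTime > 20) {
            // Roughly one missed frame on a 60 Hz display.
            console.warn('slow frame: ' + (now - lastTime).toFixed(1) + ' ms');
        }
        lastTime = now;

        x += 2; // slide a couple of pixels per frame
        el.style.transform = 'translateX(' + x + 'px)';

        if (x < 600) {
            requestAnimationFrame(step);
        }
    }

    requestAnimationFrame(step);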
