How much data should one JSON call retrieve?
When retrieving a list of items via a JSON call, is it better to make several small calls (fetching data as it's needed) or one large call that returns all the data?
For example, say you have a JSON call that returns a list of books matching a particular title keyword, and there are 100 results. You're displaying the data in paginated form, 10 results per 'page'. Is it more efficient to make one call and get all the results, or to make a call for the next 10 on each page?
I would imagine it's partly determined by how many results there are. If it's some huge number, the second option seems clear. But what is a reasonable limit for the number of items to retrieve in one call - 100, 1,000, 10,000?
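To make the two options concrete, here is a rough sketch. The /books endpoint and its keyword, offset and limit parameters are hypothetical, just for illustration:

```typescript
// Minimal shape for a result item (assumed for the example).
interface Book {
  title: string;
  author: string;
}

// Option 1: one large call that returns every matching book up front.
async function fetchAllBooks(keyword: string): Promise<Book[]> {
  const res = await fetch(`/books?keyword=${encodeURIComponent(keyword)}`);
  return res.json();
}

// Option 2: one small call per page, 10 results at a time.
async function fetchBookPage(keyword: string, page: number, pageSize = 10): Promise<Book[]> {
  const params = new URLSearchParams({
    keyword,
    offset: String(page * pageSize),
    limit: String(pageSize),
  });
  const res = await fetch(`/books?${params}`);
  return res.json();
}
```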
Generally, each Ajax call has an overhead, and reducing the number of separate calls improves performance... unless the data is large.
With paging, it is generally better not to fetch all the data up front, because users usually don't move through every page, so you lower the load on the server by not transferring data that is never viewed. On the other hand, if the data is relatively small, or you believe the user will need to see all of it, fetch everything at once to save the overhead of multiple calls. A fetch-on-demand approach might look like the sketch below.
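A minimal sketch of fetching only when the user navigates, with a small client-side cache so already-viewed pages aren't requested again (the /books endpoint and its parameters are assumptions, not a real API):

```typescript
interface Book {
  title: string;
  author: string;
}

// Pages already fetched in this session, keyed by page number.
const pageCache = new Map<number, Book[]>();

async function getPage(keyword: string, page: number, pageSize = 10): Promise<Book[]> {
  const cached = pageCache.get(page);
  if (cached) return cached; // no extra request for pages the user revisits

  const params = new URLSearchParams({
    keyword,
    offset: String(page * pageSize),
    limit: String(pageSize),
  });
  const res = await fetch(`/books?${params}`);
  const books: Book[] = await res.json();
  pageCache.set(page, books);
  return books;
}
```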
It depends.
Obviously, you want to keep the bandwidth usage to a minimum, but there is also an overhead to each individual call. You'll have to make some educated guesses, most importantly: how likely is it that you are going to need the data from pages 2 to 100?
If it is very likely (say, in 90% of the cases users are going to click through many pages of the same result set), then I'd download the whole result in one go, but otherwise, I'd load individual pages as they are needed.
Another thing to keep in mind is latency. Every Ajax call has a certain latency, depending on the distance (in network topology, not necessarily geographical) between client and server. For the first load, the latency is inevitable, but after that, you need to ask yourself how important fast response is. Under normal circumstances, it is expected and acceptable, but if your typical use case involves flipping back and forth between pages a lot, then it might become a nuisance, and you might consider buying snappiness at the cost of a longer initial loading time.
If you want to load multiple pages, but the result set is too large (say, thousands or millions of pages), you might think about more sophisticated schemes, e.g., download the requested page and the next 10, or download the requested page immediately and then prefetch the next 10 pages in the background.
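A rough sketch of that last idea, "show the requested page now, prefetch the next few in the background" (the endpoint, parameters and PREFETCH_COUNT are illustrative assumptions):

```typescript
interface Book {
  title: string;
  author: string;
}

const PREFETCH_COUNT = 10;
// Cache in-flight and completed requests so a page is never fetched twice.
const cache = new Map<number, Promise<Book[]>>();

function fetchPage(keyword: string, page: number, pageSize = 10): Promise<Book[]> {
  let pending = cache.get(page);
  if (!pending) {
    const params = new URLSearchParams({
      keyword,
      offset: String(page * pageSize),
      limit: String(pageSize),
    });
    pending = fetch(`/books?${params}`).then(res => res.json());
    cache.set(page, pending);
  }
  return pending;
}

async function showPage(keyword: string, page: number): Promise<Book[]> {
  const books = await fetchPage(keyword, page); // the page the user asked for
  for (let i = 1; i <= PREFETCH_COUNT; i++) {
    fetchPage(keyword, page + i);               // warm the cache in the background
  }
  return books;
}
```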