
Paging: Storing Search Results

What's the proper way to handle a web page that returns search results which may differ from one moment to the next?

I.e., running the query the first time could return different results than running it again when the user clicks on page 2.

How do most people deal with this scenario? Generally I'm working with internal ASP.Net applications (where security/bandwidth aren't huge concerns), so I'll store the results in the ViewState and, on postbacks, work with that data as opposed to querying the database again.
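
For reference, a minimal sketch of what I mean; GetSearchResults and resultsGrid are placeholder names, and the GridView is assumed to have AllowPaging enabled:

    // Sketch of the ViewState approach on a Web Forms page.
    // GetSearchResults and resultsGrid are placeholders; DataTable is
    // serializable, so ViewState will accept it.
    using System;
    using System.Data;

    protected void Page_Load(object sender, EventArgs e)
    {
        if (!IsPostBack)
        {
            // Run the query once and cache the results in ViewState.
            DataTable results = GetSearchResults(Request.QueryString["q"]);
            ViewState["SearchResults"] = results;
            BindPage(0);
        }
    }

    private void BindPage(int pageIndex)
    {
        // On postback, page through the cached copy instead of re-querying.
        DataTable results = (DataTable)ViewState["SearchResults"];
        resultsGrid.PageIndex = pageIndex;
        resultsGrid.DataSource = results;
        resultsGrid.DataBind();
    }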

What's the proper methodology for external WWW use? My best guess is to store the results in a temporary database table (not literally a temp table; 'staging' might be more accurate), but that table would get hammered with inserts/deletes, and you'd need a process to clean it up, which doesn't seem like a very elegant solution.
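
To frame the question, the cleanup process I'm imagining is just a scheduled purge of stale rows; the table name, CreatedAt column, and 30-minute cutoff below are all made up:

    // Hypothetical cleanup job for the staging-table idea, run from a
    // scheduled task; SearchResultsStaging is an assumed schema, not real.
    using System;
    using System.Data.SqlClient;

    static void PurgeStaleResults(string connectionString)
    {
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(
            "DELETE FROM SearchResultsStaging WHERE CreatedAt < @cutoff", conn))
        {
            // Anything older than 30 minutes is assumed abandoned.
            cmd.Parameters.AddWithValue("@cutoff", DateTime.UtcNow.AddMinutes(-30));
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }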

Am I anywhere close?


Most applications don't deal with it... they assume the results won't change enough to warrant some sort of caching mechanism. However, if you're working with highly real-time data (like Twitter results, for instance), your paging links would most likely look like this:

?q=your+query&olderthan={last result shown}&limit=10

...where {last result shown} is the ID of the last result on the current page. That ID lets you query for results older than it: SELECT * FROM table WHERE id < {last result shown}.
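
A rough sketch of that keyset-style query in ADO.NET; the results table, its id column, and GetOlderResults are placeholder names, and the connection is assumed to be open already:

    // Keyset ("seek") pagination: fetch the page relative to the last ID the
    // client saw, so rows inserted at the top never shift later pages.
    using System.Data.SqlClient;

    static SqlDataReader GetOlderResults(SqlConnection conn, long lastId, int limit)
    {
        var cmd = new SqlCommand(
            @"SELECT TOP (@limit) *
              FROM results
              WHERE id < @lastId
              ORDER BY id DESC", conn);
        cmd.Parameters.AddWithValue("@lastId", lastId);
        cmd.Parameters.AddWithValue("@limit", limit);
        return cmd.ExecuteReader();
    }

Because each page is anchored to an ID rather than an offset, rows arriving between clicks can't cause items to repeat or disappear across pages.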


Although it can be very memory intensive, you could also consider storing the search results in the session and paging through that cached copy rather than running the query each time.
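
A minimal sketch of that approach, assuming a hypothetical Result type, a GetSearchResults method that runs the real query, and a 10-item page size:

    // Cache the full result set in Session on the first request, then slice
    // pages out of the cached list; memory cost grows with result-set size.
    using System.Collections.Generic;
    using System.Linq;

    private const int PageSize = 10;

    private List<Result> GetPage(string query, int pageIndex)
    {
        var cached = Session["SearchResults"] as List<Result>;
        if (cached == null)
        {
            cached = GetSearchResults(query);  // hypothetical query method
            Session["SearchResults"] = cached;
        }
        return cached.Skip(pageIndex * PageSize).Take(PageSize).ToList();
    }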


How about the session? This is temporary data specific to a user's session, so why not store it there?

I haven't done ASP in ages (from before the .NET era :-) ), but in PHP or Rails you would use sessions for any temporary store like this. You can use either the local file system (on the server) or a DB as the session store. There are ways to optimize, so I wouldn't worry about the DB being hammered unless you have thousands of users. Also, DBs are designed to handle thousands of inserts and deletions; any major DB (SQL Server, MySQL, PostgreSQL) shouldn't have any problem with it.
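
In ASP.NET specifically, pointing the session store at a database is a configuration change; a sketch of the built-in SQL Server session-state mode (the connection string is a placeholder):

    <!-- web.config sketch: persist session state in SQL Server rather than
         in-process memory, so cached results survive worker-process recycles. -->
    <system.web>
      <sessionState mode="SQLServer"
                    sqlConnectionString="Data Source=YourDbServer;Integrated Security=SSPI"
                    timeout="20" />
    </system.web>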
