Complex search workaround
Before anything: this isn't necessarily a question, but I'd really like your opinion on the performance and possible problems of this approach to search.
I need to build a really complex search across multiple tables with lots of filters, ranges, and rules, and I realized I could do something like this:
- Submit the search form
- Internally, run every filter and logic step by step (this may take a few seconds)
- After finding all the matching records (the result I want), create a record in my `searches` table, generating a token for this search (based on the search params), like 86f7e437faa5, and save all the matching record IDs
- Redirect the visitor to a page like mysite.com/search?token=86f7e437faa5

Then, on the results page, I only need to figure out which search I'm dealing with and paginate the result IDs (retrieved from the `searches` table).
This makes refreshing and pagination much faster, since I don't need to run all the search logic on every pageview. And if the user changes a filter or search criterion, I go back to step 2 and generate a new search token.
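The steps above could be sketched like this, using an in-memory dict as a stand-in for the `searches` table (all names here are hypothetical, not from the original post):

```python
import hashlib
import json

# Hypothetical in-memory stand-in for the `searches` table.
search_cache = {}

def make_token(params):
    # Sort the keys so the same set of filters always yields the
    # same token, regardless of submission order.
    canonical = json.dumps(params, sort_keys=True)
    return hashlib.md5(canonical.encode("utf-8")).hexdigest()[:12]

def save_search(params, matching_ids):
    # Step 3: persist the matched record IDs under the generated token.
    token = make_token(params)
    search_cache[token] = matching_ids
    return token

def page_results(token, page, per_page=10):
    # Results page: paginate the stored IDs without re-running any filters.
    ids = search_cache.get(token, [])
    start = (page - 1) * per_page
    return ids[start:start + per_page]
```

For example, `token = save_search({"color": "red"}, list(range(25)))` followed by `page_results(token, 2)` returns IDs 10 through 19 without touching the original query logic again.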
I've never seen a tutorial about this, but I think that's what some forums like BBForum or Invision do with search, right? After searching, I'm redirected to something like search.php?id=1231 (I don't see the search params in the URL or in the POST args).
This "token" won't last longer than 30 minutes to an hour, so the "static search" is just for performance reasons.
What do you think about this? Will it work? Any considerations? :)
Your system can use a token like 86f7e437faa5 to cache search requests. It's a very useful mechanism for efficiency and scalability.
But the user should still see all the parameters, in accordance with usability principles.
So generating a hash of the parameters on the fly, server-side, is a good solution. The system checks for the existence of the generated hash in the searches table and returns the stored result if found. If the hash isn't there, the system runs the query against the base tables and saves the new result into the searches table.
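That check-then-compute flow might look like this, with a dict standing in for the searches table and a TTL matching the 30-minute lifetime mentioned in the question (all names hypothetical):

```python
import hashlib
import json
import time

searches = {}            # token -> (created_at, list of result IDs)
TTL_SECONDS = 30 * 60    # roughly the lifetime suggested in the question

def run_expensive_search(params):
    # Placeholder for the slow multi-table query with all the filters.
    return [1, 2, 3]

def cached_search(params):
    token = hashlib.md5(
        json.dumps(params, sort_keys=True).encode("utf-8")
    ).hexdigest()
    entry = searches.get(token)
    if entry is not None and time.time() - entry[0] < TTL_SECONDS:
        return token, entry[1]            # hash found: reuse stored result
    ids = run_expensive_search(params)    # hash missing or expired: run query
    searches[token] = (time.time(), ids)
    return token, ids
```

Repeated calls with the same params hit the cache and skip the expensive query entirely.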
Seems logical enough to me.
Having said that, given the description of your application, have you considered using Sphinx? Regardless of the number of tables, filters, or rules, all the time-consuming work is in the indexing, which is done beforehand, behind the scenes. The filtering/rules/fields/tables are all applied quickly, on the fly, afterwards.
So, similar to your situation, Sphinx could give you your set of IDs very quickly, since all the hard work was done up front.
TiuTalk,
Are you considering keeping searches saved in your "searches" table? If so, remember that your param-based token will remain the same for a given set of parameters, persisting over time. If your search base changes frequently, you can't rely on saved searches, as they may return outdated results. Otherwise, it seems like a good solution overall.
I'd rather base the token on the user session. What do you think?
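Mixing the session ID into the hash would give each user their own cache entries even for identical filters. A minimal sketch (names are hypothetical, not from the thread):

```python
import hashlib
import json

def session_token(session_id, params):
    # Combine the session ID with the canonical params so two users
    # with the same filters still get separate cache entries.
    canonical = json.dumps(params, sort_keys=True)
    raw = session_id + "|" + canonical
    return hashlib.md5(raw.encode("utf-8")).hexdigest()[:12]
```

The trade-off is that identical searches are no longer shared between users, so the cache hit rate drops, but no one can be served another user's stale result set.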
@g0nc1n
Sphinx seems like a nice solution if you have control of your server (on a VPS, for example).
If you don't, and a simple full-text search isn't enough for you, I guess this is a nice solution. But it doesn't seem so different to me from a paginated search with caching, though it does seem better than a paginated search with simple URL-referenced caching. You still have the problem of the searches remaining static, so I recommend flushing the saved searches from time to time.
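Flushing could be as simple as a periodic sweep that drops entries past their age limit. A hypothetical in-memory version, assuming each entry stores its creation time:

```python
import time

def flush_stale_searches(store, max_age_seconds=3600):
    # store maps token -> (created_at, list of matching IDs).
    # Remove every cached search older than max_age_seconds.
    now = time.time()
    stale = [token for token, (created_at, _) in store.items()
             if now - created_at > max_age_seconds]
    for token in stale:
        del store[token]
    return len(stale)
```

You could run this from a cron job, or opportunistically before saving each new search.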