jQuery autocomplete - optimization question
Would love your thoughts on this one.
I am using jQuery autocomplete in an ASP.NET MVC application to retrieve a list of records from a database. I would like to know if there is a better way than how I am currently doing it, which is as follows:
Here is the jQuery (I am calling an action method 'GetRecordByName' from the autocomplete):
$('#tb_name').autocomplete({
    source: 'Home/GetRecordByName',
    minLength: 1,
    delay: 0,
    select: function (event, ui) {
        // do stuff
    }
});
On the back end, I am using LINQ to Entities in the method to obtain the results from the database:
public JsonResult GetRecordByName(string term)
{
    var data = records
        .Where(dr => dr.Key.StartsWith(term))
        .Select(dr => new { dr, value = dr.Key })
        .Take(5);
    return Json(data, JsonRequestBehavior.AllowGet);
}
Currently, it's running a bit slow (~1-2 seconds). Basically, the DB has tens of thousands of records in it, and each time you type a character in the text box, the app hits the database. I can set minLength to 2 or 3 characters, but it doesn't change the speed, nor do I want to require that many characters.
I have an idea on how to optimize it: on page load, get all possible results from the DB and send them to the client (via a hidden input or a JavaScript data structure), and have autocomplete use that client-side data as the source. I'm sure that would be lightning fast, but it seems weird to me - are there any drawbacks to doing it that way?
Are there any other ways to achieve a faster autocomplete?
UPDATE: Ok, apparently the problem was the way I was constructing my EF call. The query off of 'records' ultimately wasn't producing a WHERE clause in the SQL; it was pulling the whole table each time, and the LINQ methods were then running over that already-enumerated mass of data in memory. This was happening on every keystroke, causing the slowdown. Doh! I moved the .Where call into the data repository layer so that enumeration happens after the Where filters the results, and I return a Dictionary - it seems to work great now. Thanks for all your help! I marked the most helpful answer, which helped me investigate the issue.
As far as the Flickr solution of bringing the whole set of data to the client, I still feel like that's weird / overkill, but I'm sure it's warranted with certain large datasets. I will also be looking more into OutputCache for other things.
Thanks again!
As with any optimization, first you must test to figure out where the bottleneck is; otherwise your efforts will be focused in the wrong areas. For example, use Firebug to see how long the actual request is taking, and use a Stopwatch in your MVC action to see how long the actual data retrieval takes (throw a call to ToList() at the end of your query to make sure it evaluates in-method).
Until we know where the slowdown is, it's impossible to give you a really good suggestion for optimization, but here are a couple of ideas that come to mind:
- Use SQL Server Management Studio to analyze the execution plan of the query produced by LINQ to Entities. Perhaps adding an index on the text column would make the database return results more quickly.
- There is a DNS issue in Windows that often causes non-IE browsers to run very slowly against development environments. If this works quickly in IE but not in Firefox, you may just need to tweak your hosts file.
On page load, get all possible results from the DB and send them to the client (via a hidden input or a JavaScript data structure), and have autocomplete use that client-side data as the source. I'm sure that would be lightning fast, but it seems weird to me - are there any drawbacks to doing it that way?
Yes, there are drawbacks. You're saying there are tens of thousands of possible results in the database: enough that you suspect the database is taking a couple of seconds to return a few results from your query. Loading all of those entries with the initial page load will add an enormous performance hit up front. It will probably make your page a few hundred kilobytes bigger than it needs to be, which makes a big difference for users with slow internet connections.
1) It really depends on your usage and your data, but I recommend using [OutputCache]. You can avoid executing a database query every time any user invokes the same controller action. Refer to this for more details. If you apply caching at the controller action level, one cached result is shared across all users rather than one per user.
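Caching can help on the client side too: jQuery UI's "remote with caching" demo keeps results per term so the widget never asks the server twice for the same prefix. A minimal sketch of that caching wrapper (the `lookup` parameter is a stand-in for your actual ajax call, e.g. a $.getJSON against Home/GetRecordByName):

```javascript
// Wrap any term -> results lookup in a simple in-memory cache.
// `lookup(term, respond)` is assumed to call `respond(results)` when done.
function makeCachedSource(lookup) {
    var cache = {};
    return function (request, response) {
        var term = request.term;
        if (cache.hasOwnProperty(term)) {
            response(cache[term]); // served from cache, no server round-trip
            return;
        }
        lookup(term, function (results) {
            cache[term] = results; // remember for next time
            response(results);
        });
    };
}

// Wiring it into the widget would look like:
// $('#tb_name').autocomplete({ source: makeCachedSource(myAjaxLookup) });
```

This only helps with repeated terms in one page session; it complements, rather than replaces, server-side caching.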
2) Look at the answer to this question: How to improve performance of Jquery autocomplete
"This widget downloads a list of all of your contacts, in JavaScript, in under 200ms (this is true even for members with 10,000+ contacts). In order to get this level of performance, we had to completely rethink how we send data from the server to the client."
3) I have seen people using Redis. It requires some setup, but it is efficient.
Redis autocomplete
efficient serverside autocomplete
The way you mention is not a bad idea - but like StriplingWarrior mentioned, it could affect your page load. A way around that is to do an async ajax call on page load and, in the success callback, bind the returned data to the autocomplete.
This means the autocomplete won't work until the results are returned, but what are the chances the user will get to the text box and start typing in under 2-3 seconds?
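A sketch of that approach, assuming a hypothetical endpoint ('Home/GetAllRecordNames') that returns a plain array of names; the filtering helper is plain JavaScript, and the widget wiring is shown commented out since it needs a live page:

```javascript
// Given the preloaded list and the typed term, return up to `limit` names
// that start with the term (case-insensitive prefix match, mirroring the
// StartsWith filter on the server).
function filterNames(names, term, limit) {
    var t = term.toLowerCase();
    return names.filter(function (n) {
        return n.toLowerCase().indexOf(t) === 0;
    }).slice(0, limit);
}

// On page load, fetch once and bind the widget in the success callback:
// $.getJSON('Home/GetAllRecordNames', function (names) {
//     $('#tb_name').autocomplete({
//         source: function (request, response) {
//             response(filterNames(names, request.term, 5));
//         },
//         minLength: 1,
//         delay: 0
//     });
// });
```

Until the ajax call completes, the text box simply behaves as a plain input; no error, just no suggestions yet.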
I'm not really sure of your setup, but one optimization path you could take would be to use Redis to store all of your searchable params as key/value pairs. Just have a task set up so that it's kept updated.