Best practice for filtering items client-side with JavaScript, hide or remove from DOM?

I have a relatively large dataset (a few thousand items) that I want to navigate by applying a number of filters client-side in a web application. Applying the filtering logic itself is not an issue; the question is which method to use for updating the table of matching results to get the best user experience. The methods I've come up with are:

  1. Setting the class of each row to hide or show it (using visibility: collapse to hide it), and keeping the DOM element in the table.
  2. Keeping a DOM element for each data item, detaching/attaching it to the table to hide and show it.
  3. Just keeping an abstract object for each data item, creating a DOM element on demand to show it.

Which one is likely to give the best user experience? Any other recommended method besides those I've listed already?
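
For reference, here is roughly what the three methods look like in code (row, table, item, matches and render are placeholders, not working code from my app):

// 1. Toggle a class and keep the row in the table
//    (with a CSS rule like .hidden { visibility: collapse; })
row.className = matches ? "" : "hidden";

// 2. Keep the element alive, but detach/attach it
if (matches) {
    table.appendChild(row);
} else if (row.parentNode) {
    row.parentNode.removeChild(row);
}

// 3. Keep only a data object; build a DOM node on demand
if (matches) {
    table.appendChild(render(item)); // render() creates a fresh row element
}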


If the display area has a fixed size (or at least a maximum size), and you must filter on the client side, I would not create a DOM node for each item, but instead reuse a predefined set of DOM nodes as templates, hiding the unneeded templates depending on the number of results the filter returns. This drastically reduces the number of DOM nodes in the document, which keeps page rendering responsive, and it is fairly easy to implement.

Example HTML*:

<ul id="massive-dataset-list-display">
    <li>
       <div class="field-1"></div>
       <div class="field-2"></div>
       <div class="field-n"></div>
    </li>
    <li>
       <div class="field-1"></div>
       <div class="field-2"></div>
       <div class="field-n"></div>
    </li>
    <li>
       <div class="field-1"></div>
       <div class="field-2"></div>
       <div class="field-n"></div>
    </li>
    .
    .
    .
</ul>

Example JavaScript*:

var MassiveDataset = function(src) {
    // `each` callbacks rebind `this`, so keep a reference to this instance
    var self = this;

    this.fetchDataFromSource = function(src) {
        // Load and parse your data however you like; stubbed here
        return [];
    };

    var data          = this.fetchDataFromSource(src);
    var templateNodes = $("#massive-dataset-list-display li");

    // It seems that you already have this handled, but just for
    // completeness' sake
    this.filterBy = function(someParam) {
        var filteredData = [];
        // magic filtering of `data`
        this.displayResults(filteredData);
    };

    this.displayResults = function(filteredData) {
        var resultCount = filteredData.length;

        templateNodes.each(function(index, node) {
            // There are more template nodes than results: hide the leftovers
            if ( index >= resultCount ) {
                $(node).hide();
                return;
            }

            $(node).show();
            self.formatDisplayResultNode(node, filteredData[index]);
        });
    };

    this.formatDisplayResultNode = function(node, rowData) {
        // For great justice
    };
};

var md = new MassiveDataset("some/data/source");
md.filterBy("i can haz filter?");

* Not tested. Don't expect copy/paste to work, but that would be cool.


Adding a class and using CSS to show/hide the element will probably be the fastest (coding- and performance-wise), especially with so many items.
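
A minimal sketch of that approach, assuming a CSS rule like .hidden { display: none; }, a matchesFilter predicate of your own, and that each row carries its data object via jQuery's .data():

$("#results tr").each(function() {
    var row = $(this);
    // toggleClass with a boolean adds or removes the class in one call
    row.toggleClass("hidden", !matchesFilter(row.data("item")));
});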

If you want to go the DOM manipulation route, consider editing the DOM off-line. Cache the DOM tree in memory (a local variable), update all rows, and then replace the original DOM node. See http://www.peachpit.com/articles/article.aspx?p=31567&seqNum=5 for more information on this matter.
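
A sketch of that off-line editing (the element ids are assumptions):

var tbody  = document.getElementById("results-body");
var parent = tbody.parentNode;

parent.removeChild(tbody);   // while detached, row updates trigger no reflow
// ... add, remove, or rewrite the rows of tbody here ...
parent.appendChild(tbody);   // reattaching causes a single reflow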


I've done a project that required filtering items by their location within a Google Maps viewport and by a min-max value slider (for those who are curious, it was for a real estate website).

The first version used an AJAX request to get all (server-side) filtered items, so every change in the filter requested new data. The JSON data was then parsed into DOM nodes and added to the document. This approach also made search-engine indexing of the items impossible.

The second version also used an AJAX request, but this time it requested only the ids of the items that passed the filter. All items were present in the HTML with unique ids, and items excluded by the initial filter had an extra class name to hide them. Whenever the filter changed, only the matching ids were requested and each item's class name was updated accordingly. This significantly improved the speed, especially in Internet Explorer (which had the slowest JavaScript engine of our supported browsers).
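
A hypothetical sketch of that second version (the endpoint, id scheme and hidden class are made up for illustration):

// All items are already in the page, e.g. <li id="item-123" class="item">
$.getJSON("/listings/filter", filterParams, function(matchingIds) {
    var matching = {};
    $.each(matchingIds, function(i, id) {
        matching["item-" + id] = true;
    });
    $(".item").each(function() {
        // hide everything the server did not include in the result set
        $(this).toggleClass("hidden", !matching[this.id]);
    });
});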


I realize that it's not exactly what you're asking for, but since you opened the door for alternatives...

Have you considered doing the filtering server-side? You could load your results with AJAX when the user changes the filtering options, so you're not loading thousands of rows of data into a browser when you might only display a portion of them. It can potentially save you and the visitor bandwidth, though this depends on how your site actually gets used.

Basically, if you decide ahead of time what data you want to show, you don't have to go through the trouble of picking over what's there.
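
A hypothetical sketch of that flow (the endpoint and response shape are assumptions, and real code should escape the values):

$.getJSON("/items", { q: filterText, page: 1 }, function(items) {
    var html = [];
    $.each(items, function(i, item) {
        html.push("<tr><td>" + item.name + "</td></tr>");
    });
    // the server has already filtered, so only visible rows ever reach the DOM
    $("#results").html(html.join(""));
});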

I understand that this may not fit your needs, but I offer it as a suggestion in case you were set on doing it client-side.


DOM manipulation is just too slow for "a few thousand items". Assuming you have a really, really good reason why you aren't getting the server to do the filtering, the best solution I've found is to use client-side XSL transforms on data held as XML.

The transforms themselves are very quick, even on reasonably large datasets. You would then assign the result to the innerHTML property of a containing DIV where you want the table to appear. Using innerHTML for large changes to the DOM is way quicker than manipulating the DOM with JavaScript.
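
A sketch using the standard XSLTProcessor API (older IE used transformNode instead); xmlDoc and xslDoc stand for the parsed data and stylesheet documents:

var processor = new XSLTProcessor();
processor.importStylesheet(xslDoc);

// Transform the whole dataset into a document fragment in one go...
var fragment = processor.transformToFragment(xmlDoc, document);

// ...then swap it in with a single clear-and-append
var container = document.getElementById("results");
container.innerHTML = "";
container.appendChild(fragment);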

Edit: answers to Justin Johnson's comments:

"If the dataset is that large, the XML is potentially going to be beastly large."

Note that I already made a disclaimer in my first paragraph about enlisting the server's help here. There may be a case for switching the design around and making sensible use of AJAX, or for simply not attempting to show so much data at once. However, I'm doing my best to answer the question as posed.

It's also worth considering that "beastly large" is at least partly a function of bandwidth. In a well-connected intranet web application, bandwidth is not at such a premium. In addition, I've seen and used implementations that build up and reuse cached XML over time.

"Also, if XML is converted to a DOM object, how is this any better?"

There is a massive difference between the technique I propose and direct DOM manipulation via JavaScript. When JavaScript code modifies the DOM, the underlying engine has no way of knowing whether other changes are about to follow immediately, nor can it be sure that the script will not immediately examine other properties of the DOM. Hence, whenever JavaScript changes the DOM, the browser needs to ensure that all sorts of other properties are updated so that they are consistent with a completed rendering.

However, when innerHTML is assigned a large HTML string, the browser can quite happily create a whole batch of DOM objects without doing any recalculation; it can defer a zillion updates to various property values until after the entire DOM has been constructed. Hence, for large-scale changes, innerHTML will blow direct DOM manipulation out of the water.
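
To illustrate, a sketch of the batched approach (the data array and its name field are assumptions; escape real data before concatenating):

var html = ["<table>"];
for (var i = 0; i < data.length; i++) {
    html.push("<tr><td>" + data[i].name + "</td></tr>");
}
html.push("</table>");

// One assignment: the browser parses the string and builds every node in a
// single pass, deferring layout until the whole fragment is in place
document.getElementById("results").innerHTML = html.join("");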
