Caching of HTML Output
I am pretty sure that I know the answer to this already, but I am interested to see if anyone has other ideas. We are working on a major redesign of a website that will include mega menus. One of my top priorities in a redesign is to reduce page download time as much as possible. All my images, CSS, and JavaScript are cached, and that's good. However, the part I am trying to work through is the HTML for the menu, and whether there is a reasonable way to cache that locally.
As a side note, I like to do things in pure CSS as much as possible (for SEO), and that would mean outputting the mega menus directly in the HTML page. But at the same time, I know that if the page takes several seconds to download the HTML content at the top, well, then, we are probably going to run some customers off there too. Maybe the best approach would be to have JavaScript output the menus, but then you run into the couple of customers who don't have JavaScript enabled.
Right now the menus account for about 30K per page, and I anticipate that doubling, maybe more, when we do the redesign.
Do you have some thoughts for this issue? What would you see as the best way to tackle this?
Thank you!!
JMax
Honestly, 30K for an element is nothing in this day and age, with high-speed connections common and browsers effectively caching as necessary. People don't leave because of a second or two. It's when you have Flash movies that preload or crazy auto-starting videos that people get annoyed in a hurry.
I've got a similar application with a menu that's likely double that right now...let me add, it's not by choice; it's something I inherited and have to maintain for the time being. The menu is output simply as an unordered list, and then I use Superfish and CSS to do the styling as necessary. There's an initial hit, but after that, caching kicks in and we're good to go. Even as crazy as it is, the load isn't prohibitive.
Navigating it, however, is a mess. I'd strongly recommend against confusing the heck out of your user with so many choices, especially on a mega menu, which can be a UI hurdle for disabled and older users. When you boil it down, the whole basis behind the "Web 2.0 movement" (I hate that term) is to minimize or cloak complexity.
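For reference, the unordered-list-plus-Superfish setup is just a plain nested list with the plugin bound to it. A minimal sketch (the sf-menu class is Superfish's default; the links are placeholders):

    <ul class="sf-menu">
      <li><a href="/products">Products</a>
        <ul>
          <li><a href="/products/widgets">Widgets</a></li>
          <li><a href="/products/gadgets">Gadgets</a></li>
        </ul>
      </li>
      <li><a href="/about">About</a></li>
    </ul>
    <script>
      // Superfish progressively enhances the list; with JavaScript off,
      // the nested <ul> still renders and stays crawlable.
      $(function () {
        $('ul.sf-menu').superfish();
      });
    </script>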
If you're REALLY concerned about performance, start with what you're loading. Limit your JavaScript by combining files, especially those small jQuery plugin files that tend to stack up; extra HTTP requests hurt, since browsers only fetch a few resources in parallel. Similarly, combine small CSS files and optimize the rest via an online tool. To reduce image loads, create sprites for your graphics so you're loading one file instead of many. Here's a tut on sprites, and a simple Google search will give you dozens of sites that will build the sprite and CSS automatically. Finally, load anything you can from a CDN, such as jQuery or Prototype (hopefully only one framework per site, because two or more is unnecessary).
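On the CDN point, the usual pattern is to pull the framework from a public CDN and fall back to a local copy if that fails. A sketch (the local path is a placeholder):

    <script src="https://ajax.googleapis.com/ajax/libs/jquery/1.12.4/jquery.min.js"></script>
    <script>
      // If the CDN copy didn't load, write in a locally hosted fallback.
      window.jQuery || document.write('<script src="/js/jquery.min.js"><\/script>');
    </script>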
If things are still out of hand, look at your graphics one more time. Could you take advantage of pure CSS or image repetition via CSS to reduce loads further? Have you optimized all the graphics? Could you tweak the design to take advantage of those tricks?
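As one example of that kind of trick, a thin repeating slice can stand in for a full-width graphic (the filename is made up):

    /* Tile a 1px-wide slice horizontally instead of shipping the full banner */
    .header {
      background: url('/img/header-slice.png') repeat-x;
    }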
After all that, if you simply can't change the menu to be more friendly, start investigating options. However, I suspect you'll find better gains in the first couple of steps than you would from taking extreme measures on the menu.
You could either set long HTTP cache headers on a JavaScript file that generates the menu, or use Ajax to insert a pre-generated HTML menu from another file (again with a long expiry date set on the cache).
Both those solutions require JavaScript, though. I can't think of another way to remove the menus from the HTTP traffic apart from an IFRAME (yuck).
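A rough sketch of the Ajax variant, assuming the menu is pre-generated into a fragment served with a far-future expiry header (the /menu.html path and element id are made up):

    <div id="menu-placeholder"></div>
    <script>
      // jQuery's .load() fetches the fragment and injects it; on repeat
      // visits the browser serves /menu.html straight from its cache.
      $(function () {
        $('#menu-placeholder').load('/menu.html');
      });
    </script>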
30K is massive for plain HTML, though - do you REALLY need such a huge navigation structure?