SEO-friendly JavaScript and CSS links?

Is it possible to have a navigation system optimized using JavaScript but, for the sake of search engines, still have the hyperlinks be crawlable?

Or maybe a conditional statement that serves the HTML code only if JavaScript is not enabled in the browser, or when the page is crawled by a search engine?


What you are describing is known as unobtrusive JavaScript.

See: http://en.wikipedia.org/wiki/Unobtrusive_JavaScript

You write your HTML in the most semantic, SEO-friendly way possible for search engines and for users with JavaScript turned off, then add your script separately to layer on the bells and whistles.

A framework such as jQuery is often useful.

For example:

<a href="/about" id="about">About</a>

could be given extra behavior via an external JavaScript file containing:

$("#about").click( function() {
    //fancy code here
    return false;
});

which would stop the user from being taken to /about and execute the given JavaScript instead.

Essentially this is the inverse of your suggestion: rather than falling back to HTML when JavaScript is missing, JavaScript is only used, when it's available, to enhance the existing HTML.


Sure. In addition to being SEO-friendly, this approach is also far more accessible to users with disabilities. If you work, or may someday work, in government or higher education, you need to know about accessibility, and in fact everyone should keep this issue in mind.

Google "progressive enhancement" for more information; here's a good article.

Basically you want to create your site as if it were using normal link navigation, and then add JavaScript event handlers to hijack the clicks that would normally trigger navigation.
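
A rough sketch of that hijacking in plain JavaScript; the #nav id and the enhance() function are assumptions for illustration, not anything from the article mentioned above:

// Attach a click handler to every link in the navigation and hijack it.
// If this script never runs (no JavaScript, or a crawler), the plain links still work.
document.querySelectorAll("#nav a").forEach(function (link) {
    link.addEventListener("click", function (event) {
        event.preventDefault();  // stop the normal navigation
        enhance(link.href);      // hypothetical function doing the fancy client-side work
    });
});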


You can't really trigger anything when JavaScript is disabled, because running anything client-side requires JavaScript. What I do for my sites is use static HTML links, and then use JavaScript to change what happens when those links are clicked.

This way you can have a link somewhere that is still crawlable and works fine if JavaScript is disabled, but if JavaScript is enabled an AJAX call reloads only part of the page.
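
A minimal sketch of that pattern, assuming jQuery, links marked with a hypothetical ajax class, and a #content container that exists on both the current page and the target page:

$("a.ajax").click(function (e) {
    e.preventDefault();
    // Load only the #content fragment of the target page into the current page.
    $("#content").load($(this).attr("href") + " #content");
});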


Suckerfish dropdowns, for example, are menus based on nested HTML lists that are turned into horizontal drop-down menus. They look clean and have fully crawlable links. Generally, it's better to generate HTML first and then use progressive enhancement to turn that HTML into something nicer via JavaScript.
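
For illustration, a jQuery sketch of that kind of enhancement; the #nav id and the sfhover class name are assumptions, and the CSS that shows/hides the nested lists is omitted:

// Toggle a class on hover so CSS can reveal the nested <ul>;
// the links themselves remain ordinary, crawlable anchors.
$("#nav li").hover(
    function () { $(this).addClass("sfhover"); },
    function () { $(this).removeClass("sfhover"); }
);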

On the other hand, if you generate the JavaScript navigation, for example as a JSON object, then it should be easy to generate an XML sitemap for Google.
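
A rough sketch of that idea; the shape of the navigation object and the example.com base URL are invented for illustration:

// Hypothetical navigation tree kept as a JSON object.
var nav = [
    { url: "/about", title: "About", children: [ { url: "/about/team", title: "Team" } ] },
    { url: "/contact", title: "Contact" }
];

// Walk the tree and emit one <url> entry per page.
function toSitemapEntries(items) {
    return items.map(function (item) {
        var entry = "<url><loc>http://example.com" + item.url + "</loc></url>";
        return entry + (item.children ? toSitemapEntries(item.children) : "");
    }).join("");
}

var sitemapXml = '<?xml version="1.0" encoding="UTF-8"?>' +
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">' +
    toSitemapEntries(nav) +
    '</urlset>';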

What do you mean by "optimized"? Optimized for speed because your navigation tree is huge and would generate unnecessary HTML traffic? Then you should generate the navigation via JavaScript and Ajax calls to keep load times down and serve a sitemap to the search engines. If you mean "pretty" then use progressive enhancement.


Basically the main thing is to put real URLs in your href attributes, and an onclick event handler that cancels the default behavior.
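
For example (the /products URL and the doFancyThing function are hypothetical; returning false from the onclick handler is what cancels the default navigation):

<a href="/products" onclick="return doFancyThing(this.href);">Products</a>

<script>
function doFancyThing(url) {
    // fancy client-side behavior goes here, e.g. load the content via AJAX
    return false; // false cancels the default navigation, so crawlers and no-JS users still follow the real URL
}
</script>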
