Get website information with jQuery

Using jQuery, is there a way to get website information (possibly meta data and images)? An example of what I'm trying to accomplish: when I share a link on Facebook, it pulls up relevant images and shows a blurb of text from the website (article).

Is there a way to accomplish this easily using jQuery (with AJAX)?


Well, I don't think you can achieve this with JavaScript (jQuery) alone. Because of the same-origin policy, your AJAX calls cannot request the information from another domain directly. But you could send an AJAX call with the URL to your own application, which would download the page in question, parse its meta tags and title, and return a response in the proper format.
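
For illustration, here is a minimal sketch of the client side, assuming a hypothetical get_meta.php endpoint on your own server that downloads the remote page and returns its title and meta data as JSON (the endpoint name and response fields are made up here):

```javascript
// Ask our own server (same origin, so no cross-domain restriction)
// to fetch and summarize the remote page.
$.ajax({
  url: '/get_meta.php',                         // hypothetical proxy script
  data: { url: 'http://example.com/article' },  // page we want info about
  dataType: 'json',
  success: function (info) {
    // e.g. info = { title: '...', description: '...', image: '...' }
    $('#preview-title').text(info.title);
    $('#preview-blurb').text(info.description);
    $('#preview-image').attr('src', info.image);
  }
});
```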

Another idea would be to have the server actually render the page, take a screenshot of it, and perhaps keep a database of scaled-down images for each domain. I don't know how computationally expensive that would be, but it seems a lot worse than simply parsing.

See wkhtmltoimage
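
wkhtmltoimage is a command-line tool; purely as a rough illustration of the same screenshot idea in JavaScript, here is a sketch using Node with the puppeteer package (a substitute I'm swapping in, not part of the original answer):

```javascript
// Rough sketch: render a page server-side and save a small screenshot.
// Assumes Node.js with puppeteer installed (npm install puppeteer).
const puppeteer = require('puppeteer');

async function thumbnail(url, outPath) {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  // A small viewport keeps the image roughly thumbnail-sized.
  await page.setViewport({ width: 320, height: 240 });
  await page.goto(url, { waitUntil: 'networkidle2' });
  await page.screenshot({ path: outPath });
  await browser.close();
}

thumbnail('http://example.com', 'example.png');
```

Caching the resulting images per domain, as suggested above, would amortize the rendering cost.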


If you're trying to determine the optimal image and text to display given a URL, this is a difficult problem, especially considering most sites don't yet make use of HTML5 semantic markup.

I believe Facebook lets developers mark up their pages with Open Graph meta tags to control this. For a little more info, check out:

http://developers.facebook.com/docs/reference/plugins/like/

http://developers.facebook.com/tools/lint/

As for doing this entirely with jQuery, I think Uku is correct: I don't think you can scrape a page on a different domain. What I've done in the past is the same as he suggests: send an AJAX call to a PHP script on my server, which downloads the page and returns it. You can then use either PHP or JS to parse it, but the question remains: what is the most efficient way of parsing it?
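
As a hedged sketch of the client-side parsing step, assuming your proxy script returns the raw HTML as a string (the function and selectors below are illustrative, not a prescribed API):

```javascript
// Parse a fetched HTML string into a detached document and pull out
// the pieces a link preview needs. DOMParser does not execute scripts.
function extractPageInfo(html) {
  var doc = new DOMParser().parseFromString(html, 'text/html');
  var $doc = $(doc);
  return {
    title: $doc.find('title').text(),
    description: $doc.find('meta[name="description"]').attr('content'),
    // Open Graph tags, if the site provides them (see the Facebook docs above)
    ogImage: $doc.find('meta[property="og:image"]').attr('content'),
    // Fall back to collecting <img> sources as thumbnail candidates
    images: $doc.find('img').map(function () {
      return $(this).attr('src');
    }).get()
  };
}
```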
