JavaScript: Extracting webpage contents
Hi, I want to extract the href links from a table on a website. Is it possible to do? If yes, please help me in this regard. I want to use JavaScript.
I have an external site containing a table with some data. Each data item is a link. I need to extract those links from my page.
Not entirely sure what you mean by href table links, but if you want to get all href attributes, you can do this:
var arr = [],
    as = document.getElementsByTagName('a'),
    len = as.length;

while( len-- ) {
    arr.push( as[len].getAttribute('href') );
}
or, if you want the full URLs (including the domain) instead of just the paths, you can use the href property instead of getAttribute():
var arr = [],
    as = document.getElementsByTagName('a'),
    len = as.length;

while( len-- ) {
    arr.push( as[len].href );
}
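To illustrate the difference with a hypothetical anchor (example.com is just a placeholder), getAttribute('href') returns the attribute as written in the markup, while the href property resolves it to an absolute URL:

// Suppose the page at http://example.com/page.html contains: <a href="/foo">foo</a>
// as[len].getAttribute('href')  ->  "/foo"
// as[len].href                  ->  "http://example.com/foo"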
and to skip anchors that don't have an href attribute, you can add this check inside either while block:
if( as[len].href ) {
    // get the href
}
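Putting the pieces together for the table case: a minimal sketch, assuming the links sit inside a table on the page (the 'dataTable' id below is just a placeholder):

var tableLinks = [],
    anchors = document.getElementById('dataTable').getElementsByTagName('a'),
    i = anchors.length;

while( i-- ) {
    if( anchors[i].href ) {                 // skip anchors without an href
        tableLinks.push( anchors[i].href ); // absolute URL of each link in the table
    }
}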
I'm not sure what you mean by 'table links', but you can use jQuery's find method, $('body').find('a'), to return all the links within a webpage.
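For example, a minimal sketch assuming jQuery is already loaded on the page:

var links = [];
$('body').find('a').each(function() {
    links.push( $(this).attr('href') );   // or this.href for the absolute URL
});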
If it isn't your own website and you want to use JavaScript to extract the links, then you may want to look at the Greasemonkey extension for Firefox:
https://addons.mozilla.org/en-US/firefox/addon/greasemonkey/
If you are talking about adding functionality to your own website for others to use, then Greasemonkey isn't the thing you want.
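As a rough illustration, a Greasemonkey userscript that collects the table links on the external site might look like this (the @include URL is a placeholder you would replace with the actual site):

// ==UserScript==
// @name     Extract table links
// @include  http://example.com/*
// ==/UserScript==

var links = [],
    anchors = document.querySelectorAll('table a'),
    i = anchors.length;

while( i-- ) {
    links.push( anchors[i].href );
}

// Log the collected links; how you get them off the page is up to you.
console.log( links );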