How do I check an entire website to see if any page in it links to a particular URL?

We have been hounded by an issue on our websites: web protection tools such as Norton keep telling certain visitors, in certain browsers, that our sites are potential risks because we link to a certain http://something.abnormal.com/ (sample URL only).

I've been trying to scour the site page by page, to no avail.

My question: do you know of any site or tool that would be able to "crawl" our website's pages and check whether any text, image, or anything else in them links to the abnormal URL that keeps bugging us?

Thanks so much! :)


What you want is a 'spider' application. I use the spider in 'Burp Suite', but there is a range of free, cheap, and expensive ones.

The good thing about Burp is that you can get it to spider the entire site and then examine every page for whatever you want, whether that is something matching a regex, dynamic content, and so on.


If your websites consist of a small number of static content pages, I would use wget to download all of the pages (ignoring images):

wget -r -np -R gif,jpg,png http://www.example.com

and then run a text search for the suspicious URL over the result. If your websites are more complex, httrack might be easier to configure for a text-only download.
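For example, grep can list every downloaded file that contains the offending link; this assumes wget saved the mirror into a www.example.com/ directory and uses the sample URL from the question, so substitute your own values:

grep -rl "something.abnormal.com" www.example.com/

Each file grep reports is a page that references that URL somewhere in its markup, so you can open it and remove the link.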
