
Fetching the entire web page from a specific URL using Java

Can I fetch an entire web page, including CSS and images, using Java? That is basically what happens when you use the "Save As" action in a browser. I can use any free 3rd party library.

edit:

HtmlUnit library seems to be doing exactly what I need. This is how I use it to grab the entire web page:

WebClient webClient = new WebClient();
HtmlPage page = webClient.getPage(new URL("..."));
page.save(new File("..."));


Java has some built-in classes you can use to open a stream to an external source, such as a web server, and request a page, which returns the page's HTML source. You would then need to parse out the links to external images and CSS, request those resources, and save them accordingly.

Here is a link to an example of opening a stream to an external source (a website).
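A minimal sketch of that approach: read the raw HTML over a stream, then pull out `src`/`href` attribute values with a simple regex. The class name `PageFetcher` and both helper methods are illustrative, not from any library, and a real HTML parser would be more robust than a regex.

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class PageFetcher {

    // Reads the raw HTML of a page into a String via a URL stream.
    static String fetchHtml(String address) throws Exception {
        StringBuilder sb = new StringBuilder();
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(new URL(address).openStream(), StandardCharsets.UTF_8))) {
            String line;
            while ((line = in.readLine()) != null) {
                sb.append(line).append('\n');
            }
        }
        return sb.toString();
    }

    // Extracts src/href attribute values (images, stylesheets, etc.)
    // with a simple regex; only double-quoted attributes are handled here.
    static List<String> extractResourceLinks(String html) {
        List<String> links = new ArrayList<>();
        Matcher m = Pattern.compile("(?:src|href)\\s*=\\s*\"([^\"]+)\"").matcher(html);
        while (m.find()) {
            links.add(m.group(1));
        }
        return links;
    }

    public static void main(String[] args) {
        String sample = "<img src=\"logo.png\"><link rel=\"stylesheet\" href=\"style.css\">";
        System.out.println(extractResourceLinks(sample)); // prints [logo.png, style.css]
    }
}
```

You would then loop over the extracted links, resolve them against the page's base URL, and download each one the same way.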


Maybe the Lobo browser can help you. It is a free, open-source browser written entirely in Java, and it provides some JAR libraries that can be added to your project.

