How do I programmatically access websites from Java? Downloading HTML, saving cookies, filling in and submitting forms, etc.
Downloading a website's HTML is not a problem. However, making use of that HTML is not always so simple.
I was wondering whether there is a library that makes it easier to send HTTP GET/POST requests matching a given form. That would help with logging in to websites, including over SSL.
I am sure there are hundreds of libraries that make such things possible. What would be the best practices in this area?
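Before reaching for a library, note that the standard library alone can already send a form-encoded POST and keep session cookies. Below is a minimal sketch using `HttpURLConnection` plus a process-wide `CookieManager`; the `/login` endpoint and the `user`/`pass` field names are invented for illustration (the demo posts against a throwaway local server rather than a real site). For an `https://` URL the same code works, since `HttpURLConnection` negotiates SSL itself.

```java
import com.sun.net.httpserver.HttpServer;

import java.io.IOException;
import java.io.OutputStream;
import java.net.CookieHandler;
import java.net.CookieManager;
import java.net.HttpURLConnection;
import java.net.InetSocketAddress;
import java.net.URL;
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;
import java.util.Scanner;

public class FormPost {

    /** POSTs application/x-www-form-urlencoded data and returns the response body. */
    public static String postForm(String url, String formData) throws IOException {
        HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        conn.setRequestProperty("Content-Type", "application/x-www-form-urlencoded");
        try (OutputStream out = conn.getOutputStream()) {
            out.write(formData.getBytes(StandardCharsets.UTF_8));
        }
        // Read the whole response body as one token.
        try (Scanner s = new Scanner(conn.getInputStream(), "UTF-8").useDelimiter("\\A")) {
            return s.hasNext() ? s.next() : "";
        }
    }

    /** Demo: runs a throwaway local server and "logs in" against it. */
    public static String demo() throws IOException {
        // A process-wide CookieManager makes HttpURLConnection remember Set-Cookie
        // headers, so a login session carries over to later requests automatically.
        CookieHandler.setDefault(new CookieManager());

        HttpServer server = HttpServer.create(new InetSocketAddress(0), 0);
        server.createContext("/login", exchange -> {
            byte[] req = exchange.getRequestBody().readAllBytes();
            byte[] resp = ("received: " + new String(req, StandardCharsets.UTF_8))
                    .getBytes(StandardCharsets.UTF_8);
            exchange.getResponseHeaders().add("Set-Cookie", "session=abc123");
            exchange.sendResponseHeaders(200, resp.length);
            exchange.getResponseBody().write(resp);
            exchange.close();
        });
        server.start();
        try {
            // Field names are URL-encoded, exactly as a browser would submit the form.
            String body = "user=" + URLEncoder.encode("alice", "UTF-8")
                        + "&pass=" + URLEncoder.encode("s3cret!", "UTF-8");
            return postForm("http://localhost:" + server.getAddress().getPort() + "/login", body);
        } finally {
            server.stop(0);
        }
    }

    public static void main(String[] args) throws IOException {
        System.out.println(demo()); // prints the body the server echoed back
    }
}
```

The drawback of this approach is that you must build the form body yourself from the page's HTML; the libraries suggested in the answers below parse the form for you.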
You can use HttpUnit to programmatically examine a Web page, follow links, explore tables, work with forms, etc. See examples in the HttpUnit Cookbook.
Note: HttpUnit is meant for unit-testing, but perhaps it's OK for your purposes too.
You might want to use the Java HttpUnit library.
It even handles cookies and JavaScript.
See HttpClient from Apache.
You don't use HTML for this; you need some kind of web service, or you can show the page in a WebView. Even then, if you want to do more with it, the page needs a callback URL or something similar that you can capture, so you can extract parameters from the URL.
By the way, I see Android in the tags. Is this actually an Android question, or is it more general? I can't really tell what exactly you are looking for here.
I found Selenium via the Selenium AndroidDriver, but I haven't tested it yet.