
Getting the approximate load time of a web page in Java

In a program I am writing in Java, I generate random (existing) addresses. For each address I want to compute something, but I don't want to run the computation if the page is very large, because that would take too long.

So I thought that if I knew approximately how long it would take to fetch the information, I could decide whether or not to run that computation.

Precision is not important; I wouldn't mind if it takes a second longer than estimated (although I wouldn't want it to exceed around 5-6 seconds).

I am generating articles from Wikipedia, if it helps.

Thank you in advance.


If you mean what I think you mean, you could do an HTTP HEAD request for the resource. The web server would then reply with the headers but not the content. If it sends a Content-Length header, you know how big the page is.
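As a minimal sketch of the HEAD approach, here is one way to do it with the `java.net.http.HttpClient` API (Java 11+). The choice of that client, the 5-second timeout, and the example Wikipedia URL are assumptions for illustration, not part of the original answer:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.time.Duration;
import java.util.OptionalLong;

public class HeadCheck {

    // Builds a HEAD request for the given URL. HEAD asks the server for
    // headers only, so the article body is never downloaded.
    static HttpRequest headRequest(String url) {
        return HttpRequest.newBuilder(URI.create(url))
                .method("HEAD", HttpRequest.BodyPublishers.noBody())
                .timeout(Duration.ofSeconds(5)) // assumed timeout policy
                .build();
    }

    // Returns the Content-Length reported by the server, or empty
    // if the server did not send that header.
    static OptionalLong contentLength(String url) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        HttpResponse<Void> resp =
                client.send(headRequest(url), HttpResponse.BodyHandlers.discarding());
        return resp.headers().firstValueAsLong("Content-Length");
    }

    public static void main(String[] args) throws Exception {
        // Hypothetical example page; any Wikipedia article URL would do.
        contentLength("https://en.wikipedia.org/wiki/Java_(programming_language)")
                .ifPresentOrElse(
                        len -> System.out.println("Page size: " + len + " bytes"),
                        () -> System.out.println("Server did not report a size"));
    }
}
```

Note that the reported length is the size of the transferred representation (possibly compressed), so treat it as a rough signal rather than an exact page size.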


There are several ways of doing this (if the server supports it), for example with URLConnection or Apache HttpClient:

  • for the first, you can use URLConnection.getContentLength();

  • for the second, request only the headers of the page: client.execute(requestHead); then read the Content-Length header from the response.

The drawback of either method is that the server sometimes doesn't report a size at all, in which case you end up with a value of -1.
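The URLConnection route above, including the -1 case, might look like the following sketch. The `shouldSkip` helper, the 500 KB threshold, and the policy of still processing pages of unknown size are all assumptions for illustration:

```java
import java.net.HttpURLConnection;
import java.net.URL;

public class SizeGate {

    // Decide whether to skip an article given the reported length.
    // A server that omits Content-Length yields -1; this sketch chooses
    // to process such pages anyway (one possible policy, not the only one).
    static boolean shouldSkip(long contentLength, long maxBytes) {
        return contentLength > maxBytes; // -1 never exceeds maxBytes
    }

    public static void main(String[] args) throws Exception {
        HttpURLConnection conn = (HttpURLConnection)
                new URL("https://en.wikipedia.org/wiki/Special:Random").openConnection();
        conn.setRequestMethod("HEAD");          // headers only, no body download
        long len = conn.getContentLengthLong(); // -1 when the server omits it
        System.out.println(shouldSkip(len, 500_000) ? "skip" : "process");
        conn.disconnect();
    }
}
```

getContentLengthLong() (Java 7+) is preferable to getContentLength() here, since the latter truncates sizes above Integer.MAX_VALUE.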

