Java - Fastest way, and best code to load a URL and get a response from the server

I was curious what the best and FASTEST way is to get a response from the server, say if I used a for loop to load a URL that returns an XML file. Which way could I use to load the URL and get the response 10 times in a row? Speed is the most important thing. I know it can only go as fast as my internet connection allows, but I need a way to load the URL as fast as my connection will allow and then put the whole output of the URL in a string so I can append it to a JTextArea. This is the code I've been using, but I'm looking for faster alternatives if possible:

int times = Integer.parseInt(jTextField3.getText());

for (int abc = 0; abc != times; abc++) {
    try {
        URL gameHeader = new URL(jTextField2.getText());
        InputStream in = gameHeader.openStream();
        byte[] buffer = new byte[1024];
        try {
            for (int cwb; (cwb = in.read(buffer)) != -1;) {
                jTextArea1.append(new String(buffer, 0, cwb));
            }
        } catch (IOException e) {}
    } catch (MalformedURLException e) {} catch (IOException e) {}
}

Is there anything that would be faster than this?

Thanks

-CLUEL3SS


This seems like a job for Java NIO (non-blocking I/O). This article is from the Java 1.4 era but will still give you a good understanding of how to set up NIO. NIO has evolved a lot since then, so you may need to look up the API for Java 6 or Java 7 to find out what's new.

This solution is probably best as an async option. Basically, it will allow you to load 10 URLs without waiting for each one to complete before moving on and loading another; a sketch of the idea follows.
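
For illustration, here is a minimal sketch of that concurrent idea. Note it uses a java.util.concurrent thread pool rather than raw NIO selectors (my substitution, named plainly: it gets the same "don't wait for each fetch" effect with far less code), and the URL and count are placeholders:

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.URL;
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.*;

public class ConcurrentFetch {
    public static void main(String[] args) throws Exception {
        final String url = "http://example.com/feed.xml"; // placeholder URL
        final int times = 10;

        ExecutorService pool = Executors.newFixedThreadPool(times);
        List<Future<String>> results = new ArrayList<Future<String>>();

        // Submit all fetches at once; none waits for the previous one to finish.
        for (int i = 0; i < times; i++) {
            results.add(pool.submit(new Callable<String>() {
                public String call() throws Exception {
                    BufferedReader in = new BufferedReader(
                            new InputStreamReader(new URL(url).openStream(), "UTF-8"));
                    try {
                        StringBuilder sb = new StringBuilder();
                        char[] buf = new char[8192];
                        for (int n; (n = in.read(buf)) != -1;) {
                            sb.append(buf, 0, n);
                        }
                        return sb.toString();
                    } finally {
                        in.close();
                    }
                }
            }));
        }

        // Collect the responses in order; get() blocks until that fetch is done.
        for (Future<String> f : results) {
            System.out.println(f.get());
        }
        pool.shutdown();
    }
}

In a Swing app you would hand each result back to the JTextArea via SwingUtilities.invokeLater(), since Swing components must only be touched on the event dispatch thread.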


You can't load text this way, as the 1024-byte buffer boundary could split a multi-byte encoded character in two.

Copy all the data into a ByteArrayOutputStream and call toString() on it, or read text as text using a BufferedReader (sketched below).
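
A minimal sketch of the reader approach; readUrlAsText is a hypothetical helper name, and UTF-8 is an assumption (substitute whatever charset the server actually sends):

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.net.URL;

// Hypothetical helper: decodes the stream as characters, so a multi-byte
// character can never be split at a buffer boundary.
static String readUrlAsText(String url) throws IOException {
    BufferedReader reader = new BufferedReader(
            new InputStreamReader(new URL(url).openStream(), "UTF-8")); // assumed charset
    try {
        StringBuilder sb = new StringBuilder();
        char[] buf = new char[8192];
        for (int n; (n = reader.read(buf)) != -1;) {
            sb.append(buf, 0, n);
        }
        return sb.toString();
    } finally {
        reader.close();
    }
}

Usage then collapses to one line: jTextArea1.append(readUrlAsText(jTextField2.getText()));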


Use a BufferedReader; use a much larger buffer size than 1024; don't swallow exceptions. You could also try re-using the same URL object instead of creating a new one each time, which might help with connection pooling. A sketch applying all three suggestions is below.
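
Here is one way those suggestions could look applied to the original loop (a sketch only, assuming it runs in the question's form class and that the surrounding method handles the MalformedURLException the URL constructor can throw):

URL gameHeader = new URL(jTextField2.getText()); // created once, reused every pass
for (int i = 0; i < times; i++) {
    try {
        // 8192-char buffer instead of 1024 bytes, and characters instead of raw bytes;
        // the no-charset InputStreamReader uses the platform default encoding.
        BufferedReader in = new BufferedReader(
                new InputStreamReader(gameHeader.openStream()), 8192);
        try {
            char[] buf = new char[8192];
            for (int n; (n = in.read(buf)) != -1;) {
                jTextArea1.append(new String(buf, 0, n));
            }
        } finally {
            in.close();
        }
    } catch (IOException e) {
        e.printStackTrace(); // at minimum, report the failure instead of ignoring it
    }
}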

But why would you want to read the same URL 10 times in a row?
