
SAXParserFactory URL Timeouts

I have the following piece of code:

try {
    SAXParserFactory spf = SAXParserFactory.newInstance();
    SAXParser sp = spf.newSAXParser();

    /* Get the XMLReader of the SAXParser we created. */
    XMLReader r = sp.getXMLReader();

    // This handles the XML and populates the entries array
    XMLHandler handler = new XMLHandler();

    // register event handlers
    r.setContentHandler(handler);
    String url = "http://news.library.ryerson.ca/api/isbnsearch.php?isbn=" + ISBN;
    r.parse(url);

    return handler.getEntries();
}

This code works fine most of the time, but there are cases where a user enters the ISBN of a popular book with 100+ related ISBNs (Harry Potter, for example). When that happens, the XML feed does not break, but it takes longer to load (30+ seconds in extreme cases). While the page is loading, the connection is never dropped; it just takes its time.

Is there a way to increase the timeout time for the function?

Thanks


// Open the URL as a stream, so the parse does not time out prematurely
String u = "http://foobar/isbnsearch.php?isbn=" + ISBN;
URL url = new URL(u);
InputStream stream = url.openStream();

r.parse(new InputSource(stream));
stream.close();

Solved this one myself by adding the code above.
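An alternative not shown in the thread: instead of relying on `URL.openStream()`'s defaults, you can set explicit connect and read timeouts on the `URLConnection` before handing the stream to the parser. Below is a minimal sketch; the URL is the one from the question, the timeout values are illustrative, and `DefaultHandler` stands in for the asker's `XMLHandler` class.

```java
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;

import javax.xml.parsers.SAXParser;
import javax.xml.parsers.SAXParserFactory;

import org.xml.sax.InputSource;
import org.xml.sax.XMLReader;
import org.xml.sax.helpers.DefaultHandler;

public class IsbnFetch {

    // Open the URL with explicit timeouts. openConnection() itself performs
    // no network I/O; the timeouts take effect when the stream is first read.
    static HttpURLConnection openWithTimeouts(String urlString,
                                              int connectMs, int readMs) throws Exception {
        HttpURLConnection conn = (HttpURLConnection) new URL(urlString).openConnection();
        conn.setConnectTimeout(connectMs); // max time to establish the connection
        conn.setReadTimeout(readMs);       // max idle time between reads, not total time
        return conn;
    }

    // Parse the feed through the timed connection.
    static void parseFeed(String isbn) throws Exception {
        HttpURLConnection conn = openWithTimeouts(
                "http://news.library.ryerson.ca/api/isbnsearch.php?isbn=" + isbn,
                10_000,   // give up if connecting takes more than 10 s
                60_000);  // tolerate slow responses up to 60 s between reads

        SAXParser sp = SAXParserFactory.newInstance().newSAXParser();
        XMLReader r = sp.getXMLReader();
        r.setContentHandler(new DefaultHandler()); // the asker's XMLHandler would go here

        try (InputStream in = conn.getInputStream()) {
            r.parse(new InputSource(in));
        }
    }
}
```

Note that `setReadTimeout` bounds the gap between successive reads, not the total download time, so a slow-but-steady feed like the 100+ ISBN case will still complete.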
