Output large XML from a resultset
We have an application in which an XML string is created from a stored proc resultset and transformed using XSLT to return to the calling servlet. This works fine with smaller datasets but causes an out-of-memory error with large amounts of data. What would be the ideal solution in this case?
XSLT transformations, in general, require the entire dataset to be loaded into memory, so the easiest thing is to get more memory.
If you can rewrite your XSLT, there are Streaming Transformations for XML (STX), which allow for incremental processing of the data.
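Alongside STX, XSLT 3.0 also supports streaming via `<xsl:mode streamable="yes"/>`; in Java this requires Saxon-EE. A minimal sketch using Saxon's s9api, assuming a streamable stylesheet and Saxon-EE on the classpath (class and method names here are only an illustration of the approach, not your application's code):

```java
import net.sf.saxon.s9api.*;
import javax.xml.transform.stream.StreamSource;
import java.io.File;
import java.io.OutputStream;

public class StreamingTransform {

    // Transforms source XML with a streamable XSLT 3.0 stylesheet without
    // building the whole document tree in memory (requires Saxon-EE).
    public static void transform(File stylesheet, File source, OutputStream out)
            throws SaxonApiException {
        Processor processor = new Processor(true);           // true = licensed (EE) features
        XsltCompiler compiler = processor.newXsltCompiler();
        XsltExecutable executable = compiler.compile(new StreamSource(stylesheet));
        Xslt30Transformer transformer = executable.load30();

        Serializer serializer = processor.newSerializer(out);
        // applyTemplates pushes events from the source through the streamable
        // mode instead of materialising the full input tree first.
        transformer.applyTemplates(new StreamSource(source), serializer);
    }
}
```

The essential difference from a plain JAXP transform is that, with a streamable mode, memory use is bounded by the window of the document currently being processed rather than by the document's total size.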
If you're processing the entire XML document at once, then it sounds like you'll need to allocate more memory to the Java heap. But that only works up to the defined maximum heap size. Do you know a reasonable maximum dataset size, or is it unbounded?
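If raising the heap is the route you take, the standard HotSpot flags look like the following (the values are placeholders; in a servlet container such as Tomcat they would typically go into CATALINA_OPTS rather than a direct `java` invocation):

```
java -Xms512m -Xmx2g -jar your-app.jar
```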
Why do you need the database to generate the XML?
A few important things to note:
You mentioned it works fine functionally with small datasets but goes out of memory with large ones. You need to identify whether it is the creation of the dataset that causes the out-of-memory error, or the transfer of the dataset within the same process; the heap-dump sketch below is one way to find out.
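One way to answer that question is to capture a heap dump at the moment of failure using the standard HotSpot flags and open it in a heap analyser such as Eclipse MAT; whichever objects dominate the dump point to the phase at fault. The dump path below is only an example:

```
java -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/tmp/dumps -jar your-app.jar
```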
Something in your code is keeping many objects alive in memory. Re-check your code and null out references explicitly after use; this makes life easier for the garbage collector. You can also play with the JVM's MaxPermSize setting, which gives you additional space for strings.
This approach has a limitation: even if you are able to transfer large datasets for a single user, it may still go out of memory with multiple concurrent users.
A suggestion that might work for you:
Break this into an asynchronous process: make the creation of the large dataset one process and the downloading of that dataset a separate one.
While making the dataset available for download, you can control memory consumption using stream-based downloading, as sketched below.
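A minimal sketch of such a download servlet, assuming the asynchronous job has already written the XML to a file on disk (the class name, path layout, and `jobId` parameter are hypothetical):

```java
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import java.io.*;

// Hypothetical servlet: streams a previously generated XML file to the client
// in fixed-size chunks, so the whole document is never held in memory.
public class XmlDownloadServlet extends HttpServlet {

    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        // Assumed layout: the async job wrote its result to /data/exports/<jobId>.xml.
        // Input validation of jobId is omitted for brevity.
        File file = new File("/data/exports", req.getParameter("jobId") + ".xml");

        resp.setContentType("text/xml");
        resp.setHeader("Content-Length", Long.toString(file.length()));

        try (InputStream in = new BufferedInputStream(new FileInputStream(file));
             OutputStream out = resp.getOutputStream()) {
            byte[] buffer = new byte[8192];
            int read;
            while ((read = in.read(buffer)) != -1) {
                out.write(buffer, 0, read);   // only 8 KB is resident at any time
            }
        }
    }
}
```

Because only a small fixed-size buffer is ever resident per request, memory use stays flat no matter how large the exported XML grows or how many users download concurrently.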