
Out of memory exception when indexing files with SolrJ

I wrote a simple program with SolrJ that indexes files, but after about a minute it crashes with java.lang.OutOfMemoryError: Java heap space.

I use Eclipse and my machine has about 2GB of memory. I set -Xms1024M -Xmx2048M both in Tomcat's VM arguments and in my application's Debug Configuration, uncommented maxBufferedDocs in solrconfig.xml and set it to 100, then ran the application again, but it still crashes as soon as it reaches files larger than 500MB.

Is there any configuration for indexing large files with SolrJ? The relevant part of my SolrJ code is below:

import java.io.File;
import org.apache.solr.client.solrj.impl.CommonsHttpSolrServer;
import org.apache.solr.client.solrj.request.AbstractUpdateRequest.ACTION;
import org.apache.solr.client.solrj.request.ContentStreamUpdateRequest;

String urlString = "http://localhost:8983/solr/file";
CommonsHttpSolrServer solr = new CommonsHttpSolrServer(urlString);

// Send the file to the extracting request handler (Solr Cell)
ContentStreamUpdateRequest req = new ContentStreamUpdateRequest("/update/extract");

req.addFile(file);
req.setParam("literal.id", file.getAbsolutePath());
req.setParam("literal.name", file.getName());
req.setAction(ACTION.COMMIT, true, true); // commit, waitFlush, waitSearcher

solr.request(req);


Are you also setting the heap size parameters when running the Java class in Eclipse?

Run -> Run Configurations -> <Class Name> -> Arguments -> VM arguments
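Note that the Eclipse VM arguments only affect the client JVM. For Tomcat's own JVM, the heap flags have to reach Tomcat's startup scripts; a common way (a sketch, assuming a standard Tomcat layout — the path below is hypothetical) is a bin/setenv.sh file:

```shell
# Hypothetical bin/setenv.sh for Tomcat: CATALINA_OPTS is picked up by
# catalina.sh and applied to the Tomcat JVM that hosts Solr.
export CATALINA_OPTS="-Xms1024m -Xmx2048m"
echo "$CATALINA_OPTS"
```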


Is Solr running on the same machine as your SolrJ client? There may be memory constraints on the machine where Solr is running. How much free memory do you have once Solr starts? You will probably need more memory available on that box.
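To check whether the -Xmx setting actually took effect in a given JVM, a small plain-JDK snippet (no Solr needed) can print the heap ceiling and current free heap:

```java
// Prints the heap limits of the JVM it runs in; run it with the same
// VM arguments as the indexing program to verify -Xmx was applied.
public class HeapCheck {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long maxMb = rt.maxMemory() / (1024 * 1024);   // ceiling set by -Xmx
        long freeMb = rt.freeMemory() / (1024 * 1024); // free within current heap
        System.out.println("max heap:  " + maxMb + " MB");
        System.out.println("free heap: " + freeMb + " MB");
    }
}
```

If "max heap" is far below 2048 MB, the VM arguments are not reaching the process that crashes.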

Try committing after every document and see if that works around the problem temporarily.

