out of heap space memory error
I am trying to run the CoreNLP package with the following program:
package corenlp;

import edu.stanford.nlp.pipeline.*;
import java.io.IOException;

/**
 *
 * @author Karthi
 */
public class Main {

    /**
     * @param args the command line arguments
     */
    public static void main(String[] args) throws IOException, ClassNotFoundException {
        // TODO code application logic here
        String str="-cp stanford-corenlp-2010-11-12.jar:stanford-corenlp-models-2010-11-06.jar:xom-1.2.6.jar:jgrapht-0.7.3.jar -Xms3g edu.stanford.nlp.pipeline.StanfordCoreNLP [ -props <Main> ] -file <input.txt>";
        args=str.split(" ");
        StanfordCoreNLP scn=new StanfordCoreNLP();
        scn.main(args);
    }
}
I am not sure if the code itself is correct, but I am getting the following error:
Searching for resource: StanfordCoreNLP.properties
Searching for resource: edu/stanford/nlp/pipeline/StanfordCoreNLP.properties
Loading POS Model [edu/stanford/nlp/models/pos-tagger/wsj3t0-18-left3words/left3words-distsim-wsj-0-18.tagger] ... Loading default properties from trained tagger edu/stanford/nlp/models/pos-tagger/wsj3t0-18-left3words/left3words-distsim-wsj-0-18.tagger
Reading POS tagger model from edu/stanford/nlp/models/pos-tagger/wsj3t0-18-left3words/left3words-distsim-wsj-0-18.tagger ... Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
at edu.stanford.nlp.tagger.maxent.MaxentTagger.readModelAndInit(MaxentTagger.java:704)
at edu.stanford.nlp.tagger.maxent.MaxentTagger.readModelAndInit(MaxentTagger.java:649)
at edu.stanford.nlp.tagger.maxent.MaxentTagger.<init>(MaxentTagger.java:268)
at edu.stanford.nlp.tagger.maxent.MaxentTagger.<init>(MaxentTagger.java:228)
at edu.stanford.nlp.pipeline.POSTaggerAnnotator.loadModel(POSTaggerAnnotator.java:57)
at edu.stanford.nlp.pipeline.POSTaggerAnnotator.<init>(POSTaggerAnnotator.java:44)
at edu.stanford.nlp.pipeline.StanfordCoreNLP$4.create(StanfordCoreNLP.java:441)
at edu.stanford.nlp.pipeline.StanfordCoreNLP$4.create(StanfordCoreNLP.java:434)
at edu.stanford.nlp.pipeline.AnnotatorPool.get(AnnotatorPool.java:62)
at edu.stanford.nlp.pipeline.StanfordCoreNLP.construct(StanfordCoreNLP.java:309)
at edu.stanford.nlp.pipeline.StanfordCoreNLP.<init>(StanfordCoreNLP.java:347)
at edu.stanford.nlp.pipeline.StanfordCoreNLP.<init>(StanfordCoreNLP.java:337)
at edu.stanford.nlp.pipeline.StanfordCoreNLP.<init>(StanfordCoreNLP.java:329)
at edu.stanford.nlp.pipeline.StanfordCoreNLP.<init>(StanfordCoreNLP.java:319)
at corenlp.Main.main(Main.java:22)
Java Result: 1
I tried giving these values in the VM options in NetBeans, but for each value I get an error:
-Xms3g
run:
Error occurred during initialization of VM
Incompatible initial and maximum heap sizes specified
Java Result: 1
BUILD SUCCESSFUL (total time: 0 seconds)
-Xmx3g
run:
Error occurred during initialization of VM
Could not create the Java virtual machine.
Could not reserve enough space for object heap
Java Result: 1
BUILD SUCCESSFUL (total time: 0 seconds)
-Xms3g -Xmx4g
run:
Could not create the Java virtual machine.
Invalid maximum heap size: -Xmx4g
The specified size exceeds the maximum representable size.
Java Result: 1
BUILD SUCCESSFUL (total time: 0 seconds)
Which OS are you running this on? Is it a 64-bit system? If not, you are pretty much restricted in how much heap you can allocate to a single Java process. Try running with -Xms1024M -Xmx1024M and see if that solves your issue.
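To check whether the JVM actually picked up the heap flags you set, a quick sanity check (plain Java, no CoreNLP needed) is to print the maximum heap the running JVM will use. The class name `HeapCheck` here is just chosen for illustration:

```java
// Run with e.g. `java -Xmx1024m HeapCheck` and compare the printed
// value against the -Xmx you passed in the VM options.
public class HeapCheck {
    public static void main(String[] args) {
        // maxMemory() returns the maximum number of bytes the JVM will
        // attempt to use (Long.MAX_VALUE if there is no inherent limit).
        long maxBytes = Runtime.getRuntime().maxMemory();
        System.out.println("Max heap: " + (maxBytes / (1024 * 1024)) + " MB");
    }
}
```

If the printed value does not match what you configured, the flag is being set in the wrong place (for example, on the program arguments instead of the VM options).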
Try with the runtime parameters:

java -cp ... -XX:+AggressiveHeap -jar jarfile

or

java -cp ... -XX:MaxHeapFreeRatio=70 -XX:+UseLargePages -jar jarfile