Hibernate causes out-of-memory exception when saving a large number of entities
In my application I'm using CSVReader and Hibernate to import a large number of entities (1 500 000 or more) into the database from a CSV file. The code looks like this:
    Session session = headerdao.getSessionFactory().openSession();
    Transaction tx = session.beginTransaction();
    int count = 0;
    String[] nextLine;
    while ((nextLine = reader.readNext()) != null) {
        try {
            if (nextLine.length == 23
                    && Integer.parseInt(nextLine[0]) > lastIdInDB) {
                JournalHeader current = parseJournalHeader(nextLine);
                current.setChain(chain);
                session.save(current);
                count++;
                if (count % 100 == 0) {
                    session.flush();
                    tx.commit();
                    session.clear();
                    tx.begin();
                }
                if (count % 10000 == 0) {
                    LOG.info(count);
                }
            }
        } catch (NumberFormatException e) {
            e.printStackTrace();
        } catch (ParseException e) {
            e.printStackTrace();
        }
    }
    tx.commit();
    session.close();
With large enough files (somewhere around 700 000 lines) I get an out-of-memory exception (heap space).
The problem seems to be Hibernate-related, because if I comment out just the line session.save(current); it runs fine. With it uncommented, the task manager shows the memory usage of javaw continuously increasing, and at some point the parsing gets really slow and the application crashes.
parseJournalHeader() does nothing special; it just builds an entity from the String[] that the CSV reader returns.
The Session keeps every object you save in its first-level cache. You are already doing the right thing by flushing and clearing periodically, but other things can still prevent those entities from being garbage collected.
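For example, if the second-level cache or the query cache is enabled, each saved entity can be referenced there as well. A minimal sketch of how you might rule that out when building the SessionFactory (assuming you configure it programmatically; the same properties can go in hibernate.cfg.xml):

    Configuration configuration = new Configuration().configure();
    // Keep saved entities out of the shared caches so they can be collected.
    configuration.setProperty("hibernate.cache.use_second_level_cache", "false");
    configuration.setProperty("hibernate.cache.use_query_cache", "false");
    // Let Hibernate batch the generated INSERTs; match your flush interval.
    configuration.setProperty("hibernate.jdbc.batch_size", "100");
    SessionFactory sessionFactory = configuration.buildSessionFactory();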
Try using a StatelessSession instead.
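Here is a minimal sketch of the same import loop rewritten with a StatelessSession (reusing your parseJournalHeader(), chain, and lastIdInDB). insert() executes the INSERT immediately and keeps no reference to the entity, so nothing piles up in memory:

    StatelessSession session = headerdao.getSessionFactory().openStatelessSession();
    Transaction tx = session.beginTransaction();
    int count = 0;
    String[] nextLine;
    while ((nextLine = reader.readNext()) != null) {
        try {
            if (nextLine.length == 23
                    && Integer.parseInt(nextLine[0]) > lastIdInDB) {
                JournalHeader current = parseJournalHeader(nextLine);
                current.setChain(chain);
                // No first-level cache: the row is written right away and the
                // entity becomes eligible for garbage collection immediately.
                session.insert(current);
                count++;
                if (count % 10000 == 0) {
                    tx.commit();    // optional: keeps the DB transaction small
                    tx = session.beginTransaction();
                    LOG.info(count);
                }
            }
        } catch (NumberFormatException e) {
            e.printStackTrace();
        } catch (ParseException e) {
            e.printStackTrace();
        }
    }
    tx.commit();
    session.close();

Be aware that a stateless session bypasses cascades, interceptors, and the second-level cache, so any associated objects have to be saved explicitly.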