
How to optimize the processing speed for inserting data using Java?

I have a requirement to read an Excel file with its headers and data, create a table in a MySQL database based on the headers, and insert the values extracted from the file. For that I am using JDBC (with a PreparedStatement) to create the table and insert the data.

It works nicely, but when the number of records grows large, say the file contains 200,000 or more records, it becomes slow. Please guide me on how I can optimize the speed of inserting the data into the DB table.

Thanks, Sameek


To optimize it, you should first reuse the same PreparedStatement object for all of the inserts.

To further optimize the code you can send batches of updates.

e.g. batches of 5:

// prepare the INSERT statement once and reuse it for every row
PreparedStatement pstmt = conn.prepareStatement(sql);
for (int i = 0; i < rows.length; ++i) {
  pstmt.setString(1, rows[i].getName());
  pstmt.setLong(2, rows[i].getId());
  pstmt.addBatch();          // queue the insert instead of executing it right away
  if ((i + 1) % 5 == 0) {
    pstmt.executeBatch();    // send the queued inserts every 5 rows
  }
}
pstmt.executeBatch();        // send any remaining inserts


Wrap your inserts in a transaction. Pseudo code:

1) Begin transaction
2) Create prepared statement
3) Loop over all inserts, setting the prepared statement parameters and executing for each insert
4) Commit transaction
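
A minimal JDBC sketch of that pseudo code, assuming an open Connection conn, a rows array like the one in the answer above, and a placeholder table my_table:

conn.setAutoCommit(false);                            // 1) begin transaction
try {
  PreparedStatement pstmt = conn.prepareStatement(    // 2) create the prepared statement once
      "INSERT INTO my_table (name, id) VALUES (?, ?)");
  for (int i = 0; i < rows.length; ++i) {             // 3) loop over all inserts
    pstmt.setString(1, rows[i].getName());
    pstmt.setLong(2, rows[i].getId());
    pstmt.executeUpdate();
  }
  conn.commit();                                      // 4) commit the transaction
} catch (SQLException e) {
  conn.rollback();                                    // undo everything if any insert fails
  throw e;
} finally {
  conn.setAutoCommit(true);
}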


I'll take the example of Hibernate. Hibernate has the concept of a Session, which holds SQL commands that have not yet been sent to the database. With Hibernate you can perform the inserts and flush the session every 100 inserts, which means the SQL queries are sent in groups of 100. This improves performance because the application communicates with the database once per 100 inserts instead of once per insert.

You can do the same thing in plain JDBC by accumulating inserts on a PreparedStatement and executing the batch every 100 inserts (or whatever interval you want).
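
A minimal sketch of that flush-every-100 Hibernate pattern, assuming a mapped entity class Record, a List<Record> named records, and an already-configured SessionFactory (all placeholder names):

Session session = sessionFactory.openSession();
Transaction tx = session.beginTransaction();
for (int i = 0; i < records.size(); i++) {
  session.save(records.get(i));    // queue the insert in the session
  if ((i + 1) % 100 == 0) {
    session.flush();               // push the pending INSERTs to the database
    session.clear();               // detach saved entities to keep memory usage flat
  }
}
tx.commit();                       // flushes anything left and commits
session.close();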
