Android reading a large file

I have a CSV file in the assets folder with more than 10,000 lines of data. I want to insert this data into a SQLite database when the database is created. I cannot include a prebuilt database because the application is very generic and the model requires a CSV file instead. I don't want to read all 10,000 lines into memory and insert them in one stretch. How do I accomplish this task effectively and efficiently?


Just insert each line immediately after you've read it; don't store the lines in an ArrayList in memory first.

E.g.

while ((line = reader.readLine()) != null) {
    insert(line);
}

You might want to do this in a single transaction so that you can roll back if it fails halfway. On the JDBC side, you may want to consider PreparedStatement#addBatch()/#executeBatch() to execute the inserts in batches. An example can be found in this answer.
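On Android there is no JDBC by default, but the same idea maps onto SQLiteDatabase transactions and a compiled android.database.sqlite.SQLiteStatement. A minimal sketch, assuming a hypothetical items table with two text columns and a file named data.csv in assets (all of these names are placeholders, not from the question):

void importCsv(Context context, SQLiteDatabase db) throws IOException {
    // Compile the INSERT once and rebind values for every row.
    SQLiteStatement insert =
            db.compileStatement("INSERT INTO items (col1, col2) VALUES (?, ?)");
    BufferedReader reader = new BufferedReader(
            new InputStreamReader(context.getAssets().open("data.csv")));
    db.beginTransaction();
    try {
        String line;
        while ((line = reader.readLine()) != null) {
            String[] fields = line.split(","); // naive split; no quoted fields
            insert.bindString(1, fields[0]);
            insert.bindString(2, fields[1]);
            insert.executeInsert();
            insert.clearBindings();
        }
        // Mark the transaction successful so endTransaction() commits it.
        // If anything throws before this point, endTransaction() rolls back instead.
        db.setTransactionSuccessful();
    } finally {
        db.endTransaction();
        reader.close();
    }
}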

Update: as a completely different but more efficient alternative, you can also leave Java out of the picture entirely and use the CSV import facility provided by SQLite.


The simplest and most efficient approach in this case is to read 20, 30, 50 (or maybe 100?) lines at a time from the file and insert them into the database. You will also need to run a few tests to see what number of lines per batch works best, and then use that. :-)
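A rough sketch of that chunked approach, reusing the hypothetical items table and compiled statement from the sketch above; BATCH_SIZE is the value you would tune by measuring on a real device:

static final int BATCH_SIZE = 50; // tune by testing: 20, 30, 50, 100, ...

void importCsvInChunks(SQLiteDatabase db, SQLiteStatement insert,
                       BufferedReader reader) throws IOException {
    int count = 0;
    String line;
    db.beginTransaction();
    try {
        while ((line = reader.readLine()) != null) {
            String[] fields = line.split(",");
            insert.bindString(1, fields[0]);
            insert.bindString(2, fields[1]);
            insert.executeInsert();
            insert.clearBindings();
            if (++count % BATCH_SIZE == 0) {
                // Commit this chunk and open a fresh transaction for the next one.
                db.setTransactionSuccessful();
                db.endTransaction();
                db.beginTransaction();
            }
        }
        db.setTransactionSuccessful(); // commit the final partial chunk
    } finally {
        db.endTransaction();
    }
}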


.import filename tablename

(the sqlite3 command-line shell's import command, from http://old.nabble.com/How-can-I-load-big-file-into-a-sqlite-database--td19202500.html — similar to LOAD DATA in MySQL).

Another option: disable auto-commit and auto-flush, read 20 lines, then commit and flush, and repeat.
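In JDBC terms (not what you would normally use on Android, but it matches the PreparedStatement#addBatch() suggestion above), that idea looks roughly like this, again with a made-up items table:

connection.setAutoCommit(false); // take control of when rows are committed
PreparedStatement ps = connection.prepareStatement(
        "INSERT INTO items (col1, col2) VALUES (?, ?)");
String line;
int count = 0;
while ((line = reader.readLine()) != null) {
    String[] fields = line.split(",");
    ps.setString(1, fields[0]);
    ps.setString(2, fields[1]);
    ps.addBatch();
    if (++count % 20 == 0) {
        ps.executeBatch();   // send the 20 pending inserts to the database
        connection.commit(); // make them durable
    }
}
ps.executeBatch();   // flush whatever is left
connection.commit();
ps.close();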
