SQLite optimization

I am considering using SQLite in a desktop application to persist my model. I plan to load all data into model classes when the user opens a project and write it all back when the user saves. I will write all of the data, not just the delta that changed (since it is hard for me to tell what changed).

The data may contain thousands of rows that I will need to insert. I am afraid that inserting many rows one after another will be slow (and a preliminary test confirms it).

Are there any optimization best practices / tricks for such a scenario?

EDIT: I am using System.Data.SQLite for .NET.


Like Nick D said: If you are going to be doing lots of inserts or updates at once, put them in a transaction. You'll find the results to be worlds apart. I would suggest re-running your preliminary test within a transaction and comparing the results.
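
With System.Data.SQLite, wrapping the inserts in a transaction might look like the sketch below. The connection string, the "items" table, and its columns are placeholders for illustration; the point is that all inserts share one transaction and one reusable parameterized command, so SQLite syncs to disk once at Commit() instead of once per INSERT.

    using System.Data;
    using System.Data.SQLite;

    class BulkInsertExample
    {
        static void Main()
        {
            // "project.db" and the "items" table are placeholders for illustration.
            using (var connection = new SQLiteConnection("Data Source=project.db"))
            {
                connection.Open();
                new SQLiteCommand(
                    "CREATE TABLE IF NOT EXISTS items (id INTEGER PRIMARY KEY, name TEXT)",
                    connection).ExecuteNonQuery();

                // One transaction around all inserts: without it, each INSERT is its own
                // implicit transaction and forces a separate disk sync.
                using (var transaction = connection.BeginTransaction())
                using (var command = new SQLiteCommand(
                    "INSERT INTO items (id, name) VALUES (@id, @name)", connection, transaction))
                {
                    var id = command.Parameters.Add("@id", DbType.Int32);
                    var name = command.Parameters.Add("@name", DbType.String);

                    for (int i = 0; i < 10000; i++)
                    {
                        id.Value = i;
                        name.Value = "row " + i;
                        command.ExecuteNonQuery();   // reuses the same parameterized command
                    }

                    transaction.Commit();            // one commit for the whole batch
                }
            }
        }
    }

Reusing a single parameterized command, as above, also avoids re-parsing the SQL for every row; the transaction is what gives the dramatic speedup, the command reuse is a smaller additional gain.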
