Insert takes too long, code optimization needed
I have some code I use to transfer the values of table1 to table2; the two tables sit in different databases.
It's slow when I have 100,000 records - it takes forever to finish, 10+ minutes.
(Windows Mobile smartphone)
What can I do?
cmd.CommandText = "insert into " + TableName + " select * from sync2." + TableName;
cmd.ExecuteNonQuery();
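For reference, here is the same pattern - attach the second database and copy with a single `INSERT ... SELECT`, inside one transaction - sketched in Python's `sqlite3` rather than the question's C# (the SQLite behavior is identical). The `items` table and the file paths are hypothetical placeholders:

```python
import os
import sqlite3
import tempfile

# Build a throwaway source database with some rows (placeholder schema).
src_path = os.path.join(tempfile.mkdtemp(), "source.db")
src = sqlite3.connect(src_path)
src.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, name TEXT)")
src.executemany("INSERT INTO items VALUES (?, ?)",
                [(i, f"row{i}") for i in range(1000)])
src.commit()
src.close()

# Destination database: attach the source, then copy everything in one
# statement wrapped in a single transaction.
dst = sqlite3.connect(":memory:")
dst.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, name TEXT)")
dst.execute("ATTACH DATABASE ? AS sync2", (src_path,))
with dst:  # one transaction around the whole copy
    dst.execute("INSERT INTO items SELECT * FROM sync2.items")

print(dst.execute("SELECT COUNT(*) FROM items").fetchone()[0])  # → 1000
```

Because this is a single statement, SQLite already runs it as one implicit transaction; if it is still slow, the bottleneck is more likely I/O settings (see the PRAGMA suggestions below) than statement overhead.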
EDIT
The problem is not resolved. I'm still after answers.
1] You can set the following parameters in your connection string:
string connectionString = @"Data Source=E:\myDB.db3; New=true; Version=3; PRAGMA cache_size=20000; PRAGMA page_size=32768; PRAGMA synchronous=off";
This approach has its own limitations; check this link for details.
The settings above increase the effective cache size (cache_size & page_size), but you could lose some data if your system is forcibly shut down (synchronous=off).
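Whether PRAGMAs can go in the connection string depends on the ADO.NET provider; the portable way is to issue them on the open connection. A minimal sketch using Python's `sqlite3` (same PRAGMAs, same trade-offs):

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Each PRAGMA applies to this connection/database only.
conn.execute("PRAGMA cache_size = 20000")   # cache size, in pages
conn.execute("PRAGMA synchronous = OFF")    # faster, but risks data loss on power failure
# page_size only takes effect before the first table is created
# (an existing database would need a VACUUM for it to apply).
conn.execute("PRAGMA page_size = 32768")

print(conn.execute("PRAGMA synchronous").fetchone()[0])  # → 0 (OFF)
```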
2] You can wrap your insert statements inside a transaction, like this:
dbTransaction = dbConnection.BeginTransaction();
dbCommand.Transaction = dbTransaction;
// your individual insert statements here
dbTransaction.Commit();
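The snippet above is C#/ADO.NET; the same idea in Python's `sqlite3` looks like this. Without the explicit transaction, each INSERT is its own commit and forces a sync to storage per row; one transaction amortizes that cost across all rows:

```python
import sqlite3

# isolation_level=None gives autocommit, mirroring the situation in the
# C# code before a transaction is added.
conn = sqlite3.connect(":memory:", isolation_level=None)
conn.execute("CREATE TABLE t (x INTEGER)")

conn.execute("BEGIN")                      # dbConnection.BeginTransaction()
for i in range(10000):
    conn.execute("INSERT INTO t VALUES (?)", (i,))
conn.execute("COMMIT")                     # dbTransaction.Commit()

print(conn.execute("SELECT COUNT(*) FROM t").fetchone()[0])  # → 10000
```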
3] There is one more trick, which is well explained on this page. Check it out.
Hope that helps
Cheers!
As far as I can tell, you are using two SQL statements per row: one to insert it, and then another when you update the entire table. Do you need to update the entire table, or just the rows you are inserting? If not, you could just insert the row with dirty set to 0 in the first place.
You can also try changing your ad-hoc insert statement into a prepared/compiled/parametrized statement. In some database engines that provides a small speed boost.
There are a few options for improvement listed in the SQLite FAQ, here.
Finally, have you figured out what your bottleneck is? I don't know much about the performance of mobile phone applications - is your profiling showing that you are CPU bound or "disk" bound, whatever passes for a disk on that particular phone?
One suggestion, that may help (although you'd need to profile this), would be to call your insert or replace command a single time for multiple records, with multiple values. This would batch the calls to the DB, which potentially would speed things up.
Also, it sounds like you're copying from one DB to another - if the records are not going to exist in the new DB, you can use INSERT instead of INSERT OR REPLACE, which is (at least theoretically) faster.
Neither of these is going to be dramatic, however - this is likely going to still be a slow operation.
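The multi-record idea above can be sketched as follows, again in Python's `sqlite3` rather than C# (table name and chunk size are illustrative). Each statement inserts a chunk of rows via a multi-row VALUES clause, which needs SQLite >= 3.7.11; the chunk size is kept small so the parameter count stays under SQLite's per-statement limit:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER, name TEXT)")
rows = [(i, f"row{i}") for i in range(500)]

chunk = 100  # 100 rows x 2 columns = 200 parameters per statement
with conn:
    for start in range(0, len(rows), chunk):
        batch = rows[start:start + chunk]
        placeholders = ",".join(["(?, ?)"] * len(batch))
        flat = [value for row in batch for value in row]
        conn.execute(f"INSERT INTO t VALUES {placeholders}", flat)

print(conn.execute("SELECT COUNT(*) FROM t").fetchone()[0])  # → 500
```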
Instead of executing the statements individually, you could build up the string and then execute it all at once as a batch; alternatively, look into batch updates.
Alternatively, you could do this all as one statement, which would probably be more efficient. Something like this is what I mean:
http://forums.mysql.com/read.php?61,15029,15029
I hope this helps.
Well, you're using the query processor for each insert. It's going to be much better to use a command, add Parameters to it, Prepare it, and then just set the Parameters in your loop.
I don't know if SQLite supports TableDirect commands. If it does that would be way, way, way faster (it's a few orders of magnitude faster in SQL Compact for example). It would be certainly worth a try.
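The prepare-once, bind-per-row approach described above maps onto Python's `sqlite3` roughly like this (the C# equivalent would use `SQLiteParameter` and `Prepare()`). The SQL is parsed and compiled once; only the bound values change per row, and `executemany` drives the prepared statement over the whole sequence:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER, name TEXT)")

rows = [(i, f"row{i}") for i in range(1000)]
with conn:  # single transaction around the batch
    # One prepared statement reused for every row.
    conn.executemany("INSERT INTO t VALUES (?, ?)", rows)

print(conn.execute("SELECT COUNT(*) FROM t").fetchone()[0])  # → 1000
```

Combining this with a surrounding transaction, as here, is where most of the speedup usually comes from.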
I think you could use a DataTable (I think - or was it a DataSet? Sorry, still a beginner in .NET), then .Fill() the Reader's results into that DataTable, and then there is a BulkCopy or BulkInsert operation that can push it all out into a table in another database (connection).
- Does the SQL statement also take 10+ minutes if you run it directly in SQL Management Studio?
- If not, did you try writing it as a SQL procedure and executing that?
- Did you try setting up the connection pool and/or making the cursor server-sided?
- Run the SQL Profiler (you can call it from Management Studio) if 1. is slow, and add a transaction queue there.