Uploading data to a MySQL database in parallel
I have a MySQL database with a table of about 74 GB. I am currently loading this data with the mysql command-line client; it has been running for more than 10 hours and is still not finished. Is there a way to load the data into the database in parallel? One option would be to split the table data into multiple files and run a separate load on each of them (see the sketch below), but that feels like a hack. Is there an approach that Stack Overflow users follow?
Thank you.
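For reference, the split-and-load hack mentioned above could look roughly like this, assuming the dump has already been split into part files; the table and file names are made up, and each LOAD DATA would run in its own client connection:

    -- session 1:
    LOAD DATA INFILE '/data/part1.tsv' INTO TABLE big_table FIELDS TERMINATED BY '\t';
    -- session 2 (started at the same time from another connection):
    LOAD DATA INFILE '/data/part2.tsv' INTO TABLE big_table FIELDS TERMINATED BY '\t';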
Make sure your source file and your database are stored on fast, unfragmented disks. I have seen it pay off to change the storage engine of the table to speed up the import and change it back afterwards; I would try both InnoDB and MyISAM to see which one imports faster.
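Switching the engine for the duration of the import is just an ALTER TABLE; a rough sketch, assuming a table called big_table and that no InnoDB-only features (foreign keys, transactions) are needed during the load:

    -- switch to MyISAM for the import
    ALTER TABLE big_table ENGINE = MyISAM;
    -- ... run the import ...
    -- switch back afterwards
    ALTER TABLE big_table ENGINE = InnoDB;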
Drop all indexes and add them back when you are done. They would have to be reoptimized anyway, and it is much faster to do that only once at the end. When you add them back, combine them into a single ALTER statement (that is faster).
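As a sketch (index and column names here are hypothetical), drop the secondary indexes before the load and re-add them in one combined ALTER afterwards:

    -- before the load
    ALTER TABLE big_table DROP INDEX idx_name, DROP INDEX idx_created;
    -- ... load the data ...
    -- after the load: one ALTER, so the table is rebuilt only once
    ALTER TABLE big_table
        ADD INDEX idx_name (name),
        ADD INDEX idx_created (created_at);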
Export your data into a convenient bulk-load format. With extended INSERT syntax you can easily pack 10,000 or more table rows into a single statement, i.e. a single line of your file.
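For example, one extended INSERT (one line of the dump) can carry many rows; the columns here are hypothetical:

    INSERT INTO big_table (id, name, created_at) VALUES
        (1, 'alpha', '2011-01-01'),
        (2, 'beta',  '2011-01-02'),
        (3, 'gamma', '2011-01-03');
    -- ... and so on, thousands of rows per statement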
phpMyAdmin and the MySQL documentation describe settings that let you defer as much work as possible during the load, which speeds up the import. Also make sure nothing else is using this database and/or table while you load; concurrent activity can only slow things down.
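For example, the MySQL manual's bulk-loading tips suggest disabling uniqueness and foreign-key checks for the importing session and re-enabling them afterwards:

    SET unique_checks = 0;
    SET foreign_key_checks = 0;
    -- ... run the import ...
    SET unique_checks = 1;
    SET foreign_key_checks = 1;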
Ensure that the character encoding of the file you are loading matches the encoding of the database/table.
Transfer the file to the database server and import it from there; don't stream it from another machine over the network.
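Both of the last two points can be handled directly in the LOAD DATA statement; a sketch assuming a tab-separated file already copied onto the server and a utf8-encoded table:

    -- no LOCAL keyword: the server reads the file itself, nothing is sent over the wire
    LOAD DATA INFILE '/data/big_table.tsv'
        INTO TABLE big_table
        CHARACTER SET utf8
        FIELDS TERMINATED BY '\t';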
Turn off replication and logging if you are using them and can afford to; binary logs alone double the amount of data you have to write.
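If you cannot turn binary logging off globally, it can at least be skipped for the importing session (this requires the SUPER privilege):

    SET sql_log_bin = 0;
    -- ... run the import in this same session ...
    SET sql_log_bin = 1;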
Parallel inserts into a single table will not buy you speed; the writes still contend on the same table. Splitting the table is not a sensible approach either.
You do have to make sure there are no indexes on the table during the load (a quick check is shown below).
It would also help to have the source file on a separate disk/controller from the database files.
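A quick way to verify the table carries no leftover indexes before the bulk load (the table name is hypothetical):

    SHOW INDEX FROM big_table;
    -- ideally only the PRIMARY key (or nothing) should be listed before loading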