
How to import large datasets to MySQL - the right way?

I have a 400MB *.sql file which is a dump of a MySQL database. I now want to import it into a new database. I tried phpMyAdmin, but it breaks after a few minutes because it cannot handle such a large dataset (the maximum file size limit is reached). So I did the following:

Logged in via SSH, selected the database via the mysql tool, and ran the command: source mybackup.sql
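In other words, roughly this (the user, host, and databaseName below are placeholders, not my actual values):

$ ssh user@example.com
$ mysql -u root -p
mysql> use databaseName;
mysql> source mybackup.sql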

The queries are running, but it has been over 3 hours now. Is something wrong, or does it really take this long?

Thanks, WorldSignia


mysql databaseName < mybackup.sql

That is as fast as it gets, I think.
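For completeness, the same command with the usual connection options spelled out (the host, user, and database name here are placeholders):

mysql -h localhost -u myuser -p databaseName < mybackup.sql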


If it's not done already, you can use set autocommit=0 to turn off autocommit before you import. Otherwise InnoDB will COMMIT after every INSERT statement. It may help some.

You can also turn off secondary unique key and foreign key checks to help. See this link for more ideas:

http://dev.mysql.com/doc/refman/5.5/en/optimizing-innodb-bulk-data-loading.html
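Putting both suggestions together inside the mysql client might look roughly like this (a sketch using the dump file from the question; unique_checks and foreign_key_checks are the standard session variables described on that page):

-- turn off per-statement commits and key checks for the bulk load
SET autocommit=0;
SET unique_checks=0;
SET foreign_key_checks=0;

-- run the dump (mybackup.sql is the file from the question)
SOURCE mybackup.sql;

-- commit once at the end and restore the defaults
COMMIT;
SET unique_checks=1;
SET foreign_key_checks=1;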
