Best way to do a MySQL large database export

I usually use mysqldump to export a database. However, when the database is really large, it seems much faster and less intensive to just gzip the database files directly, without involving the MySQL daemon, and copy that to the other database server.
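For reference, a typical mysqldump-and-compress pipeline looks something like this (a sketch; `{db_name}` and any credentials are placeholders, and the flags assume InnoDB tables):

```shell
# Dump one database and compress on the fly.
# --single-transaction gives a consistent snapshot for InnoDB
# without locking tables; --quick streams rows instead of
# buffering the whole result set in memory.
mysqldump --single-transaction --quick {db_name} | gzip > {db_name}.sql.gz
```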

eg:

tar -czvf {db_name}.sql.tgz /var/lib/mysql/{db_name}

Is this a good method of doing this? What are the (dis)advantages?

I also read another post here that mentioned:

rsync /var/lib/mysql/ ...

Would it be a good option to just use rsync to keep a backup db in sync with the development db?


I've used rsync just fine for moving files around and using them on other boxes. I've never done it with MySQL running live, but I've restored from the files in /var/lib/mysql before with no problems. It's a good way to "copy" databases over to your development box. I suggest shutting down MySQL, moving the files over, then starting it back up again. That is how I've done it when necessary.

mysqldump gives you nice, neat SQL code, though, which is handy if you ever need to "tweak" something with sed along the way.
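As an example of that kind of tweak, you can rename a table in the dump before restoring it (file and table names here are made up for illustration):

```shell
# Rewrite every reference to `old_table` in the dump so the
# restore creates `new_table` instead. Backticks match how
# mysqldump quotes identifiers.
sed 's/`old_table`/`new_table`/g' dump.sql > dump_renamed.sql
```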

I'd have no worries about using rsync though. I use it for many purposes including pushing code updates out to client machines.


Re the first method you mention: I think MySQL's official mysqlhotcopy uses that approach, so it can't be entirely wrong.
