
MySQL Advice: huge database constantly updated, but I also want to serve results to a website

I currently have a database with about 5 million rows, where each row is updated with new data daily. I'm doing nearly 200,000 row updates an hour, so my tables are InnoDB. This is my master database, but I also need to report all this information back to a website, which will run mostly SELECTs (very few updates / inserts / deletes).

My question is, what is likely the best option for me? A 12-24 hour delay in the data is completely acceptable, so should I set up a master -> slave arrangement? Or should I dump the master database every 12 hours and simply replace the website's copy, first executing a SQL text file containing the few updates / deletes?

This is a project that will grow quickly over the next 12 months, so I really want to make the best choice now. Our database is flying because we have an extremely powerful server for it, but I want to make sure that once we launch the site, it's optimized as well. For example, if we dump the database daily and replace the old one, I still have to account for the few updates that would happen in between. If a master/slave setup is better suited to this, can you provide links to some docs that would help me optimize that setup?


You can turn on binary logging (which you would have on with a master setup anyway), then manually (or via a cron job) FLUSH LOGS and ship the last binlog(s) to the read slave (say, every 12+ hours).
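As a rough sketch of what that cron job could look like (the paths, host name, binlog prefix, and credentials below are placeholders, not something from the original answer):

#!/bin/sh
# Hypothetical example: rotate the master's binary logs and copy the
# closed ones to the read host. Adjust paths/credentials for your setup.
BINLOG_DIR=/var/lib/mysql
READ_HOST=read-host.example.com

# FLUSH LOGS closes the current binlog and starts a new one
mysql -u admin -p'secret' -e "FLUSH LOGS;"

# The first column of SHOW MASTER STATUS is the binlog now being written to
ACTIVE=$(mysql -u admin -p'secret' -N -e "SHOW MASTER STATUS;" | awk '{print $1}')

# Ship every completed binlog, skipping the one still in use
for f in "$BINLOG_DIR"/mysql-bin.[0-9]*; do
    [ "$(basename "$f")" = "$ACTIVE" ] && continue
    scp "$f" "$READ_HOST:/var/backups/binlogs/"
done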

At that point, you can use the standard MySQL tools (mysqlbinlog) to read the log and import it into your read instance:

mysqlbinlog /path/to/binlog | mysql -u <user> -p

Make sure you read the notes in the mysqlbinlog documentation about the issues with passing multiple binlogs to the mysqlbinlog command.
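For reference, the safe pattern is to hand all of the binlogs to a single mysqlbinlog invocation rather than piping them to mysql one at a time, since statements that rely on temporary tables can span log files. A sketch (file names and user are placeholders):

mysqlbinlog /path/to/binlog.000001 /path/to/binlog.000002 | mysql -u <user> -p

# or collect them into one file first and replay that
mysqlbinlog /path/to/binlog.00000[1-2] > /tmp/binlogs.sql
mysql -u <user> -p < /tmp/binlogs.sql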

This method assumes the read host was initialized with a full snapshot of the database before you start shipping binlogs.
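If you still need to take that initial snapshot, one InnoDB-friendly way (sketched here with placeholder credentials) is a dump that also records the binlog position it corresponds to, so you know which shipped binlogs to apply afterwards:

# On the master: consistent snapshot without locking InnoDB tables;
# --master-data=2 writes the binlog file/position into the dump as a comment
mysqldump -u <user> -p --single-transaction --master-data=2 --all-databases > snapshot.sql

# On the read host: load the snapshot, then apply only binlogs created after it
mysql -u <user> -p < snapshot.sql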
