How to partition a MySQL table that has 2 million records
I have a table that contains around 2 million records, all of which are for the same node. I cannot partition by date, as data is continually inserted and deleted each day.
The remaining data is in string format.
Every 15 minutes, the application backs up this table (only the partial data collected during that 15-minute window); while this operation runs, the table is locked and cannot be accessed.
What kind of partitioning can I use to help solve this problem?
If the problem you're trying to solve is the locking caused by the backup, partitioning won't help with that.
Also, there are better solutions to avoid locking during backups. These solutions work equally well with or without partitioned tables.
If you use InnoDB tables (which you should), you can dump data with transaction isolation instead of locking.
mysqldump --single-transaction mydatabase mytable ...
Read the manual for more information: https://dev.mysql.com/doc/refman/5.7/en/mysqldump.html
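As a rough sketch of how a 15-minute backup job for just this table might look, assuming a database named app_db, a table named node_data, and a backup user (all hypothetical names, not from the question):

# Hypothetical names and paths; adjust to your setup.
# --single-transaction takes a consistent InnoDB snapshot without locking the table;
# --quick streams rows instead of buffering the whole table in memory.
mysqldump --single-transaction --quick -u backup_user -p app_db node_data > /backups/node_data_$(date +%Y%m%d_%H%M).sql

With this approach, inserts and deletes can continue against the table while the dump runs.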
You can also use Percona XtraBackup to create a physical backup instead of a data dump. This also depends on using InnoDB tables. Percona XtraBackup is free software.
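If you go the XtraBackup route, a minimal invocation looks roughly like this (the target directory is illustrative, not from the question):

# Hypothetical backup directory; xtrabackup copies InnoDB data files while the server keeps running.
xtrabackup --backup --target-dir=/data/backups/mysql/

Note that XtraBackup works at the level of the whole server's data files, so it is a fit for full physical backups rather than extracting only the last 15 minutes of rows from one table.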