
How can I optimize the import of this dataset in MySQL?

I've got the following table schema:

CREATE TABLE `alexa` (
  `id` int(10) unsigned NOT NULL,
  `rank` int(10) unsigned NOT NULL,
  `domain` varchar(63) NOT NULL,
  `domainStatus` varchar(6) DEFAULT NULL,
  PRIMARY KEY (`rank`),
  KEY `domain` (`domain`),
  KEY `id` (`id`)
) ENGINE=MyISAM DEFAULT CHARSET=latin1

It takes several minutes to import the data. To me that seems rather slow as we're only talking about a million rows of data.

What can I do to optimize the insert of this data? (I'm already using DISABLE KEYS.)


Use LOAD DATA INFILE or the equivalent command-line tool mysqlimport. This can be around 20 times faster than inserting the rows with individual INSERT statements.
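
A minimal sketch, assuming the data sits in a tab-delimited file at /tmp/alexa.txt with the columns in the same order as the table (the path and delimiter are placeholders):

LOAD DATA INFILE '/tmp/alexa.txt'
  INTO TABLE `alexa`
  FIELDS TERMINATED BY '\t'
  LINES TERMINATED BY '\n'
  (`id`, `rank`, `domain`, `domainStatus`);

The same load from the shell with mysqlimport, which derives the table name from the file name (your_db is a placeholder database name):

mysqlimport --local --fields-terminated-by='\t' your_db /tmp/alexa.txt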

You can also read the "Speed of INSERT Statements" section of the MySQL manual, which has a lot of tips for improving bulk insert performance; a couple of them are sketched below.
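
Two of those tips, illustrated against this table with made-up values: batch many rows into each INSERT statement, and wrap the batch in LOCK TABLES so MyISAM flushes the key buffer once at UNLOCK TABLES instead of after every statement.

LOCK TABLES `alexa` WRITE;
INSERT INTO `alexa` (`id`, `rank`, `domain`, `domainStatus`) VALUES
  (1, 1, 'example.com', 'OK'),
  (2, 2, 'example.org', 'OK'),
  (3, 3, 'example.net', NULL);
UNLOCK TABLES;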

You don't say much about how you are currently inserting the data (besides disabling keys), so it's hard to recommend anything more specific. For example, which programming language are you using? Are you using prepared statements?


Ensure the binary log is disabled if you don't use replication (sql_log_bin is a session variable, and changing it requires the SUPER privilege):

SET sql_log_bin = 0;
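
Putting the pieces together, a complete import session might look roughly like this (the file path is a placeholder, and DISABLE KEYS only defers the non-unique indexes on a MyISAM table):

SET sql_log_bin = 0;                -- skip binary logging for this session
ALTER TABLE `alexa` DISABLE KEYS;   -- defer maintenance of the non-unique indexes
LOAD DATA INFILE '/tmp/alexa.txt'
  INTO TABLE `alexa`
  FIELDS TERMINATED BY '\t';
ALTER TABLE `alexa` ENABLE KEYS;    -- rebuild the deferred indexes in one pass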