Faster way to import thousands of records from a single txt file into a MySQL database?
I have like 50 txt files, each with around 8 columns and 80,000 records. I have written a script in Matlab that reads the files one by one and then puts each one into MySQL using a single INSERT statement per file. However, doing so takes a huge amount of time (even for a single file!). I have also tried phpMyAdmin, which says the file is too big to upload (around 8 MB). So please suggest a workaround, and what is the ideal way to import large txt files? Also, how much time should it normally take to import one such file?
Everything you need is in this manual: http://dev.mysql.com/doc/refman/5.0/en/insert-speed.html
Use the LOAD DATA INFILE syntax: http://dev.mysql.com/doc/refman/5.1/en/load-data.html
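For example, here is a minimal sketch of issuing LOAD DATA INFILE from Matlab's Database Toolbox. The connection details, file path, table name mytable, and tab delimiter are all assumptions; adjust them to your setup:

    % Connect once (ODBC data source name, user, password -- placeholders).
    conn = database('mydb', 'user', 'password');

    % Let the MySQL server parse the whole file in one statement.
    % Use LOAD DATA LOCAL INFILE if the file lives on the client machine.
    sql = ['LOAD DATA INFILE ''/path/to/file1.txt'' ' ...
           'INTO TABLE mytable ' ...
           'FIELDS TERMINATED BY ''\t'' ' ...
           'LINES TERMINATED BY ''\n'''];
    exec(conn, sql);
    close(conn);

Because the server parses the file directly instead of processing one statement per row, a bulk load like this typically finishes in seconds for a file of this size.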
Try mysqlimport.
Also, a table of type MyISAM will import faster; whether you can use it depends on whether you need transactional support (InnoDB).
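Since you are already in Matlab, mysqlimport (the command-line wrapper around LOAD DATA INFILE) can be invoked via system(). A rough sketch with placeholder credentials and paths; note that mysqlimport derives the table name from the file name, so data1.txt loads into table data1:

    % mysqlimport is the command-line interface to LOAD DATA INFILE.
    % --local reads the file from the client rather than the server.
    cmd = ['mysqlimport --local --user=user --password=secret ' ...
           '--fields-terminated-by=''\t'' mydb /path/to/data1.txt'];
    status = system(cmd);   % status == 0 on success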
Use multiple INSERTs. It isn't that expensive, since you still connect to MySQL only once. You can then use a loop to build your values (i.e., your SQL) and execute the query (in Matlab's Database Toolbox that is exec; fetch is for retrieving results).
(See http://spx.arizona.edu/Projects/Database/Accessing%20mySQL%20through%20Matlab.pdf for an example.)
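A rough sketch of that approach, assuming the file parses into a numeric matrix and inserting everything in one multi-row statement (mytable and the tab delimiter are assumptions; strjoin requires Matlab R2013a or later):

    conn = database('mydb', 'user', 'password');   % placeholder credentials
    data = dlmread('/path/to/file1.txt', '\t');    % e.g. 80000 x 8 numeric matrix

    % Build one INSERT ... VALUES (..),(..),... statement instead of 80,000.
    rows = cell(size(data, 1), 1);
    for i = 1:size(data, 1)
        rows{i} = ['(' sprintf('%g,', data(i, 1:end-1)) sprintf('%g', data(i, end)) ')'];
    end
    sql = ['INSERT INTO mytable VALUES ' strjoin(rows', ',')];
    exec(conn, sql);
    close(conn);

If a single statement exceeds the server's max_allowed_packet, split the rows into chunks of a few thousand per INSERT.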