How to transfer big .txt file to MySQL?
I have a very big .txt file containing millions of strings, separated by a separator character.
So, what is the easiest way to transfer all those strings to the database? Is it to use PHP's fopen, split each string out with explode, and insert it into the database?
FILE IS 2 GB.
Use phpMyAdmin to generate the command for you, i.e.:
LOAD DATA LOCAL INFILE 'input_file' INTO TABLE `tablename` FIELDS TERMINATED BY '|' LINES TERMINATED BY '\n'
Quoted from http://vegdave.wordpress.com/2007/05/19/import-a-csv-file-to-mysql-via-phpmyadmin/
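If you would rather run that statement from PHP than from phpMyAdmin, a minimal sketch might look like the following (assuming the same '|' separator and a placeholder table name `tablename`; LOCAL INFILE must be enabled on both client and server):
$mysql = mysql_connect("localhost", "user", "password");
mysql_select_db("database", $mysql);
// LOCAL INFILE makes the client read the file and stream it to the server.
mysql_query("LOAD DATA LOCAL INFILE 'input_file'
             INTO TABLE `tablename`
             FIELDS TERMINATED BY '|'
             LINES TERMINATED BY '\\n'", $mysql);
mysql_close($mysql);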
$mysql = mysql_connect("localhost", "user", "password");
mysql_select_db("database", $mysql);

$filename = "file.txt";

// Read the whole file into memory (note: for a 2 GB file this needs
// at least that much free RAM and a high enough memory_limit).
$handle = fopen($filename, "rb");
$contents = fread($handle, filesize($filename));
fclose($handle);

// Split on the separator and insert each value as its own row.
$array = explode('[separator]', $contents);
foreach ($array as $line)
{
    // Escape the value and wrap it in quotes so the SQL is valid.
    mysql_query("INSERT INTO `table` VALUES ('" . mysql_real_escape_string($line) . "')", $mysql);
}

mysql_close($mysql);
Another solution, but boj's solution is better than this one.
It depends on how much memory is available on the machine. If you are not limited by RAM, you could read the whole file, explode it into an array, and generate the insert query in one go.
You will probably need to generate multiple insert queries because of DB limitations (e.g. max_allowed_packet in MySQL).
If you don't have enough memory available, you will need to read the file in steps (in chunks of bytes) and then generate multiple insert queries as you go.
The fopen and fread functions might be helpful; a sketch of this chunked approach follows.
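A minimal sketch of that chunked approach, assuming the same mysql_* API as above, a '|' separator, and a hypothetical single-column table named `tablename` (adjust the names to your schema):
$mysql = mysql_connect("localhost", "user", "password");
mysql_select_db("database", $mysql);

$handle = fopen("file.txt", "rb");
$leftover  = "";      // partial value cut off at a chunk boundary
$batch     = array(); // values collected for one multi-row INSERT
$batchSize = 1000;    // keep each INSERT well below max_allowed_packet

while (!feof($handle)) {
    // Read a fixed-size chunk instead of loading the whole 2 GB file.
    $chunk = $leftover . fread($handle, 1048576); // 1 MB at a time
    $parts = explode('|', $chunk);
    // The last piece may be incomplete; keep it for the next iteration.
    $leftover = array_pop($parts);

    foreach ($parts as $value) {
        $batch[] = "('" . mysql_real_escape_string($value) . "')";
        if (count($batch) >= $batchSize) {
            mysql_query("INSERT INTO `tablename` VALUES " . implode(",", $batch), $mysql);
            $batch = array();
        }
    }
}
fclose($handle);

// Flush the final partial value and any remaining batch.
if ($leftover !== "") {
    $batch[] = "('" . mysql_real_escape_string($leftover) . "')";
}
if (count($batch) > 0) {
    mysql_query("INSERT INTO `tablename` VALUES " . implode(",", $batch), $mysql);
}
mysql_close($mysql);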