INSERT: inserting multiple rows at once, or one by one?
I'm inserting multiple rows in a table, and I get this message:
MySQL server has gone away
My Query:
INSERT INTO table
(a,b,c,d,e,f,g,h,i,j,k)
VALUES(1,2,3,4,5,6,7,8,9,10,11),(1,2,3,4,5,6,7,8,9,10,11), ...
ON DUPLICATE KEY UPDATE
c=VALUES(c),
d=VALUES(d),
e=VALUES(e),
f=VALUES(f),
g=VALUES(g),
h=VALUES(h),
i=VALUES(i),
j=VALUES(j)
Is it because I stuffed too many values into a single query? (There are around 5000 sets of values coming from an array which I implode with commas.)
If that's the reason, should I insert each row one by one instead? Is that slower than inserting them all at once?
The PHP code:
foreach($data as &$entry) {
    $entry = "('".implode("','", array(
        $entry->ID,
        addslashes($entry->field_1),
        addslashes($entry->field_2),
        ...
    ))."')";
}
$data = implode(',', $data);
$query = "... VALUES{$data} ON ...";
$data is an array of stdClass objects...
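For reference, the full build looks roughly like this. It's only a sketch: it assumes a mysqli connection in $mysqli, swaps addslashes for real_escape_string (addslashes is not a reliable escaping function for MySQL), and only shows the first few fields.

$rows = array();
foreach ($data as $entry) {
    $values = array(
        $mysqli->real_escape_string($entry->ID),
        $mysqli->real_escape_string($entry->field_1),
        $mysqli->real_escape_string($entry->field_2),
        // ... remaining fields
    );
    $rows[] = "('" . implode("','", $values) . "')";
}
$query = "INSERT INTO table (a,b,c,d,e,f,g,h,i,j,k) VALUES "
       . implode(',', $rows)
       . " ON DUPLICATE KEY UPDATE c=VALUES(c), d=VALUES(d)"; // etc.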
edit again :)
So I tried splitting my $data
into smaller arrays of 100 elements each:
$data_chunks = array_chunk($data, 100);
foreach($data_chunks as $data_chunk)
insert_into_db($data_chunk);
and it works; I don't get that error anymore...
So the issue must have been the very long query string...
Now I'm even more confused:
- Is there a length limit on the query, or maybe on PHP arguments in general?
- Is there any difference between inserting row by row and inserting multiple rows at once? Is the array_chunk() worth it?
It could be that your query is taking too long to complete, so MySQL times out and closes the connection. You can alter the server's timeout variables to make it wait longer.
http://dev.mysql.com/doc/refman/5.0/en/gone-away.html
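From PHP it would look roughly like this (a sketch, assuming a mysqli connection in $mysqli; changing the global values needs the SUPER privilege or an edit to my.cnf):

$result = $mysqli->query(
    "SHOW VARIABLES WHERE Variable_name IN ('wait_timeout', 'max_allowed_packet')"
);
while ($row = $result->fetch_assoc()) {
    echo $row['Variable_name'] . ' = ' . $row['Value'] . PHP_EOL;
}

// Raise the idle timeout for the current session only (no config file change needed):
$mysqli->query("SET SESSION wait_timeout = 600"); // seconds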
I think your problem is with *max_allowed_packet*, although the error seems to point in a different direction. Try doing as suggested here: http://dev.mysql.com/doc/refman/5.5/en/packet-too-large.html
Or, before making any changes to the MySQL configuration, simply strlen() your query and find out how long (in bytes) it actually is.
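Something along these lines (a sketch, assuming your built query is in $query and a mysqli connection in $mysqli):

$result = $mysqli->query("SHOW VARIABLES LIKE 'max_allowed_packet'");
$row = $result->fetch_assoc();
$max_packet = (int) $row['Value'];   // bytes

$query_bytes = strlen($query);       // strlen() counts bytes, not characters
printf("query: %d bytes, max_allowed_packet: %d bytes\n", $query_bytes, $max_packet);

if ($query_bytes >= $max_packet) {
    // the packet will be too large; chunk the data before inserting
}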