
MySQL large query issue

Hey guys, I asked this in another question but didn't get a helpful response, partly because I don't think I explained myself properly. So I'm going to try again.

I've been programming a back-end server in VB.NET that uses a MySQL database to store information. Up until a couple of days ago I was using a webhost's MySQL server for this.

I didn't care to renew my webhost, so I've moved everything to a home server to continue work on my program. I've got MySQL 5.5 installed (a newer version than the one on my previous webhost) and so far everything is working perfectly except for one thing.

When this program starts up for the first time, it sends a query containing roughly a million table inserts. The query looks something like "INSERT into blah VALUES(1,1,1,1,1);INSERT into...." and so on. This used to take about 5-10 minutes on my webhost (my server program was running on my home machine and sending the data over the net to the webhost's MySQL).

Now that everything is local, I was hoping this would be faster, but that doesn't really matter; I just need it to work. When I send this query, it locks up for a minute or so and then returns a timeout. When I check the table in the database, it has loaded exactly 1000 items every time I try this.

I'm assuming this is some sort of settings issue; however, I've played with my "my.ini" to see if it would help, and it didn't. I also tried switching to some of the pre-packaged configs (my-huge, my-innodb, etc.) and those didn't help either. I would have assumed that, if anything, it would just take longer, not time out right after 1000 inserts.

And just for some background info: the server machine I'm using has a quad-core Core i7, 8 GB of RAM, and a 1 TB hard drive, running Windows 7.

Any help would be great, thank you.


Your query is probably too large; you can tune how much MySQL will accept in a single packet via the max_allowed_packet option. You can also save some bytes by combining several insert statements into one multi-row insert, like this: INSERT INTO blah VALUES (1,1,1,1,1),(2,2,2,2,2),(3,3,3,3,3). Keep in mind that if such a combined statement fails, every row in it is rejected. Still, in my opinion that's not the smartest way to do it. If your application has to import a huge SQL dump on startup, it can simply invoke the mysql executable, like this: mysql -uroot -ppassword db_name < dump.sql, and you're done. That is probably the most efficient way to accomplish this task.
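To make the max_allowed_packet suggestion concrete: you can raise it at runtime with SET GLOBAL max_allowed_packet = 67108864 (needs SUPER privileges), or put max_allowed_packet=64M under [mysqld] in my.ini and restart. And if you'd rather keep the inserts in your VB.NET code instead of shelling out to the mysql client, here is a minimal sketch of the batching idea, assuming MySQL Connector/NET, a table named blah with five integer columns as in your example, and a made-up LoadRows helper; adjust the names, batch size, and data types to your schema:

Imports System.Collections.Generic
Imports System.Linq
Imports MySql.Data.MySqlClient

Module BulkInsertSketch
    ' Hypothetical helper: "blah" and the integer rows come from the question;
    ' swap in your real table, columns, and data.
    Sub LoadRows(connectionString As String, rows As List(Of Integer()))
        Using conn As New MySqlConnection(connectionString)
            conn.Open()
            Using tx As MySqlTransaction = conn.BeginTransaction()
                ' Rows per multi-row INSERT; keep each statement under max_allowed_packet.
                Const batchSize As Integer = 1000
                For i As Integer = 0 To rows.Count - 1 Step batchSize
                    ' Build "INSERT INTO blah VALUES (..),(..),.." for this batch.
                    Dim valueList = String.Join(",", rows.Skip(i).Take(batchSize) _
                        .Select(Function(r) "(" & String.Join(",", r) & ")"))
                    Using cmd As New MySqlCommand("INSERT INTO blah VALUES " & valueList, conn, tx)
                        cmd.CommandTimeout = 600 ' seconds; the default of 30 is easy to hit on big loads
                        cmd.ExecuteNonQuery()
                    End Using
                Next
                tx.Commit() ' one commit instead of a million
            End Using
        End Using
    End Sub
End Module

Batching plus a single transaction addresses the two usual costs here: a network round trip per statement and a commit after every insert. The mysql < dump.sql route is still simpler if the data already lives in a file.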
