
Efficiently importing a .csv into several related MySQL tables

Could you please tell me how to efficiently let users import their data into MySQL? The problem is that the data generally needs to be inserted into several related tables. Importing a .csv with tens or hundreds of thousands of lines takes a long time and puts a heavy load on the database. Currently I parse the .csv, generate INSERTs (possibly several INSERTs if we need to set attributes in a related table), and insert the data into the database in a loop. How do you handle this? Maybe upload the file to the server and have the server periodically insert the data in small portions? All ideas are appreciated. Thank you.
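Roughly, my current approach issues one round trip per row, something like this (table and column names are made up for illustration):

INSERT INTO mytable (col1, col2) VALUES ('a', 1);
INSERT INTO related (mytable_id, attr) VALUES (LAST_INSERT_ID(), 'x');
INSERT INTO mytable (col1, col2) VALUES ('b', 2);
INSERT INTO related (mytable_id, attr) VALUES (LAST_INSERT_ID(), 'y');
-- ...repeated for every line of the .csv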


If you really need to insert all this data, I don't think you have much of a choice.

I would recommend inserting multiple rows with one INSERT to reduce the number of round trips between the application and the database:

INSERT INTO mytable (....)
VALUES (....),
       (....);
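For the related tables you mention, you can batch the child rows the same way and wrap each portion in a transaction so it is committed as one unit. A sketch with hypothetical table and column names (the generated id is captured into a variable so every child row sees the same parent id):

START TRANSACTION;
INSERT INTO parent (name) VALUES ('example');
SET @parent_id = LAST_INSERT_ID();
INSERT INTO child (parent_id, attr)
VALUES (@parent_id, 'value1'),
       (@parent_id, 'value2');
COMMIT;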

If you are inserting the data for the first time, you could create the indexes after all the data has been inserted. Of course, if you are doing this online (i.e. your insertion process runs concurrently with other operations), you can't do that, since the indexes are shared between all processes.
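For that initial-load case, the idea is to create the table with only its primary key, run the bulk INSERTs, and add the secondary indexes at the end. A sketch with hypothetical names (this only helps when nothing else is reading the table during the load):

CREATE TABLE mytable (
    id   INT AUTO_INCREMENT PRIMARY KEY,
    name VARCHAR(100) NOT NULL
);

-- ...run all the multi-row INSERTs here...

CREATE INDEX idx_mytable_name ON mytable (name);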
