
How to update the MySQL database when a large CSV file gets modified

Initially, I created a database called "sample" and loaded the data from a massive CSV file.

Whenever I have small changes in the .csv file (some data are added/deleted/modified), I have to update the database too. Re-importing the entire (large) .csv file every time is not efficient.

Is there an efficient way to update only the modified data from the .csv file into the database?


There's no simple way of doing this.

One plausible way would be to store the old version of the CSV somewhere, run a diff between the old and new versions, and then use the resulting output to determine what has been added, changed, or removed, and update the database accordingly.
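
For illustration, a rough sketch of that diff step in Python (untested; it assumes the CSV has a header row and an `id` column that serves as the table's primary key, so adapt it to your actual schema):

    import csv

    # Hypothetical file names; keep a copy of the previously imported CSV around.
    OLD_CSV, NEW_CSV = "sample_old.csv", "sample_new.csv"

    def load_rows(path):
        """Read a CSV into {primary_key: row_dict}."""
        with open(path, newline="") as f:
            return {row["id"]: row for row in csv.DictReader(f)}

    old_rows = load_rows(OLD_CSV)
    new_rows = load_rows(NEW_CSV)

    # Set operations on the key views give the three change categories.
    to_insert = [new_rows[k] for k in new_rows.keys() - old_rows.keys()]
    to_delete = list(old_rows.keys() - new_rows.keys())
    to_update = [new_rows[k] for k in new_rows.keys() & old_rows.keys()
                 if new_rows[k] != old_rows[k]]

    # Feed these into INSERT / DELETE / UPDATE statements against MySQL
    # (e.g. with PyMySQL), then save NEW_CSV as the next run's OLD_CSV.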

This is, however, somewhat unreliable and slow, and would take some effort to implement. If you can, it would probably be better to adapt whatever produces the CSV file so that it updates the database directly.


Since you also want to delete entries that no longer exist in the CSV file, you will have to load the complete CSV file every time (and truncate the table first) in order to get a 1:1 copy.
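
If you do reload every time, a plain TRUNCATE followed by MySQL's LOAD DATA is far faster than inserting row by row. A minimal sketch with PyMySQL (untested; the connection details, the table name "sample_table", and the CSV layout are placeholders, and the server must have local_infile enabled):

    import pymysql

    conn = pymysql.connect(host="localhost", user="user", password="secret",
                           database="sample", local_infile=True)
    try:
        with conn.cursor() as cur:
            # Wipe the table, then bulk-load the whole CSV in one statement.
            cur.execute("TRUNCATE TABLE sample_table")
            cur.execute(
                "LOAD DATA LOCAL INFILE '/path/to/data.csv' "
                "INTO TABLE sample_table "
                "FIELDS TERMINATED BY ',' ENCLOSED BY '\"' "
                "LINES TERMINATED BY '\\n' "
                "IGNORE 1 LINES"  # skip the header row
            )
        conn.commit()
    finally:
        conn.close()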

For more convenient synchronization you will probably have to use a scripting language (PHP, Python, etc.).

Sorry, that's all I know...


In my experience it's almost impossible to treat a data file that changes regularly as your "master data set": unless you can somehow generate a diff file that shows where the master data changed, you will always be forced to run through the entire CSV file, query the database for the corresponding record, and then either do nothing (if identical), insert (if new), or update (if modified). In many cases it will even be faster to just drop the table and reload the entire thing, but that can lead to serious operational problems.
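
If you do take that per-row approach, MySQL's INSERT ... ON DUPLICATE KEY UPDATE collapses the "insert if new, update if modified" logic into one statement. A rough sketch (untested; the table name "sample_table" and the columns id/name/price are placeholders, and deletions still have to be handled separately, e.g. by comparing key sets):

    import csv
    import pymysql

    conn = pymysql.connect(host="localhost", user="user", password="secret",
                           database="sample")
    try:
        with conn.cursor() as cur, open("/path/to/data.csv", newline="") as f:
            for row in csv.DictReader(f):
                # New rows are inserted; rows whose primary key already
                # exists are updated in place.
                cur.execute(
                    "INSERT INTO sample_table (id, name, price) "
                    "VALUES (%s, %s, %s) "
                    "ON DUPLICATE KEY UPDATE "
                    "name = VALUES(name), price = VALUES(price)",
                    (row["id"], row["name"], row["price"]),
                )
        conn.commit()
    finally:
        conn.close()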

Therefore, if it's at all possible for you, I'd treat the database as the master data and generate the CSV file from there.
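
Regenerating the file is then trivial: either SELECT ... INTO OUTFILE on the server, or a short script. A sketch of the script variant (untested; table/column names and connection details are placeholders):

    import csv
    import pymysql

    conn = pymysql.connect(host="localhost", user="user", password="secret",
                           database="sample")
    try:
        with conn.cursor() as cur, \
                open("/path/to/data.csv", "w", newline="") as f:
            cur.execute("SELECT id, name, price FROM sample_table")
            writer = csv.writer(f)
            writer.writerow(["id", "name", "price"])  # header row
            writer.writerows(cur.fetchall())
    finally:
        conn.close()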
