
PostgreSQL COPY FROM Command Help

I have a CSV file which is quite large (a few hundred MB) that I am trying to import into a Postgres table. The problem arises when there is a primary key violation (a duplicate record in the CSV file).

If it were a one-off, I could manually filter out those records, but these files are generated by a program that produces such data every hour. My script has to import it into the database automatically.

My question is: is there some way to set a flag in the COPY command, or in Postgres itself, so that it skips the duplicate records and continues importing the file into the table?


My thought would be to approach this in two ways:

  1. Use a utility that can create an "exception report" of duplicate rows, such as this one, during the COPY process.
  2. Change your workflow: load the data into a temp table first, massage it for duplicates (perhaps by JOINing with your target table and marking every row that already exists in the target with a dup flag), then import only the missing records and send the dups to an exception table.

I personally prefer the second approach, but that depends on the specific workflow in your case.
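The second approach might look something like the following sketch. All table and column names here are hypothetical (it assumes a target table `target` with primary key `id` and an existing `exceptions` table of the same shape); adapt them to your schema:

```sql
BEGIN;

-- Staging table with the same columns as the target; dropped automatically.
CREATE TEMP TABLE staging (LIKE target INCLUDING DEFAULTS) ON COMMIT DROP;

-- Bulk-load the hourly CSV into staging, where duplicates cannot fail.
COPY staging FROM '/path/to/hourly.csv' WITH (FORMAT csv);

-- Route rows whose key already exists in the target to an exception table.
INSERT INTO exceptions
SELECT s.*
FROM staging s
JOIN target t USING (id);

-- Import only the records that are not yet present.
INSERT INTO target
SELECT s.*
FROM staging s
WHERE NOT EXISTS (SELECT 1 FROM target t WHERE t.id = s.id);

COMMIT;
```

Note this handles duplicates between the file and the table; if the same key can appear twice within one file, you would also need to de-duplicate inside `staging` (e.g. with `DISTINCT ON (id)`) before the final insert.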

