PostgreSQL COPY FROM Command Help
I have a CSV file which is quite large (a few hundred MBs) that I am trying to import into a Postgres table. The problem arises when there is a primary key violation (duplicate records in the CSV file).
If it were just one, I could manually filter out those records, but these files are generated by a program which generates such data every hour. My script has to import them into the database automatically.
My question is: is there some way I can set a flag in the COPY command, or in Postgres, so it can skip the duplicate records and continue importing the file into the table?
My thought would be to approach this in two ways:
- Use a utility that can create an "exception report" of duplicate rows, such as this one, during the COPY process.
- Change your workflow: load the data into a temp table first, massage it for duplicates (for example, JOIN it with your target table and mark every row that already exists with a dup flag), then import only the missing records and send the dups to an exception table.
I personally prefer the second approach, but that's a matter of specific workflow in your case.
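A minimal sketch of that second workflow in SQL. The table names (`events`, `staging`, `exceptions`), key column (`id`), and file path are placeholders for your schema, not anything from the question:

```sql
-- 1. Load the whole CSV into a temp table with no unique constraints,
--    so COPY cannot fail on duplicates. LIKE copies the column layout.
CREATE TEMP TABLE staging (LIKE events INCLUDING DEFAULTS);
COPY staging FROM '/path/to/file.csv' WITH (FORMAT csv);

-- 2. Route rows whose keys already exist in the target table
--    to an exception table for later inspection.
INSERT INTO exceptions
SELECT s.*
FROM staging s
JOIN events e ON e.id = s.id;

-- 3. Insert only the missing records. DISTINCT ON also deduplicates
--    within the file itself, keeping one arbitrary row per key.
INSERT INTO events
SELECT DISTINCT ON (id) *
FROM staging s
WHERE NOT EXISTS (SELECT 1 FROM events e WHERE e.id = s.id);

DROP TABLE staging;
```

Wrapping the three statements in a single transaction keeps the target table consistent if any step fails.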