
Recreating Postgres COPY directly in Python?

I have a block of data, currently a list of n-tuples (though the format is fairly flexible), that I'd like to append to a Postgres table; each n-tuple corresponds to a row in the DB.

What I had been doing up to this point is writing all of these to a CSV file and then using Postgres' COPY to bulk load everything into the database. This works, but it's suboptimal; I'd prefer to do it all directly from Python. Is there a way, from within Python, to replicate a COPY-style bulk load into Postgres?


If you're using the psycopg2 driver, the cursors provide copy_from and copy_to methods that work with any file-like object (including a StringIO buffer).

There are examples in the files examples/copy_from.py and examples/copy_to.py that come with the psycopg2 source distribution.

This excerpt is from the copy_from.py example:

import psycopg2
import StringIO  # io.StringIO on Python 3

# DSN is a libpq connection string, e.g. "dbname=test user=postgres"
conn = psycopg2.connect(DSN)
curs = conn.cursor()
curs.execute("CREATE TABLE test_copy (fld1 text, fld2 text, fld3 int4)")

# anything can be used as a file if it has .read() and .readline() methods
data = StringIO.StringIO()
data.write('\n'.join(['Tom\tJenkins\t37',
                      'Madonna\t\\N\t45',            # \N is COPY's default NULL marker
                      'Federico\tDi Gregorio\t\\N']))
data.seek(0)

curs.copy_from(data, 'test_copy')
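
Applied to your case, a minimal sketch might look like the following. It assumes a list of 3-tuples and a hypothetical target table my_table with columns fld1, fld2, fld3; adjust the connection string, table, and column names for your setup:

import io
import psycopg2

rows = [('Tom', 'Jenkins', 37),
        ('Madonna', None, 45)]  # example data; None will be written as NULL

# Build a tab-separated text buffer in memory, using \N for NULLs
# (the defaults copy_from expects).
buf = io.StringIO()
for row in rows:
    buf.write('\t'.join('\\N' if v is None else str(v) for v in row) + '\n')
buf.seek(0)

conn = psycopg2.connect("dbname=test")  # assumed DSN; replace with yours
curs = conn.cursor()
curs.copy_from(buf, 'my_table', columns=('fld1', 'fld2', 'fld3'))
conn.commit()

Note that string values containing tabs, newlines, or backslashes would need to be escaped before being written into the buffer, since copy_from treats them as column and row delimiters.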
