
Bulk loading into PostgreSQL from a remote client

I need to bulk load a large file into PostgreSQL. I would normally use the COPY command, but this file needs to be loaded from a remote client machine. With MSSQL, I can install the local tools and use bcp.exe on the client to connect to the server.

Is there an equivalent way for PostgreSQL? If not, what is the recommended way of loading a large file from a client machine if I cannot copy the file to the server first?

Thanks.


The COPY command is supported in PostgreSQL Protocol v3.0 (PostgreSQL 7.4 or newer).

The only thing you need to use COPY from a remote client is a libpq-enabled client, such as the psql command-line utility.

From the remote client run:

$ psql -d dbname -h 192.168.1.1 -U uname < yourbigscript.sql
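Here, yourbigscript.sql must itself contain a COPY ... FROM stdin statement followed by the data rows. A minimal sketch, assuming a hypothetical table named mytable (the table name, columns, and data below are placeholders):

-- yourbigscript.sql: data rows are tab-separated; \. on its own line ends the data
COPY mytable (id, name) FROM stdin;
1	alice
2	bob
\.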


You can use the \copy command from the psql tool, like:

psql -h IP_REMOTE_POSTGRESQL -d DATABASE -U USER_WITH_RIGHTS -c "\copy TABLE(COMMA_SEPARATED_FIELD_LIST) from 'FILE_ON_CLIENT_MACHINE' with csv header"

Note that the whole \copy meta-command must be on a single line, since it ends at the newline. The file path is resolved on the client machine; if it is not absolute, it is relative to the directory where psql is run.
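For example, a concrete invocation might look like this (the host, database, user, table, and file names here are hypothetical):

psql -h 192.168.1.1 -d salesdb -U loader -c "\copy sales(id, amount, sold_at) from 'sales.csv' with csv header"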


Assuming you have some sort of client with which to run the query, you can use the COPY FROM STDIN form of the COPY command: http://www.postgresql.org/docs/current/static/sql-copy.html
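This also works from a plain psql invocation by piping the file into the statement; a minimal sketch, assuming a hypothetical table mytable and a CSV file with a header row:

$ cat data.csv | psql -h 192.168.1.1 -d dbname -U uname -c "COPY mytable FROM STDIN WITH (FORMAT csv, HEADER)"

When COPY FROM STDIN is run through psql, psql feeds its own standard input to the server, so the file never needs to exist on the server machine.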


Use psql's \copy command to load the data:

$ psql -h <IP> -p <port> -U <username> -d <database>

database=# \copy schema.tablename from '/home/localdir/bulkdir/file.txt' delimiter as '|'

database=# \copy schema.tablename from '/home/localdir/bulkdir/file.txt' with csv header
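For reference, the first form expects one row per line with fields separated by the pipe character; a minimal sketch of what file.txt might contain (the column values are hypothetical):

1|first value|2020-01-01
2|second value|2020-01-02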
