
Taking a dump of tables in Oracle 10g using a PL/SQL procedure

Hi, I need an urgent response.

I want to take a dump of some selected tables from a schema. Can anybody tell me whether this is possible, and provide a procedure that we can execute to take the dump?

E.g. I have a schema, testuser, with tables (T1, T2, T3, T5, T9), and I want to take a dump of T1 & T5 only.

Thanks in advance


As you are on 10g, you can do this with the Data Pump API. You need to have read and write access on a directory object which maps to the destination OS directory.
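If no suitable directory object exists yet, a DBA can create one and grant access along these lines (the object name and OS path here are only illustrative):

    -- As a privileged user: map a directory object to a path on the DB server.
    create or replace directory data_pump_dir as '/u01/app/oracle/dpdump';
    grant read, write on directory data_pump_dir to testuser;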

In the following example I am exporting two tables, EMP and DEPT, to a file called EMP.DMP in a directory identified by DATA_PUMP_DIR.

    declare
      dp_handle number;
    begin
      -- Open a table-mode export job and keep the handle it returns.
      dp_handle := dbms_datapump.open(
        operation => 'EXPORT',
        job_mode  => 'TABLE');

      -- The dump file, written via the DATA_PUMP_DIR directory object.
      dbms_datapump.add_file(
        handle    => dp_handle,
        filename  => 'emp.dmp',
        directory => 'DATA_PUMP_DIR');

      -- A log file in the same directory.
      dbms_datapump.add_file(
        handle    => dp_handle,
        filename  => 'emp.log',
        directory => 'DATA_PUMP_DIR',
        filetype  => dbms_datapump.ku$_file_type_log_file);

      -- Restrict the job to just these two tables.
      dbms_datapump.metadata_filter(
        handle => dp_handle,
        name   => 'NAME_LIST',
        value  => '''EMP'',''DEPT''');

      -- Start the job, then detach; it keeps running in the background.
      dbms_datapump.start_job(dp_handle);
      dbms_datapump.detach(dp_handle);
    end;
    /
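Because start_job returns as soon as the job is launched and detach leaves it running in the background, you can watch its progress from another session. A minimal check, run as the exporting user:

    -- List your Data Pump jobs and their current state.
    select job_name, state from user_datapump_jobs;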

@DerekMahar asks:

"Is there a similar data pump tool or API available for execution from the client side"

Data Pump, both the PL/SQL API and the OS utility, writes to Oracle directories. An Oracle directory must represent an OS directory which is visible to the database server. Usually that is a directory on the server itself, although I suppose it is theoretically possible to map a PC drive across the network. You'd have to persuade your network admin that this is a good idea; it's a tough sell, because it isn't one...

The older IMP and EXP utilities read and wrote client-side directories, so it is theoretically possible to IMP a local dump file into a remote database. But I don't think this is a practical approach: by their nature dump files tend to be big, so importing across a network is slow and prone to failure. It is a much better solution to zip the dump file, copy it to the server, and import it from there.


You should try using the Data Pump utilities (EXPDP/IMPDP). They have a lot more capabilities and come with PL/SQL APIs. Data Pump is the replacement for exp and imp and is supported in 10g.

http://www.orafaq.com/wiki/Datapump#Invoking_from_PL.2FSQL
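For comparison, the command-line face of the same API would export the asker's two tables with something like this (credentials, directory, and file names are illustrative):

    expdp testuser/password directory=DATA_PUMP_DIR dumpfile=t1_t5.dmp logfile=t1_t5.log tables=T1,T5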


With this command you'll get a binary Oracle dump:

    exp scott/tiger file=mydump.dmp tables=(T1,T5)
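To load that dump back in later, the matching import would look something like this (same illustrative credentials and file name):

    imp scott/tiger file=mydump.dmp tables=(T1,T5)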

I recommend this link: http://www.orafaq.com/wiki/Import_Export_FAQ


If you must use PL/SQL, and you're trying to create a file, then you'll need to have a directory defined with write access granted to your user. That's something your DBA can do. See the "create directory" command.

At that point, you can (1) call UTL_FILE to open a file and write rows to it, (2) create an "EXTERNAL TABLE" and copy the information to it, (3) use DBMS_XMLGEN, or (4) use any of several other ways to actually write the data from the database to a file. All of these are in the Oracle docs; the PL/SQL Packages and Types Reference is your friend for things like this.
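As a minimal sketch of option (1), assuming a directory object named DUMP_DIR that your user can write to, and a table T1 with columns ID and NAME (the object name and columns are only placeholders):

    declare
      f utl_file.file_type;
    begin
      -- Open (create) a file in the DUMP_DIR directory object for writing.
      f := utl_file.fopen('DUMP_DIR', 't1.csv', 'w');
      -- Write one comma-separated line per row.
      for r in (select id, name from t1) loop
        utl_file.put_line(f, r.id || ',' || r.name);
      end loop;
      utl_file.fclose(f);
    end;
    /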

Note that the actual file system directory has to be on the server where the database is located. So you may need to get access to that server to copy your file, or have somebody set up a mount or whatever.

Alternatively, you could set up a PL/SQL web service that you could call to get your data.

But, personally, I'd just use exp. Or, if that's not available, Toad or some other front-end tool (even SQL*Plus) where you can just write a simple SQL script and save the results.

If you're doing this for a homework assignment, my guess is they'll want a UTL_FILE solution.
