
Oracle SQL*Loader error while loading really huge log files

I have a Python script which loops through the log files in a directory and uses Oracle SQL*Loader to load them into the Oracle database. The script works properly, and so does SQL*Loader.
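
The relevant loop is essentially the following (a simplified sketch; the real script's file pattern and error handling may differ):

import glob
import os
import subprocess

LOG_DIR = "C:/_dev/logs/sample/data"

for data_file in sorted(glob.glob(os.path.join(LOG_DIR, "*.csv"))):
    # Run SQL*Loader once per file: direct path, data save every 1000 rows
    cmd = [
        "sqlldr",
        "arisdw/arisdwabc01@APPDEV24",
        "control=Temp_Sadish.ctl",
        "direct=true",
        "rows=1000",
        "data=" + data_file,
    ]
    rc = subprocess.call(cmd)
    # A non-zero exit code from sqlldr indicates warnings or errors; stop the loop
    if rc != 0:
        print("sqlldr exited with code %d for %s" % (rc, data_file))
        break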

But after loading around 200k records, the load fails with this error:

Record 11457: Rejected - Error on table USAGE_DATA.
ORA-12571: TNS:packet writer failure

SQL*Loader-926: OCI error while uldlfca:OCIDirPathColArrayLoadStream for table USAGE_DATA
SQL*Loader-2026: the load was aborted because SQL Loader cannot continue.
Specify SKIP=11000 when continuing the load.
SQL*Loader-925: Error while uldlgs: OCIStmtExecute (ptc_hp)
ORA-03114: not connected to ORACLE

SQL*Loader-925: Error while uldlgs: OCIStmtFetch (ptc_hp)
ORA-24338: statement handle not executed

I am not sure why this is happening. I have checked the data files for the table's tablespace and autoextend is set to true. What else could be the reason?

in the "sqlldr" command i have rows=1000 and Direct=True, so it commits for every 1000 records loaded, i have tested by varying this number, still getting same error.

sqlldr arisdw/arisdwabc01@APPDEV24 control=Temp_Sadish.ctl direct=true rows=1000 data=C:/_dev/logs/sample/data/mydata1.csv;


Post your controlfile contents. What version of Oracle are you using?

The ORA-24338 error is the one I would focus on. Are you doing any sort of data transformation in your job? Calling functions or similar?
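
By transformation I mean something like a SQL expression applied to a column in the control file, roughly as below (column names are placeholders; your Temp_Sadish.ctl will have the real USAGE_DATA columns):

LOAD DATA
INFILE 'placeholder.csv'  -- the actual file comes from data= on the command line
APPEND
INTO TABLE USAGE_DATA
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
  COL1,
  COL2,
  COL3 "UPPER(:COL3)"  -- a function call like this is executed for every row during the load
)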
