Data Pump export taking too much time to complete

dump expdp export oracle oracle-11g-r2

I've scheduled a backup of my Oracle 11g database, daily at 11 PM, via Windows Task Scheduler. I don't take the database down at this time. Also, my database is in NOARCHIVELOG mode. Although my dump is only 7 GB, the export takes more than 11 hours to complete. The database consists of 15 to 16 schemas, and even if I export only those schemas it still takes too long. As I'm a newbie in DB administration, I'm unable to dig deeper into the issue.

Following is the trace log from around that time:

*** 2017-02-10 06:40:31.392
kcrfw_update_adaptive_sync_mode: poll->post current_sched_delay=275 switch_sched_delay=5635 current_sync_count_delta=14 switch_sync_count_delta=10

*** 2017-02-10 06:40:31.392
Log file sync switching to post/wait
Current approximate redo synch write rate is 4 per sec

*** 2017-02-10 06:42:02.923
kcrfw_update_adaptive_sync_mode: post->poll long#=2 sync#=6 sync=428 poll=33464 rw=4838 ack=0 min_sleep=33464

*** 2017-02-10 06:42:02.923
Log file sync switching to polling
Current scheduling delay is 111 usec
Current approximate redo synch write rate is 2 per sec
kcrfw_update_adaptive_sync_mode: poll->post current_sched_delay=0 switch_sched_delay=111 current_sync_count_delta=12 switch_sync_count_delta=6

What's wrong? What should I check to identify the cause of this slowness in the Data Pump export, and how can I speed up the expdp?
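In case it's useful: is something like the following the right way to watch where the job spends its time? This is just a sketch against the standard dictionary views (DBA_DATAPUMP_JOBS and V$SESSION_LONGOPS):

-- Which Data Pump jobs are running, and in what state?
SELECT owner_name, job_name, operation, job_mode, state, degree
  FROM dba_datapump_jobs;

-- How far along is each long-running step, as a percentage?
SELECT sid, serial#, opname, sofar, totalwork,
       ROUND(sofar / NULLIF(totalwork, 0) * 100, 1) AS pct_done
  FROM v$session_longops
 WHERE totalwork > 0
   AND sofar <> totalwork;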

Related export log file, after adding the TRACE parameter:

With the Partitioning, OLAP, Data Mining and Real Application Testing options
;;; Legacy Mode Active due to the following parameters:
;;; Legacy Mode Parameter: "consistent=TRUE" Location: Command Line, Replaced with: "flashback_time=TO_TIMESTAMP('2017-02-10 23:45:00', 'YYYY-MM-DD HH24:MI:SS')"
;;; Legacy Mode Parameter: "direct=TRUE" Location: Command Line, ignored.
;;; Legacy Mode has set reuse_dumpfiles=true parameter.
Starting "SYS"."SYS_EXPORT_SCHEMA_01":  "sys/********@PROD AS SYSDBA" DUMPFILE=Backup_02102017.dmp LOGFILE=BakupLog_02102017.log SCHEMAS=O1,O2,O3,O4,O5,O6,O7 flashback_time=TO_TIMESTAMP('2017-02-10 23:45:00', 'YYYY-MM-DD HH24:MI:SS') ESTIMATE=STATISTICS trace=1FF0300 reuse_dumpfiles=true 
Estimate in progress using STATISTICS method...
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
.  estimated "O1"."T1"              471.6 MB
.  estimated "O2"."T2"              360.5 MB
.  estimated "O3"."T3"              295.5 MB
.  estimated "O4"."T4"              223.9 MB
.  estimated "O5"."T5"              137.0 MB
.  estimated "O6"."T6"              78.11 MB
Total estimation using STATISTICS method: 2.408 GB
Processing object type SCHEMA_EXPORT/USER
Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
Processing object type SCHEMA_EXPORT/ROLE_GRANT
Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
Processing object type SCHEMA_EXPORT/TABLESPACE_QUOTA
Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
Processing object type SCHEMA_EXPORT/SYNONYM/SYNONYM
Processing object type SCHEMA_EXPORT/TYPE/TYPE_SPEC
Processing object type SCHEMA_EXPORT/DB_LINK
Processing object type SCHEMA_EXPORT/SEQUENCE/SEQUENCE
Processing object type SCHEMA_EXPORT/SEQUENCE/GRANT/OWNER_GRANT/OBJECT_GRANT
Processing object type SCHEMA_EXPORT/TABLE/TABLE
Processing object type SCHEMA_EXPORT/TABLE/GRANT/OWNER_GRANT/OBJECT_GRANT
Processing object type SCHEMA_EXPORT/TABLE/COMMENT
Processing object type SCHEMA_EXPORT/TABLE/RLS_POLICY/RLS_POLICY
Processing object type SCHEMA_EXPORT/TABLE/TRIGGER
Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/USER_PREF_STATISTICS
. . exported "O1"."T1"              571.7 MB 16000532 rows
. . exported "O2"."T1"              369.9 MB 1512560 rows
. . exported "O3"."T1"              360.6 MB 9739091 rows
. . exported "O1"."T2"              239.0 MB 1834364 rows
. . exported "O1"."T3"              152.0 MB  967417 rows
. . exported "O1"."T4"              79.42 MB  208401 rows
. . exported "O3"."T4"              15.15 MB  119298 rows
;;; Ext Tbl Shadow: worker id 1.
. . exported "O5"."T6"              6.860 MB   35959 rows
;;; Ext Tbl Shadow: worker id 1.
.
. . exported "O6"."T1"              4.159 MB   52923 rows
. . exported "O7"."T2"              3.735 MB   69269 rows
;;; Ext Tbl Shadow: worker id 1.
ORA-31693: Table data object "O8"."PS_TXN" failed to load/unload and is being skipped due to error:
ORA-02354: error in exporting/importing data
ORA-01555: snapshot too old: rollback segment number  with name "" too small
ORA-22924: snapshot too old
. . exported "O4"."T6"              750.1 KB    4911 rows
;;; Ext Tbl Shadow: worker id 1.
.

Best Answer

By having your database in NOARCHIVELOG mode and taking a daily backup while it is open, you are saying that it is acceptable to lose a day's worth of data. If that is the case, then fine; you don't need archive log mode. A Data Pump export gives you a logical backup of your database at a point in time. The next question you should ask yourself is: if it takes 11 hours to export your data, how long will the import take? If you are in a recovery scenario and need to import that data, the process could take a while. You would probably be better off just doing RMAN backups. (Note that in NOARCHIVELOG mode RMAN can only take a consistent backup with the database mounted, so for online backups you would need to enable archive log mode first.)
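As a minimal sketch, assuming you enable archive log mode so the backup can run with the database open, the nightly RMAN job could be as simple as:

rman target /
RMAN> CONFIGURE CONTROLFILE AUTOBACKUP ON;
RMAN> BACKUP AS COMPRESSED BACKUPSET DATABASE PLUS ARCHIVELOG;
RMAN> DELETE NOPROMPT OBSOLETE;

A physical backup like this will usually also restore faster than re-importing a full dump, since nothing has to be rebuilt row by row.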

Having said that, I have two thoughts about your current issue. First, is there a process running during the export that is changing, and possibly locking, data? If so, your export won't be consistent and won't contain all of the data that changed that day; for example, if an ETL process runs during the export, you might get some of the ETL's data but not all of it. Second, if you are on Enterprise Edition, are you using more than one worker process? You should be using multiple parallel workers, and compression. But my guess is that a locking issue is causing most of your problem.
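To see whether anything is blocked during the export window, a quick check (just a sketch):

-- Sessions currently stuck behind another session
SELECT sid, serial#, blocking_session, event, seconds_in_wait
  FROM v$session
 WHERE blocking_session IS NOT NULL;

And here is a sketch of the export run with multiple workers and compression. Some caveats: PARALLEL needs Enterprise Edition and COMPRESSION=ALL needs the Advanced Compression option; the DUMPFILE must use the %U substitution so each worker gets its own file; and the DIRECTORY and credentials below are placeholders. Oracle also recommends connecting as SYSTEM rather than SYS AS SYSDBA for Data Pump:

expdp system/********@PROD SCHEMAS=O1,O2,O3,O4,O5,O6,O7 DIRECTORY=DATA_PUMP_DIR DUMPFILE=Backup_%U.dmp LOGFILE=BakupLog.log PARALLEL=4 COMPRESSION=ALL FLASHBACK_TIME=SYSTIMESTAMP REUSE_DUMPFILES=YES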
