More on Data Pump

A few Data Pump limitations that I ran into and that you probably want to know about:

1. If you are using Oracle 10.2 and you want to write your exports to a directory that is on an NFS mount, it must be a hard mount. In 10.1 soft mounts used to work, and the upgrade caused us significant headaches.

You see, our exports directory sits on an old and slow NetApp (R200), and our storage manager gives us terabytes of space there, but without any promise of high availability. This is fine for exports, but you can’t afford to hard mount storage that isn’t highly available: once it goes down, processes hang in uninterruptible sleep and we need to reboot the machine to get rid of them. So with 10.2 Data Pump we are stuck either hard mounting the original directories and taking that risk, or negotiating with the storage manager for space on a highly available device. I’m considering an interruptible hard mount, which should give us the best of both worlds, but I haven’t tested it yet. I hope Oracle plans to bring back the option of configuring Data Pump directories on soft mounts.
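The interruptible hard mount idea would look something like this (server, volume, and mount-point names are made up for illustration; I haven’t tested this combination with Data Pump yet):

```shell
# Hypothetical mount of the exports share as an interruptible hard mount.
# "hard" makes NFS retry requests indefinitely (what 10.2 Data Pump needs);
# "intr" lets signals interrupt processes blocked on a dead server, so a
# hung expdp/impdp can at least be killed without rebooting the machine.
mount -t nfs -o hard,intr,rsize=32768,wsize=32768 \
    netapp-r200:/vol/exports /u01/app/oracle/exports

# Equivalent /etc/fstab entry:
# netapp-r200:/vol/exports  /u01/app/oracle/exports  nfs  hard,intr,rsize=32768,wsize=32768  0 0
```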

2. Importing statistics takes an unreasonable amount of time. We had to export and import one schema. Collecting statistics on this schema takes around 5 minutes; importing the statistics using Data Pump took over 20 minutes. Oracle’s documentation fails to mention this little bit of information. I really hope this issue is fixed in 11g.

Workaround: import with the “exclude=statistics” flag and collect statistics afterwards.
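The workaround can be sketched like this (the schema, directory object, and dump-file names are hypothetical):

```shell
# Skip the painfully slow statistics import...
impdp scott/tiger directory=DPUMP_DIR dumpfile=scott.dmp \
    schemas=SCOTT exclude=STATISTICS

# ...then gather fresh statistics, which in our case was roughly 4x faster.
sqlplus -s scott/tiger <<'EOF'
EXEC DBMS_STATS.GATHER_SCHEMA_STATS(ownname => 'SCOTT', cascade => TRUE);
EOF
```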

3. Data Pump doesn’t write errors to standard output. At one point we believed that Data Pump was completely stuck, until we checked the alert log and discovered that we had run out of space in the tablespace. Data Pump itself never mentioned this little problem.
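When the client output goes quiet, the job’s real state can be checked from the database side instead (the views are standard; credentials and log path are illustrative for a 10g install):

```shell
# Is the Data Pump job actually still running, and with how many workers?
sqlplus -s system/manager <<'EOF'
SELECT owner_name, job_name, state, degree FROM dba_datapump_jobs;
-- A session suspended waiting for space shows up here with the ORA- error text:
SELECT name, status, error_msg FROM dba_resumable;
EOF

# And the alert log usually has the real story:
tail -f $ORACLE_BASE/admin/$ORACLE_SID/bdump/alert_$ORACLE_SID.log
```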

4. Data Pump can hang when another process collects statistics while Data Pump is trying to import statistics. Note that Oracle 10g ships with a daily job that collects statistics by default.
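One way to avoid the collision is to disable 10g’s default automatic statistics job (GATHER_STATS_JOB) for the duration of the import — a sketch, assuming the default job hasn’t been renamed:

```shell
# Pause the nightly statistics job before starting the import...
sqlplus -s "/ as sysdba" <<'EOF'
EXEC DBMS_SCHEDULER.DISABLE('GATHER_STATS_JOB');
EOF

# ... run the impdp job here ...

# ...and re-enable it once the import finishes.
sqlplus -s "/ as sysdba" <<'EOF'
EXEC DBMS_SCHEDULER.ENABLE('GATHER_STATS_JOB');
EOF
```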

5. If you are considering multiple worker threads when importing, you need to do your homework. Multiple Data Pump workers don’t always behave as you might expect. This article by Carol Palmer gives a good explanation of the issues involved, and of when using multiple workers may bring no benefit and may even slow you down.
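For reference, parallel Data Pump needs a dump-file template with the %U wildcard so each worker gets its own file (directory and file names here are illustrative):

```shell
# Four workers, each writing its own piece: scott_01.dmp .. scott_04.dmp
expdp scott/tiger directory=DPUMP_DIR dumpfile=scott_%U.dmp \
    schemas=SCOTT parallel=4

# The import side uses the same wildcard and degree.
impdp scott/tiger directory=DPUMP_DIR dumpfile=scott_%U.dmp \
    schemas=SCOTT parallel=4
```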

Don’t get me wrong, I love Data Pump. It is one of the best things that happened to my job this year, literally saving me hours of boring work. You just need to be aware of its few catches.


12 Comments on “More on Data Pump”

  1. Jaffar says:

     >>Workaround: import with “exclude=statistics” flag and collect statistics later.

    That’s quite true. All the documents and presentations on DP fail to mention excluding STATISTICS.

    Thanks for the tip.

    I really like the DP over Original EXP/IMP.

    What I like most is that DP jobs are restartable: I can stop, start, and check the progress/status of a job.
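    The restartability works through Data Pump’s interactive console — a quick sketch, with a hypothetical job name:

    ```shell
    # Reattach to a running (or stopped) job by name.
    expdp scott/tiger attach=SYS_EXPORT_SCHEMA_01

    # Then at the Export> prompt:
    #   STATUS      -- show progress
    #   STOP_JOB    -- pause the job, keeping the master table
    #   START_JOB   -- resume the job after reattaching
    ```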

  2. prodlife says:

    Hi Jaffar,

    There’s no doubt that DP is a huge improvement over traditional EXP/IMP. You don’t have to clean up and start over after IMP ran out of space!

    However, there is also no doubt that DP is still a bit buggy.
    I just ran into another nasty bug: on Standard Edition you can successfully export the data using DP, but when you import you get:
    ORA-39125, with ORA-00439: feature not enabled: Streams Capture.
    The only workaround is to use the old EXP/IMP.

    Nasty. Especially if you used DP for backups…

    BTW. I’ve read your posts in OTN many many times and found them very useful. I’m rather excited to see you commenting in my blog 🙂

  3. cheryl says:

    Wow – you just saved me with this comment:
    “Data Pump doesn’t write errors to standard output. At one point we believed that Data Pump got completely stuck, until we checked the alert log and discovered that we ran out of space in the tablespace. Data Pump itself never mentioned this little problem.”

    My import had been running for an hour with no output – I thought it was doing something. Then I started to get bored, searched for slow imports, and found this. I checked my alert log and sure enough, I had exceeded a space quota on a tablespace. NICE! All better now.

  4. prodlife says:

    Thanks for the kind feedback Cheryl!

  5. Lei Zeng says:

    I also want to mention two Data Pump bugs in RAC environments that frustrated me quite a bit (Oracle support note numbers):

    438608.1 ORA-31671 ORA-12801 ORA-31626: Expdp Fails When Parallel Parameter Used in RAC

    454639.1 ORA-31626 ORA-31650 While Using DataPump Export in RAC Environment

  6. prodlife says:

    Thanks Lei!

  7. Alberto Tang says:

    Oracle Data Pump did not work for us because of a few critical bugs and very slow CLOB exp/imp. I spent a lot of time experimenting with performance and found that direct-path import with sqlldr is much faster than Data Pump import. The only problem was that sqlldr accepts only text format as input. Our group ended up purchasing FastReader from WisdomForce to unload data from Oracle into text.

  8. prodlife says:

    Hi Alberto,

    I also had clob related problems.
    The preferred alternatives depend on the specific requirements…

  9. Phillip says:

    Prodlife – can you expound on the CLOB-related issues? In my environment we have LOBs and find that it is really slow. Our schema is made up of 2 tablespaces, DATA (85GB) and INDEX (8GB); we have roughly 63K tables and (63K – 3) LOBs.

    Few of our tables (4K) have > 0 records in them.
    We’re in a 10g R2 RAC environment using NFS…

    We’re finding DataPump is slow, very slow…

  10. Phillip says:

    By 63K – I mean – roughly 63,000 tables
    and similarly 4K – I mean – roughly 4,000 tables

  11. Laurent says:

    Can we say that EXPDP won’t work with a 10gR2 RAC database?!

    In fact, I need to encrypt the exported dump file. There is no way to encrypt with the EXP command, while this can be done using the ENCRYPTION_PASSWORD parameter in EXPDP – which may produce some errors and won’t work in RAC cases!!!

    Any suggestions?!
    Thanks in advance

    • prodlife says:


      In my experience, EXPDP 10gR2 works great with RAC. However, I’m not sure about encryption (I’ve never used it), so you may have to check with Oracle Support.
