Oracle Data Pump and OCI

Joe Ngo

Oracle Data Pump is a powerful utility for moving data and metadata between Oracle databases. When combined with Oracle Cloud Infrastructure (OCI), it becomes an even more versatile tool for database administrators and developers. This guide will walk you through the process of setting up and using Data Pump with OCI, covering everything from credential creation to monitoring job status.

Table of Contents

  1. Setting Up OCI Credentials in Autonomous Database

  2. Installing Data Pump on OCI Oracle Linux

  3. Configuring the Oracle Wallet

  4. Executing Data Pump Export

  5. Working with Object Storage

  6. Executing Data Pump Import

  7. Monitoring Data Pump Execution Status

  8. Archiving Files on Object Storage

Setting Up OCI Credentials in Autonomous Database

Before you can use Data Pump with OCI Object Storage, you need to set up the appropriate credentials in your Autonomous Database. Here's how to do it:

BEGIN
  DBMS_CLOUD.CREATE_CREDENTIAL(
    credential_name => '<credential>',
    username => '<oci-username>',
    password => '<auth-token>'
  );
END;
/

NOTE: Currently, Data Pump CLI commands that use OCI Object Storage work only with auth-token credentials; credentials created with API keys fail to authenticate.
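
To confirm that the credential was created and is enabled, you can query the all_credentials view:

SELECT credential_name, username, enabled
FROM all_credentials;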

If you run Data Pump operations as a separate schema user, grant that user the necessary roles and directory privileges:

GRANT datapump_cloud_exp TO <schema_user>;
GRANT datapump_cloud_imp TO <schema_user>;
GRANT READ, WRITE ON DIRECTORY data_pump_dir TO <schema_user>;
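
One way to verify the grants is to connect as the schema user and list the Data Pump cloud roles it holds:

SELECT granted_role
FROM user_role_privs
WHERE granted_role LIKE 'DATAPUMP%';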

Installing Data Pump on OCI Oracle Linux

To use Data Pump on an OCI Oracle Linux instance, you'll need to install the Oracle Instant Client. The installation process differs slightly depending on your Oracle Linux version.

For Oracle Linux 7:

sudo yum install oracle-instantclient-release-el7
sudo yum install oracle-instantclient-basic
sudo yum install oracle-instantclient-tools

For Oracle Linux 8:

sudo dnf install oracle-instantclient-release-el8
sudo dnf install oracle-instantclient-basic
sudo dnf install oracle-instantclient-tools
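
To verify the installation, run expdp with help=y. The client binaries land under /usr/lib/oracle/21/client64/bin; the version directory depends on the Instant Client release you installed:

/usr/lib/oracle/21/client64/bin/expdp help=y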

Configuring the Oracle Wallet

To connect to your Autonomous Database, you'll need to set up the Oracle Wallet:

  1. Unzip the wallet contents to a folder.

  2. Edit the sqlnet.ora file, replacing ?/network/admin with the directory that holds the wallet contents, as shown in the example below.
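
For example, assuming the wallet zip is extracted to /home/opc/wallet (a hypothetical location), the steps look like this:

unzip <wallet-file>.zip -d /home/opc/wallet

# In /home/opc/wallet/sqlnet.ora, change the WALLET_LOCATION entry from:
#   WALLET_LOCATION = (SOURCE = (METHOD = file) (METHOD_DATA = (DIRECTORY="?/network/admin")))
# to:
#   WALLET_LOCATION = (SOURCE = (METHOD = file) (METHOD_DATA = (DIRECTORY="/home/opc/wallet")))

Whichever directory you choose here is the one TNS_ADMIN should point at in the export and import commands below.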

Executing Data Pump Export

Now that everything is set up, you can perform a Data Pump export. Here's an example command that exports directly to OCI Object Storage:

export PATH=/usr/lib/oracle/21/client64/bin:$PATH
export TNS_ADMIN=<location of exploded wallet>

expdp <db-username>@<adb-service-name> \
  filesize=5GB \
  credential=<credential> \
  dumpfile=https://objectstorage.<region>.oraclecloud.com/n/<namespace>/b/<bucket>/o/<file-prefix>%U.dmp \
  logfile=<logname>.log \
  directory=data_pump_dir

If you prefer to export to the database first and then transfer to Object Storage, use this command:

expdp <db-username>@<adb-service-name> filesize=5GB dumpfile=<file-prefix>%U.dmp logfile=<logname>.log directory=data_pump_dir

Then transfer the files to Object Storage with DBMS_CLOUD.PUT_OBJECT, as shown in "Copying Files from ADB to Object Storage" below.

Working with Object Storage

Downloading Files from Object Storage

To download files from Object Storage:

for i in $(seq -f "%02g" 1 <n>); do 
  curl -O -X GET -u '<oci-username>:<auth-token>' https://swiftobjectstorage.<region>.oraclecloud.com/v1/<namespace>/<bucket>/<file-prefix>${i}.dmp
done

Replace <n> with the number of dump files. The two-digit format (%02g) matches the two-digit sequence numbers that the %U substitution variable produces in the dump file names.

Uploading Files to Object Storage

To upload files to Object Storage:

for i in $(seq -f "%02g" 1 <n>); do 
  curl -X PUT -u '<oci-username>:<auth-token>' -T <file-prefix>${i}.dmp https://swiftobjectstorage.<region>.oraclecloud.com/v1/<namespace>/<bucket>/<file-prefix>${i}.dmp
done

Copying Files from ADB to Object Storage

You can copy files from your Autonomous Database to Object Storage using PL/SQL:

BEGIN
  DBMS_CLOUD.PUT_OBJECT(
    credential_name => '<credential>',
    object_uri => 'https://objectstorage.<region>.oraclecloud.com/n/<namespace>/b/<bucket>/o/<filename>',
    directory_name  => 'DATA_PUMP_DIR',
    file_name => '<filename>');
END;
/
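
The reverse direction, pulling a file from Object Storage into the database's DATA_PUMP_DIR, works the same way with DBMS_CLOUD.GET_OBJECT:

BEGIN
  DBMS_CLOUD.GET_OBJECT(
    credential_name => '<credential>',
    object_uri => 'https://objectstorage.<region>.oraclecloud.com/n/<namespace>/b/<bucket>/o/<filename>',
    directory_name => 'DATA_PUMP_DIR',
    file_name => '<filename>');
END;
/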

Listing Objects in an Object Storage Bucket

To list objects in your bucket:

SELECT *
FROM dbms_cloud.list_objects(
  credential_name => '<credential>', 
  location_uri => 'https://objectstorage.<region>.oraclecloud.com/n/<namespace>/b/<bucket>/o/'
);

Executing Data Pump Import

To import data from Object Storage:

export PATH=/usr/lib/oracle/21/client64/bin:$PATH
export TNS_ADMIN=<location of exploded wallet>

impdp <db-username>@<adb-service-name> \
  credential=<credential> \
  dumpfile=https://objectstorage.<region>.oraclecloud.com/n/<namespace>/b/<bucket>/o/<file-prefix>%U.dmp \
  logfile=<logname>.log \
  directory=data_pump_dir

Monitoring Data Pump Execution Status

You can monitor the status of your Data Pump jobs using this SQL query:

SELECT
    opname,
    sid,
    serial#,
    context,
    sofar,
    totalwork,
    round(sofar / totalwork * 100, 2) "%_COMPLETE"
FROM
    v$session_longops
WHERE
    opname IN (
        SELECT
            d.job_name
        FROM
            v$session             s,
            v$process             p,
            dba_datapump_sessions d
        WHERE
            p.addr = s.paddr
            AND s.saddr = d.saddr
    )
    AND opname NOT LIKE '%aggregate%'
    AND totalwork != 0
    AND sofar <> totalwork;
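
For a high-level view of job state, as opposed to per-operation progress, you can also query the dba_datapump_jobs view:

SELECT owner_name, job_name, operation, job_mode, state
FROM dba_datapump_jobs;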

Archiving Files on Object Storage

After your Data Pump operation is complete, you might want to archive the dump files to save on storage costs. Here's how to do it:

for i in $(seq -f "%02g" 1 <n>); do
  oci os object update-storage-tier -bn <bucket> --object-name <file-prefix>${i}.dmp --storage-tier Archive
  oci os object rename -bn <bucket> --source-name <file-prefix>${i}.dmp --new-name "archive/<file-prefix>${i}.dmp"
done

This script changes the storage tier to Archive and moves the files to an "archive" folder in your bucket.
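
Note that objects in the Archive tier cannot be read directly. If you later need to re-import an archived dump file, restore it first; a minimal sketch with the OCI CLI, using the object names produced by the rename loop above:

oci os object restore -bn <bucket> --name "archive/<file-prefix>01.dmp"

Restoration takes time, so expect a delay before the restored file is readable again.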

By following this guide, you should now have a solid understanding of how to use Oracle Data Pump with OCI, from setting up credentials to archiving your dump files. Remember to always test these operations in a non-production environment before applying them to your production databases.
