AWS RDS to OCI Migration (Oracle Data Pump Based)

Introduction

Migrating Oracle databases from AWS RDS to Oracle Cloud Infrastructure (OCI) is a common requirement for organizations adopting OCI-native services, improving performance, or reducing licensing and operational constraints.

Unlike on-premises databases, AWS RDS restricts OS-level access, which changes how DBAs perform exports, file handling, and transfers. This runbook provides a step-by-step, production-tested migration approach using:

  • Oracle Data Pump (expdp/impdp)
  • AWS RDS logical directories
  • AWS S3
  • OCI Object Storage
  • Secure file transfer using rclone and the OCI CLI

This guide focuses on schema-level migration, which is the most common and safest approach.

Migration Flow Overview

AWS RDS → Data Pump Export → S3 Bucket → OCI Object Storage → OCI Database Import

Step 1: Create a Logical Directory in AWS RDS

Purpose

AWS RDS does not allow OS-level directory creation.
Instead, AWS provides the RDSADMIN package to create logical directories.

Command

EXEC rdsadmin.rdsadmin_util.create_directory(p_directory_name => 'MIGRATION');

Verification

SELECT directory_name, directory_path
FROM dba_directories
WHERE directory_name = 'MIGRATION';

Note

  • The physical location is managed internally by AWS
  • DBAs interact only via the logical directory name

Step 2: Export Schema Using Data Pump (AWS RDS)

Purpose

Export required schemas from AWS RDS into dump and log files stored in the MIGRATION directory.

Command (Example)

expdp <DB_USER>/<PASSWORD>@<RDS_ENDPOINT>:1521/<SERVICE_NAME> \
  directory=MIGRATION \
  dumpfile=SCHEMA_EXPORT_YYYYMMDD.dmp \
  logfile=SCHEMA_EXPORT_YYYYMMDD.log \
  schemas=<SCHEMA_NAME>

Best Practices

  • Use PARALLEL for large schemas
  • Use FILESIZE to split large dump files
  • Run during low-activity windows
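A hedged sketch combining these practices with the export command above (same placeholder connection details; the %U wildcard lets Data Pump generate numbered dump files, so that PARALLEL workers can write concurrently while FILESIZE caps each piece):

```shell
expdp <DB_USER>/<PASSWORD>@<RDS_ENDPOINT>:1521/<SERVICE_NAME> \
  directory=MIGRATION \
  dumpfile=SCHEMA_EXPORT_YYYYMMDD_%U.dmp \
  logfile=SCHEMA_EXPORT_YYYYMMDD.log \
  schemas=<SCHEMA_NAME> \
  parallel=4 \
  filesize=5G
```

Note that each resulting dump piece must later be uploaded to S3 and referenced in the impdp DUMPFILE parameter (the same %U wildcard can be used on import).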

Monitoring

SELECT * FROM dba_datapump_jobs;
SELECT * FROM dba_datapump_sessions;

Step 3: View Data Pump Log File Inside RDS

Purpose

Since OS access is restricted, logs must be read using RDS utilities.

Command

SELECT *
FROM TABLE(
  rdsadmin.rds_file_util.read_text_file(
    p_directory => 'MIGRATION',
    p_filename  => 'SCHEMA_EXPORT_YYYYMMDD.log'
  )
);

Step 4: Delete a Single Dump or Log File

Purpose

Clean up old files before re-running exports.

Command

EXEC UTL_FILE.FREMOVE('MIGRATION', 'SCHEMA_EXPORT_YYYYMMDD.log');

Step 5: Delete All Files in the MIGRATION Directory

Purpose

Clear all dump and log files in one operation.

Command

BEGIN
  FOR i IN (
    SELECT filename
    FROM TABLE(rdsadmin.rds_file_util.listdir('MIGRATION'))
    WHERE type = 'file'
  ) LOOP
    UTL_FILE.FREMOVE('MIGRATION', i.filename);
  END LOOP;
END;
/

⚠ Use with caution: this removes all files in the directory.

Step 6: View File Size and Timestamp

Purpose

Validate dump sizes and confirm successful export.

Command

SELECT filename,
       ROUND(filesize / (1024*1024*1024), 2) AS size_gb,
       mtime
FROM TABLE(
  rdsadmin.rds_file_util.listdir('MIGRATION')
)
ORDER BY mtime DESC;

Step 7: Upload Dump File from RDS to AWS S3

Purpose

Move dump files from RDS internal storage to AWS S3 for portability.

Command

SELECT rdsadmin.rdsadmin_s3_tasks.upload_to_s3(
  p_bucket_name    => '<S3_BUCKET_NAME>',
  p_prefix         => 'SCHEMA_EXPORT_YYYYMMDD.dmp',
  p_s3_prefix      => '',
  p_directory_name => 'MIGRATION'
) AS task_id
FROM dual;

Note

  • This operation runs asynchronously
  • Capture the returned TASK_ID

Step 8: Verify Upload Status to S3

Purpose

Confirm whether the upload completed successfully.

Command

SELECT text
FROM TABLE(
  rdsadmin.rds_file_util.read_text_file(
    'BDUMP',
    'dbtask-<TASK_ID>.log'
  )
);

Look for:

Task completed successfully

Step 9: Copy File from AWS S3 to OCI Object Storage

Purpose

Transfer dump files across cloud providers.

Command (Linux)

rclone --verbose copy \
  awsrcl:<S3_BUCKET_NAME> \
  ocircl:<OCI_BUCKET_NAME>

Notes

  • rclone remotes must be configured for both AWS and OCI
  • Ensure adequate bandwidth for large files
  • Recommended for large-scale migrations
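For reference, the two remotes used above might be defined in ~/.config/rclone/rclone.conf roughly as follows. This is a hedged sketch: the remote names awsrcl and ocircl match the command in this step, the OCI entry assumes the S3-compatible API of OCI Object Storage authenticated with a Customer Secret Key, and all bracketed values are placeholders; recent rclone versions also ship a native oracleobjectstorage backend as an alternative.

```ini
[awsrcl]
type = s3
provider = AWS
access_key_id = <AWS_ACCESS_KEY_ID>
secret_access_key = <AWS_SECRET_ACCESS_KEY>
region = <AWS_REGION>

[ocircl]
type = s3
provider = Other
access_key_id = <OCI_CUSTOMER_SECRET_KEY_ID>
secret_access_key = <OCI_CUSTOMER_SECRET_KEY>
endpoint = https://<OCI_NAMESPACE>.compat.objectstorage.<OCI_REGION>.oraclecloud.com
```

Verify both remotes with `rclone lsd awsrcl:` and `rclone lsd ocircl:` before starting a large transfer.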

Step 10: Download Dump from OCI Bucket to Local Server

Purpose

Move dump files from OCI Object Storage to the database server.

Command

cd /path/to/backup/

oci os object bulk-download \
  -ns <OCI_NAMESPACE> \
  -bn <OCI_BUCKET_NAME> \
  --prefix "<DUMP_PREFIX>" \
  --download-dir .

Step 11: Create Tablespaces and Users in OCI

Tablespace Creation (Example)

CREATE TABLESPACE <DATA_TS>
DATAFILE '+DATA/.../data01.dbf'
SIZE 10G AUTOEXTEND ON NEXT 128M MAXSIZE 30G;

ALTER TABLESPACE <DATA_TS>
ADD DATAFILE '+DATA/.../data02.dbf'
SIZE 10G AUTOEXTEND ON NEXT 128M MAXSIZE 30G;

User Creation

CREATE USER <SCHEMA_NAME> IDENTIFIED BY <PASSWORD>
DEFAULT TABLESPACE <DATA_TS>
TEMPORARY TABLESPACE <TEMP_TS>;

ALTER USER <SCHEMA_NAME> QUOTA UNLIMITED ON <DATA_TS>;

GRANT CONNECT, RESOURCE TO <SCHEMA_NAME>;

Ensure the total tablespace size exceeds the size of the data being imported (the dump file size is a useful lower bound, since indexes are rebuilt on import).
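A quick way to check allocated and maximum space per tablespace before the import is to sum the datafiles; a simple sketch using standard dictionary views (no RDS-specific calls):

```sql
SELECT tablespace_name,
       ROUND(SUM(bytes)    / (1024*1024*1024), 2) AS allocated_gb,
       ROUND(SUM(maxbytes) / (1024*1024*1024), 2) AS max_gb
FROM dba_data_files
GROUP BY tablespace_name
ORDER BY tablespace_name;
```

Compare max_gb for <DATA_TS> against the dump sizes reported in Step 6.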

Step 12: Import Schema into OCI Database

Command

impdp system/<PASSWORD>@<OCI_DB_SERVICE> \
  directory=DPDUMP \
  dumpfile=SCHEMA_EXPORT_YYYYMMDD.dmp \
  logfile=SCHEMA_IMPORT_YYYYMMDD.log \
  parallel=4
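The command above assumes a DPDUMP directory object pointing at the path where the dump was downloaded in Step 10. If it does not exist yet, it can be created with standard DDL; a sketch (the path and <IMPORT_USER> are illustrative):

```sql
CREATE OR REPLACE DIRECTORY DPDUMP AS '/path/to/backup';

-- Needed only if the import runs as a non-DBA user:
GRANT READ, WRITE ON DIRECTORY DPDUMP TO <IMPORT_USER>;
```

If schema or tablespace names differ between source and target, add REMAP_SCHEMA=<OLD>:<NEW> and/or REMAP_TABLESPACE=<OLD_TS>:<NEW_TS> to the impdp command.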

Post-Import Checks

SELECT username FROM dba_users;

SELECT COUNT(*) FROM dba_objects WHERE status = 'INVALID';
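Invalid objects reported by the query above can usually be recompiled with the standard UTL_RECOMP package; a sketch for the imported schema:

```sql
EXEC UTL_RECOMP.RECOMP_SERIAL('<SCHEMA_NAME>');

-- Re-check afterwards:
SELECT COUNT(*)
FROM dba_objects
WHERE owner = '<SCHEMA_NAME>'
  AND status = 'INVALID';
```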

Post-Migration Validation Checklist

✔ Dump size validated
✔ Import completed successfully
✔ Invalid objects reviewed
✔ Application connectivity tested
✔ Performance validated
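One concrete way to support this checklist is to compare per-type object counts between source and target: run the same query on both databases and diff the output (the owner is a placeholder):

```sql
SELECT object_type, COUNT(*) AS object_count
FROM dba_objects
WHERE owner = '<SCHEMA_NAME>'
GROUP BY object_type
ORDER BY object_type;
```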

Conclusion

Migrating Oracle schemas from AWS RDS to OCI requires careful handling due to RDS restrictions and cross-cloud file movement. By using logical directories, Data Pump, S3, and OCI Object Storage, DBAs can perform secure, reliable, and repeatable migrations without OS access on AWS RDS.
This runbook provides a cloud-agnostic, production-safe migration strategy suitable for DEV, TEST, and PROD environments.

 
