# OCP ‐ The scp for a database
The ocp utility is modelled on the scp utility and can be used to copy a file between the system running ocp and a directory in a database. As a simple example,
```
ocp -l admin/{password}@//host/service myfile.txt DIRECTORY:
```
will copy the local file myfile.txt to the named directory in the database, and similarly
```
ocp -l admin/{password}@//host/service DIRECTORY:thatfile.zip .
```
will copy the file thatfile.zip from the named database directory to your local system. The database can be an Autonomous Database or an on-premises database.
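When such transfers are scripted, the two invocations above can be wrapped in a small helper. This is a sketch only: the connect string, file names, and directory name are placeholders, and it assumes the ocp binary is on your PATH.

```python
import subprocess

def ocp_copy(connect, source, destination, dry_run=False):
    """Copy a file with ocp; a DIRECTORY: prefix on either the source
    or the destination selects the database directory, exactly as on
    the command line."""
    cmd = ["ocp", "-l", connect, source, destination]
    if not dry_run:
        subprocess.run(cmd, check=True)
    return cmd

# local myfile.txt -> database directory (placeholder connect string):
upload = ocp_copy("admin/{password}@//host/service", "myfile.txt",
                  "DIRECTORY:", dry_run=True)
# database directory -> current local directory:
download = ocp_copy("admin/{password}@//host/service",
                    "DIRECTORY:thatfile.zip", ".", dry_run=True)
```

With dry_run=True the helper only returns the argument list, which is convenient for logging the exact command before running it.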
In addition to copying to/from your local system from where you execute the ocp command, you can copy files between an Autonomous Database and a cloud bucket.
In order to copy a file in either direction between your local file system and a database directory, a helper package must be installed in the database. It is recommended to perform the installation as a DBA, as that makes the package available to all users. It can alternatively be installed by a non-DBA user, in which case only that user has access to it.
To prepare the database by installing the rwl_ocp package, call ocp with the -l option to connect to the database and the --create-ocp option to install the package. A sample call and output is:
```
ocp -l admin/{password}@//host/service --create-ocp
RWP*OCP Release 3.2.1.0 Production on Thu, 24 Apr 2025 09:05:32 UTC
Connected default database to:
Oracle Database 21c Enterprise Edition Release 21.0.0.0.0 - Production
Package rwl_ocp created
Public synonym rwl_ocp created and granted
```
This needs to be done only once per database.
### List available database directories
To list the database directories that are available to some database user (not necessarily a DBA), call ocp with the --directories option.
A sample call and output is:
```
ocp -l pdbadmin/{password}@//server/host --directories
RWP*OCP Release 3.2.1.0 Production on Thu, 24 Apr 2025 09:13:41 UTC
Connected default database to:
Oracle Database 21c Enterprise Edition Release 21.0.0.0.0 - Production
directory       path
RWLTEMP         /tmp
TMP             /tmp
ORACLE_BASE     /scratch/bengsig/app
ORACLE_HOME     /scratch/bengsig/oh2116h
DATA_PUMP_DIR   /scratch/bengsig/app/admin/r2116/dpdump/27FD17E4953A
```
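If a script needs to pick a database directory from this listing, the two-column output can be parsed mechanically. A sketch, assuming each directory line consists of the directory name followed by its path, separated by whitespace:

```python
def parse_directories(listing):
    """Parse the two-column 'directory path' table printed by
    ocp --directories into a dict mapping name -> path."""
    dirs = {}
    seen_header = False
    for line in listing.splitlines():
        parts = line.split(None, 1)
        if parts[:1] == ["directory"]:
            # the column header line marks the start of the table
            seen_header = True
            continue
        if seen_header and len(parts) == 2:
            dirs[parts[0]] = parts[1]
    return dirs

# a few lines from the sample output above:
sample = """directory path
RWLTEMP /tmp
TMP /tmp
DATA_PUMP_DIR /scratch/bengsig/app/admin/r2116/dpdump/27FD17E4953A"""
dirs = parse_directories(sample)
```

Splitting with a maxsplit of 1 keeps a path intact even if it should ever contain embedded spaces.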
## Preparing upload to an Oracle Cloud Infrastructure bucket
If your database is an Autonomous Database in Oracle Cloud, you do not have access to directories on the database server, including the DATA_PUMP_DIR shown above. To make the dump accessible, it must therefore first be uploaded to a bucket in your region and tenancy. This requires some preparation for your tenancy, user, and region, which involves these steps:
### Creating an Auth Token and adding it as a credential
In order to copy an AWR dump file from a database directory to a cloud bucket, you need an _Auth Token_. You may already have one, which you can use to create the necessary cloud credential.
If you do not have an Auth Token, you can create one from your cloud console: go to your own user under Identity->Users and select the option to create an Auth Token. Note that you can only have a small number of Auth Tokens and that the token identifier (a string of some 20 random characters) _will only be displayed once_, so you need to save it safely.
You subsequently need to add your Auth Token as a credential in the ADMIN account of your autonomous database, which is done using dbms_cloud.create_credential. A typical call would be to execute something like the following using sqlplus:
```
begin
  dbms_cloud.create_credential
  ( 'credname'
  , username => 'first.last@example.com'
  , password => 'abc.123-defgji>XYZ('
  );
end;
/
```
where the password is the contents of the Auth Token and the username is your email address. If you have access to multiple autonomous databases in the same tenancy, you need to add the Auth Token by calling dbms_cloud.create_credential once for each database.
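Since the same call has to be repeated against each database, it can be convenient to generate the PL/SQL block once and pipe it into sqlplus for each connection. A sketch; the credential name, username, and token are placeholders, and the block follows the dbms_cloud.create_credential form shown above:

```python
def create_credential_sql(credname, username, token):
    """Build the anonymous PL/SQL block shown above, ready to be
    fed to sqlplus for each Autonomous Database."""
    return (
        "begin\n"
        "  dbms_cloud.create_credential\n"
        f"  ( '{credname}'\n"
        f"  , username => '{username}'\n"
        f"  , password => '{token}'\n"
        "  );\n"
        "end;\n"
        "/\n"
    )

sql = create_credential_sql("credname", "first.last@example.com",
                            "my-auth-token")
```

The trailing slash line makes sqlplus execute the anonymous block when the script is piped into it.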
### Creating a bucket
From your OCI console, go to Object Storage and create a bucket. This step must be repeated for each region you use, but the same bucket can very well be used for multiple databases in the same region.
## Uploading the dump file to a bucket
A bucket is identified by a total of four values: the name of your cloud tenancy, the name of the region, the name of the credential created in the previous step, and the actual bucket name. All of these must be provided to awrdump in order to do the actual upload. A sample call that both creates a dump and uploads it is:
```
awrdump -l username/{password}@database_tp --dbid=2286054548 \
  --begin-snap=96950 --end-snap=97047 \
  --directory=DATA_PUMP_DIR \
  --credential=credname --region=us-phoenix-1 \
  --tenancy=mytenancy --bucket=awrbucket
RWP*AWR Dump Repository Release 3.0.3.17 Development on Thu, 24 Nov 2022 11:01:42 UTC
Connected default database to:
Oracle Database 19c Enterprise Edition Release 19.0.0.0.0 - Production
Dump file 2286054548_96950_97047.dmp of size 140652544 (134.1MiB) found in directory DATA_PUMP_DIR
Copying 2286054548_96950_97047.dmp to https://swiftobjectstorage.us-phoenix-1.oraclecloud.com/v1/mytenancy/awrbucket/2286054548_96950_97047.dmp
```
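The target URL printed in the Copying line follows the Swift object storage naming scheme, and the dump file name is derived from the dbid and the snapshot range. A sketch that reconstructs both from the values used in the sample call (the function names are illustrative, not part of awrdump):

```python
def dump_file_name(dbid, begin_snap, end_snap):
    """The dump file is named {dbid}_{begin}_{end}.dmp, as seen
    in the sample output above."""
    return f"{dbid}_{begin_snap}_{end_snap}.dmp"

def swift_url(region, tenancy, bucket, object_name):
    """Swift object storage URL of the form shown in the
    Copying line of the sample output."""
    return (f"https://swiftobjectstorage.{region}.oraclecloud.com"
            f"/v1/{tenancy}/{bucket}/{object_name}")

name = dump_file_name(2286054548, 96950, 97047)
url = swift_url("us-phoenix-1", "mytenancy", "awrbucket", name)
```

This can be useful when you later want to locate or download the object programmatically rather than through the console.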
If you have already created the dump file and just want to upload it to a bucket, add the --copyonly option, which skips the dump creation step.
Finally, you can use the cloud console to locate the file and either download it or create a pre-authenticated URL for it.