OCP ‐ The scp for a database
The ocp utility is modelled on the scp utility and can be used to copy a file between the system running ocp and a directory in a database. As a simple example, ocp -l admin/{password}@//host/service myfile.txt DIRECTORY:
will copy the local file myfile.txt to the named directory in the database, and similarly ocp -l admin/{password}@//host/service DIRECTORY:thatfile.zip .
will copy the file thatfile.zip from the named database directory to your local system. The database can be an Autonomous Database or an on-premises database.
In addition to copying to/from your local system from where you execute the ocp command, you can copy files between an Autonomous Database and a cloud bucket.
In order to copy a file in either direction between your local file system and a database directory, a helper package must be installed in the database. It is recommended to perform the installation as a DBA, as that makes the package available to all users. It can alternatively be installed by a non-DBA user, in which case only that user has access to it.
To prepare the database by installing the rwl_ocp package, call ocp with the -l option to connect to the database and the --create-ocp option to install the package. A sample call and output is:
ocp -l admin/{password}@sample_tp --create-ocp
RWP*OCP Release 3.2.1.0 Production on Thu, 24 Apr 2025 09:05:32 UTC
Connected default database to:
Oracle Database 21c Enterprise Edition Release 21.0.0.0.0 - Production
Package rwl_ocp created
Public synonym rwl_ocp created and granted
This needs to be done only once per database.
To list the database directories that are available to some database user (not necessarily a DBA), call ocp with the --directories option. A sample call and output is:
ocp -l pdbadmin/{password}@//host/service --directories
RWP*OCP Release 3.2.1.0 Production on Thu, 24 Apr 2025 09:13:41 UTC
Connected default database to:
Oracle Database 21c Enterprise Edition Release 21.0.0.0.0 - Production
directory path
------------------------------ ----
RWLTEMP /tmp
TMP /tmp
ORACLE_BASE /scratch/bengsig/app
ORACLE_HOME /scratch/bengsig/oh2116h
DATA_PUMP_DIR /scratch/bengsig/app/admin/r2116/dpdump/27FD17E4953A
To copy a local file to the database, provide the source as a file name (possibly with a directory part) and the destination in the format DIRECTORY:file, where the part after the colon can be omitted; if it is omitted, the destination file is named after the source file. Note that ocp will not overwrite an existing destination file unless the --overwrite option is given. A sample call and output is:
ocp -l pdbadmin/{password}@//host/service /tmp/file.txt TMP:
RWP*OCP Release 3.2.1.0 Production on Thu, 24 Apr 2025 09:23:39 UTC
Connected default database to:
Oracle Database 21c Enterprise Edition Release 21.0.0.0.0 - Production
Copying '/tmp/file.txt' to 'TMP:file.txt'
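The destination-name rule can be illustrated with a small shell sketch; this is an illustration of the rule only, not ocp source code, using the file names from the sample call:

```shell
# Illustration only, not ocp source: resolve a DIRECTORY:file destination.
# When the part after ':' is omitted, the file is named after the source.
src=/tmp/file.txt
dest=TMP:

dir=${dest%%:*}              # database directory: text before ':'
file=${dest#*:}              # file name: text after ':', possibly empty
[ -z "$file" ] && file=$(basename "$src")

echo "Copying '$src' to '$dir:$file'"
```

With the sample values this prints the same Copying line as the ocp output above.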
The command to copy from the database to the local file system is like the previous one, except that it is the source that has the DIRECTORY:file format. The destination can be a directory, a file name (possibly including directory parts), or it can be empty, in which case the current directory is used. As in the opposite direction, the --overwrite option is required to overwrite an existing destination file. A sample call and output is:
ocp -l pdbadmin/{password}@//host/service --overwrite TMP:file.txt /tmp
RWP*OCP Release 3.2.1.0 Production on Thu, 24 Apr 2025 09:30:21 UTC
Connected default database to:
Oracle Database 21c Enterprise Edition Release 21.0.0.0.0 - Production
Copying 'TMP:file.txt' to '/tmp/file.txt'
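The three destination forms can likewise be sketched in shell; again an illustration of the rules, not ocp source code, using the sample values:

```shell
# Illustration only, not ocp source: resolve the local destination when
# copying from the database; src has the DIRECTORY:file format.
src=TMP:file.txt
dest=/tmp

file=${src#*:}               # the file name part of DIRECTORY:file
if [ -z "$dest" ]; then
  target=./$file             # empty destination: use the current directory
elif [ -d "$dest" ]; then
  target=$dest/$file         # existing directory: keep the file name
else
  target=$dest               # otherwise the destination is the full file name
fi

echo "Copying '$src' to '$target'"
```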
If your database is an Autonomous Database, you can list the contents of a database directory using the --list-directory option:
ocp -l $(oltpconnect -A) --list-directory=DATA_PUMP_DIR
RWP*OCP Release 3.2.0.17 Development on Thu, 24 Apr 2025 09:48:57 UTC
Connected default database to:
Oracle Database 19c Enterprise Edition Release 19.0.0.0.0 - Production
Contents of database directory 'DATA_PUMP_DIR'
bytes last modified file name
10297344 2021.03.08 10:04:04 automl_metamodels.dmp
358 2021.03.08 10:04:04 load_OML4PY_70498.log
7594 2021.03.08 10:05:53 OML4PY_IMPORT_70500.log
1302 2022.01.06 18:02:29 SYS_TEMP_4FDA1D5BD_F3944CA3_330931.log
1302 2022.01.06 18:03:29 SYS_TEMP_4FDA1D5BD_F3944CA3_330911.log
611420 2022.03.17 07:34:57 OML4SQL Classification XGBoost - OLD.json
1300 2022.03.17 07:38:29 SYS_TEMP_1FD9D731A_D81E5FC2_32494.log
1302 2022.04.28 06:06:25 SYS_TEMP_1FDA1A2ED_72818156_161612.log
1300 2022.08.19 08:12:50 SYS_TEMP_0FDA02B8A_8B8A9E6B_32871.log
731 2022.11.16 00:14:21 SYS_TEMP_1FDA1937B_23E3327D_17650.log
36679680 2022.11.22 14:31:42 2286054548_97090_97097.dmp
147050 2022.11.22 14:31:47 2286054548_97090_97097.log
33013760 2022.11.22 14:41:50 2286054548_97093_97097.dmp
147050 2022.11.22 14:41:55 2286054548_97093_97097.log
139440128 2022.11.22 14:46:39 bkedump.dmp
147035 2022.11.22 14:46:43 bkedump.log
If your database is an Autonomous Database in Oracle Cloud, you do not have access to directories on the database server, including the DATA_PUMP_DIR shown above. To make the dump accessible, it must therefore first be uploaded to a bucket in your region and tenancy. This requires some preparation for your tenancy, user and region, which involves these steps:
In order to copy an AWR dump file from a database directory to a cloud bucket, you need an Auth Token. You may already have one, which you can use to create the necessary cloud credential.
If you do not have an Auth Token, you can create one from your cloud console: go to your own user under Identity->Users and select the option to create an Auth Token. Note that you can only have a small number of Auth Tokens, and that the token itself (a string of some 20 random characters) is only displayed once, so you need to save it safely.
You subsequently need to add your Auth Token as a credential in the ADMIN account of your Autonomous Database, which is done using dbms_cloud.create_credential. A typical way to do this is to execute something like the following in sqlplus:
begin
  dbms_cloud.create_credential
  ( 'credname'
  , username => 'first.last@example.com'
  , password => 'abc.123-defgji>XYZ('
  );
end;
/
where the password is the contents of the Auth Token and the username is your email address. If you have access to multiple Autonomous Databases in the same tenancy, you need to add the Auth Token by calling dbms_cloud.create_credential once in each database.
From your OCI console, go to Object Storage and create a bucket. If you use multiple regions, this step must be repeated for each region, but the same bucket can be used for multiple databases in the same region.
A bucket is identified by a total of four values: the name of your cloud tenancy, the name of the region, the name of the credential created in the previous step, and the actual bucket name. All of these must be provided to awrdump in order to perform the actual upload. A sample call that both creates a dump and uploads it is:
awrdump -l username/{password}@database_tp --dbid=2286054548 --begin-snap=96950 --end-snap=97047 \
--directory=DATA_PUMP_DIR \
--credential=credname --region=us-phoenix-1 \
--tenancy=mytenancy --bucket=awrbucket
RWP*AWR Dump Repository Release 3.0.3.17 Development on Thu, 24 Nov 2022 11:01:42 UTC
Connected default database to:
Oracle Database 19c Enterprise Edition Release 19.0.0.0.0 - Production
Dump file 2286054548_96950_97047.dmp of size 140652544 (134.1MiB) found in directory DATA_PUMP_DIR
Copying 2286054548_96950_97047.dmp to https://swiftobjectstorage.us-phoenix-1.oraclecloud.com/v1/mytenancy/awrbucket/2286054548_96950_97047.dmp
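Three of the four values (region, tenancy and bucket) appear directly in the object storage URL shown in the output; the credential is used to authenticate the transfer. A shell sketch of that mapping, using the sample values above (illustration only, not awrdump source):

```shell
# Illustration only: build the swift object storage URL from the values
# identifying the bucket, plus the dump file name.
region=us-phoenix-1
tenancy=mytenancy
bucket=awrbucket
dumpfile=2286054548_96950_97047.dmp

url="https://swiftobjectstorage.$region.oraclecloud.com/v1/$tenancy/$bucket/$dumpfile"
echo "$url"
```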
If you have already created the dump file and just want to upload it to a bucket, add the --copyonly option, which causes the dump-creation step to be skipped.
You can finally use the cloud console to identify the file and either download it or create a pre-authenticated URL for it.