OCP ‐ The scp for a database
The ocp utility is modelled on the scp utility and can be used to copy a file between the system running ocp and a directory in a database. As simple examples,
ocp -l admin/{password}@//host/service myfile.txt DIRECTORY:
will copy the local file called myfile.txt to the named directory in the database, and similarly
ocp -l admin/{password}@//host/service DIRECTORY:thatfile.zip .
will copy the file called thatfile.zip from the named database directory to your local system. The database can be an Autonomous Database or an on-premises database.
In addition to copying to or from the local system where you execute the ocp command, you can copy files between an Autonomous Database and a cloud bucket.
If your database is an Autonomous Database in Oracle Cloud, you do not have access to directories on the database server, including the DATA_PUMP_DIR directory mentioned above. To make the dump file accessible, it must therefore first be uploaded to a bucket in your region and tenancy. To do this, there is some preparation that must be done for your tenancy, user, and region, which involves these steps:
In order to copy an AWR dump file from a database directory to a cloud bucket, you need to have an Auth Token. You may already have one, which you can use to create the necessary cloud credential.
If you do not have an Auth Token, you can create one from your cloud console: go to your own user under Identity->Users and select the option to create an Auth Token. Note that you can only have a small number of Auth Tokens, and that the token itself (a string of some 20 random characters) will only be displayed once, so you need to save it safely.
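If you have the OCI command line interface installed and configured, the Auth Token can alternatively be created from the command line. This is just a sketch; the user OCID and the description are placeholders you need to replace:
oci iam auth-token create \
  --user-id ocid1.user.oc1..exampleuniqueid \
  --description "auth token for dbms_cloud credential"
The token value is shown in the token field of the JSON output and, just as in the console, it is only displayed once.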
You subsequently need to add your Auth Token as a credential in the ADMIN account of your autonomous database, which is done using dbms_cloud.create_credential. A typical call to do this would be to execute something like the following using sqlplus:
begin
  dbms_cloud.create_credential
  ( 'credname'
  , username=>'first.last@example.com'
  , password=>'abc.123-defgji>XYZ('
  );
end;
/
where the password is the contents of the Auth Token and the username is your email address. If you have access to multiple autonomous databases in the same tenancy, you need to add the Auth Token by calling dbms_cloud.create_credential once for each database.
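To verify that the credential was created, you can, as a sketch, query the user_credentials view while connected as the ADMIN user; the connect string is the same kind of placeholder as used elsewhere on this page:
sqlplus admin/{password}@database_tp <<EOF
select credential_name, username, enabled from user_credentials;
EOF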
From your OCI console, go to Object Storage and create a bucket. If you use databases in multiple regions, this step needs to be repeated for each region, but the same bucket can very well be used for multiple databases in the same region.
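As a sketch, the bucket can also be created using the OCI command line interface; the compartment OCID is a placeholder, and the bucket name matches the one used in the example below:
oci os bucket create \
  --compartment-id ocid1.compartment.oc1..exampleuniqueid \
  --name awrbucket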
A bucket is identified by a total of four values: the name of your cloud tenancy, the name of the region, the name of the credential created in the previous step, and the actual bucket name. All of these must be provided to awrdump in order to do the actual upload. A sample call that both creates a dump and uploads it is:
awrdump -l username/{password}@database_tp --dbid=2286054548 --begin-snap=96950 --end-snap=97047 \
--directory=DATA_PUMP_DIR \
--credential=credname --region=us-phoenix-1 \
--tenancy=mytenancy --bucket=awrbucket
RWP*AWR Dump Repository Release 3.0.3.17 Development on Thu, 24 Nov 2022 11:01:42 UTC
Connected default database to:
Oracle Database 19c Enterprise Edition Release 19.0.0.0.0 - Production
Dump file 2286054548_96950_97047.dmp of size 140652544 (134.1MiB) found in directory DATA_PUMP_DIR
Copying 2286054548_96950_97047.dmp to https://swiftobjectstorage.us-phoenix-1.oraclecloud.com/v1/mytenancy/awrbucket/2286054548_96950_97047.dmp
If you have already created the dump file and just want to upload it to a bucket, add the --copyonly option, which causes the first part that creates the actual dump to be skipped.
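Using the same identifiers as in the example above, such a call would look like:
awrdump -l username/{password}@database_tp --dbid=2286054548 --begin-snap=96950 --end-snap=97047 \
--directory=DATA_PUMP_DIR \
--credential=credname --region=us-phoenix-1 \
--tenancy=mytenancy --bucket=awrbucket --copyonly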
You can finally use the cloud console to identify the file and either download it or create a pre-authenticated URL for it.
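If you prefer the command line, the file can, as a sketch, also be downloaded or shared via a pre-authenticated request using the OCI command line interface; the request name and expiry time are placeholders:
# download the dump file from the bucket
oci os object get --bucket-name awrbucket \
  --name 2286054548_96950_97047.dmp --file 2286054548_96950_97047.dmp
# or create a pre-authenticated request that allows reading it
oci os preauth-request create --bucket-name awrbucket \
  --name awrdump-par --access-type ObjectRead \
  --object-name 2286054548_96950_97047.dmp \
  --time-expires 2022-12-31T23:59:59Z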