@@ -207,6 +207,8 @@ You can set them using the ``with_{property}`` functions:
- ``with_spark_version``
- ``with_warehouse_bucket_uri``
- ``with_private_endpoint_id`` (`doc <https://docs.oracle.com/en-us/iaas/data-flow/using/pe-allowing.htm#pe-allowing>`__)
+ - ``with_defined_tags``
+ - ``with_freeform_tags``

For more details, see `Data Flow class documentation <https://docs.oracle.com/en-us/iaas/tools/ads-sdk/latest/ads.jobs.html#module-ads.jobs.builders.infrastructure.dataflow>`__.
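For orientation, here is a minimal sketch of chaining these setters on the
``DataFlow`` builder (the tag method names and values follow the example code
later in this section; the compartment OCID is a placeholder)::

    from ads.jobs import DataFlow

    dataflow = (
        DataFlow()
        # Placeholder OCID; replace with a real compartment.
        .with_compartment_id("ocid1.compartment.oc1..<unique_id>")
        .with_spark_version("3.2.1")
        .with_defined_tag(**{"Oracle-Tags": {"CreatedBy": "test_name@oracle.com"}})
        .with_freeform_tag(test_freeform_key="test_freeform_value")
    )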
@@ -229,10 +231,10 @@ create applications.

In the following "hello-world" example, ``DataFlow`` is populated with ``compartment_id``,
``driver_shape``, ``driver_shape_config``, ``executor_shape``, ``executor_shape_config``
- and ``spark_version``. ``DataFlowRuntime`` is populated with ``script_uri`` and
- ``script_bucket``. The ``script_uri`` specifies the path to the script. It can be
- local or remote (an Object Storage path). If the path is local, then
- ``script_bucket`` must be specified additionally because Data Flow
+ , ``spark_version``, ``defined_tags`` and ``freeform_tags``. ``DataFlowRuntime`` is
+ populated with ``script_uri`` and ``script_bucket``. The ``script_uri`` specifies the
+ path to the script. It can be local or remote (an Object Storage path). If the path
+ is local, then ``script_bucket`` must be specified additionally because Data Flow
requires a script to be available in Object Storage. ADS
performs the upload step for you, as long as you give the bucket name
or the Object Storage path prefix to upload the script. Either can be
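The two ``script_uri`` cases described above can be sketched as follows
(bucket, namespace, and prefix values are placeholders)::

    from ads.jobs import DataFlowRuntime

    # Local script: ADS uploads it to Object Storage, so a bucket or
    # path prefix must be supplied via script_bucket.
    runtime_local = (
        DataFlowRuntime()
        .with_script_uri("script.py")
        .with_script_bucket("oci://<bucket_name>@<namespace>/<prefix>")
    )

    # Remote script: already in Object Storage, so no script_bucket is needed.
    runtime_remote = DataFlowRuntime().with_script_uri(
        "oci://<bucket_name>@<namespace>/<prefix>/script.py"
    )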
@@ -272,6 +274,10 @@ accepted. In the next example, the prefix is given for ``script_bucket``.
    .with_executor_shape("VM.Standard.E4.Flex")
    .with_executor_shape_config(ocpus=4, memory_in_gbs=64)
    .with_spark_version("3.0.2")
+   .with_defined_tag(
+       **{"Oracle-Tags": {"CreatedBy": "test_name@oracle.com"}}
+   )
+   .with_freeform_tag(test_freeform_key="test_freeform_value")
)
runtime_config = (
    DataFlowRuntime()
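Once both configurations are built, they would typically be combined into a
job, created, and run (a sketch; it assumes the infrastructure builder above
is bound to a name such as ``dataflow_configs``, and the job name is a
placeholder)::

    from ads.jobs import Job

    df_job = Job(
        name="dataflow_app_name",
        infrastructure=dataflow_configs,
        runtime=runtime_config,
    )
    df_job.create()        # creates the Data Flow application
    df_run = df_job.run()  # submits a run of the application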
@@ -393,6 +399,10 @@ In the next example, ``archive_uri`` is given as an Object Storage location.
        "spark.driverEnv.myEnvVariable": "value1",
        "spark.executorEnv.myEnvVariable": "value2",
    })
+   .with_defined_tag(
+       **{"Oracle-Tags": {"CreatedBy": "test_name@oracle.com"}}
+   )
+   .with_freeform_tag(test_freeform_key="test_freeform_value")
)
runtime_config = (
    DataFlowRuntime()
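The ``spark.driverEnv.*`` and ``spark.executorEnv.*`` settings above surface
as ordinary environment variables on the driver and the executors. Inside the
submitted script they can be read as in this sketch (the variable name matches
the configuration above)::

    import os

    # Available on the driver via spark.driverEnv.myEnvVariable and on
    # executors via spark.executorEnv.myEnvVariable.
    my_value = os.environ.get("myEnvVariable")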
@@ -566,6 +576,11 @@ into the ``Job.from_yaml()`` function to build a Data Flow job:
      numExecutors: 1
      sparkVersion: 3.2.1
      privateEndpointId: <private_endpoint_ocid>
+     definedTags:
+       Oracle-Tags:
+         CreatedBy: test_name@oracle.com
+     freeformTags:
+       test_freeform_key: test_freeform_value
    type: dataFlow
  name: dataflow_app_name
  runtime:
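As a sketch of the round trip, the YAML above can be loaded back into a job
object (the file name is a placeholder, and passing a path via the ``uri``
keyword is an assumption about ``Job.from_yaml()``)::

    from ads.jobs import Job

    # Build the job from the YAML definition, then create it.
    job = Job.from_yaml(uri="dataflow_job.yaml")
    job.create()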
@@ -647,6 +662,12 @@ into the ``Job.from_yaml()`` function to build a Data Flow job:
    configuration:
      required: false
      type: dict
+   definedTags:
+     required: false
+     type: dict
+   freeformTags:
+     required: false
+     type: dict
type:
  allowed:
    - dataFlow
@@ -694,7 +715,10 @@ into the ``Job.from_yaml()`` function to build a Data Flow job:
    configuration:
      required: false
      type: dict
-   freeform_tag:
+   definedTags:
+     required: false
+     type: dict
+   freeformTags:
      required: false
      type: dict
    scriptBucket: