docs/source/user_guide/jobs (1 file changed, +42 -0)

@@ -120,6 +120,48 @@ see also `ADS Logging <../logging/logging.html>`_.
With logging configured, you can call the :py:meth:`~ads.jobs.DataScienceJobRun.watch` method to stream the logs.
+ Mounting File Systems
+ ---------------------
+
+ Data Science Job supports mounting multiple types of file systems,
+ see `Data Science Job Mounting File Systems <place_holder>`_. A maximum of 5 file systems
+ can be mounted for each Data Science Job. You can specify the list of file systems to be mounted
+ by calling :py:meth:`~ads.jobs.DataScienceJob.with_storage_mount()`. For each file system to be mounted,
+ pass a dictionary with `src` and `dest` as keys. For example, to mount OCI File Storage, pass
+ *<mount_target_ip_address>@<export_path>* as the value for `src`. The value of
+ `dest` must be the directory to which you want to mount the file system. See the example below.
+
+ .. tabs::
+
+   .. code-tab:: python
+     :caption: Python
+
+     from ads.jobs import DataScienceJob
+
+     infrastructure = (
+         DataScienceJob()
+         .with_log_group_id("<log_group_ocid>")
+         .with_log_id("<log_ocid>")
+         .with_storage_mount(
+             {
+                 "src": "<mount_target_ip_address>@<export_path>",
+                 "dest": "<destination_directory_name>"
+             }
+         )
+     )
+
+   .. code-tab:: yaml
+     :caption: YAML
+
+     kind: infrastructure
+     type: dataScienceJob
+     spec:
+       logGroupId: <log_group_ocid>
+       logId: <log_ocid>
+       storageMount:
+         - src: <mount_target_ip_address>@<export_path>
+           dest: <destination_directory_name>
+
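+ As a minimal end-to-end sketch, the infrastructure above can be attached to a job and run as follows.
+ The `PythonRuntime` configuration, the `job_script.py` source, and the conda slug are placeholders
+ for illustration; inside the run, the script can read and write files under the mounted `dest`
+ directories. To mount more than one file system, additional `src`/`dest` dictionaries can be passed
+ to the same :py:meth:`~ads.jobs.DataScienceJob.with_storage_mount()` call, up to the limit of 5.
+
+ .. code-block:: python
+
+     from ads.jobs import Job, PythonRuntime
+
+     # Attach the infrastructure (with the storage mount) to a job definition.
+     job = (
+         Job(name="job-with-file-system-mount")
+         .with_infrastructure(infrastructure)
+         .with_runtime(
+             PythonRuntime()
+             .with_source("job_script.py")                # placeholder script path
+             .with_service_conda("generalml_p38_cpu_v1")  # placeholder conda slug
+         )
+     )
+
+     job.create()     # create the job definition
+     run = job.run()  # start a job run
+     run.watch()      # stream the logs, since logging is configured above
+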
Runtime
=======