Commit ff6c294

Cleaned version

1 parent 1ce0b73 commit ff6c294

File tree

2 files changed: +3 −9 lines changed

2 files changed

+3
-9
lines changed

docs/source/user_guide/model_training/distributed_training/ray/creating.rst

Lines changed: 0 additions & 2 deletions
@@ -179,8 +179,6 @@ Do a dry run to inspect how the yaml translates to Job and Job Runs
     ads opctl run -f train.yaml --dry-run
 
 
-**Use ads opctl to create the cluster infrastructure and run the workload:**
-
 .. include:: ../_test_and_submit.rst
 
 **Monitoring the workload logs**

docs/source/user_guide/model_training/distributed_training/ray/ray.rst

Lines changed: 3 additions & 7 deletions
@@ -7,13 +7,9 @@ Ray is a framework for distributed computing in Python specialized in ML workloa
 The documentation shows how to create a container and ``yaml`` spec to run a ``Ray``
 code sample in distributed modality.
 
-
-.. admonition:: Ray
-   :class: note
-
-   ``Ray`` offers a core package to simply execute Python workloads in a distributed manner,
-   potentially across a cluster of machines (set up through ``Ray`` itself), but also other
-   extensions to perform more traditional ML computation, such as Hyperparameter Optimization.
+``Ray`` offers a core package to simply execute Python workloads in a distributed manner,
+potentially across a cluster of machines (set up through ``Ray`` itself), but also other
+extensions to perform more traditional ML computation, such as Hyperparameter Optimization.
 
 
 .. toctree::
