
Commit dde5ab5

Merge branch 'main' into ODSC-34935/add_flex_shape_opctl_config

2 parents: ae3d1ea + b380572

24 files changed: +159 / -547 lines

.github/workflows/run-unittests-default_setup.yml

Lines changed: 1 addition & 0 deletions
@@ -1,6 +1,7 @@
 name: tests/unitary/default_setup/**
 
 on:
+  workflow_dispatch:
   pull_request:
     branches:
       - main
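Both workflow diffs in this commit add the same trigger: `workflow_dispatch` enables manually starting the workflow from the GitHub Actions UI (or via the REST API), in addition to the existing pull-request trigger. A sketch of the resulting trigger block, with indentation reconstructed since the scrape dropped it:

```yaml
# Trigger block after this commit (indentation is an assumption;
# the diff above only shows the line order).
on:
  workflow_dispatch:    # manual "Run workflow" button / API dispatch
  pull_request:
    branches:
      - main
```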

.github/workflows/run-unittests.yml

Lines changed: 3 additions & 3 deletions
@@ -1,6 +1,7 @@
 name: tests/unitary/**
 
 on:
+  workflow_dispatch:
   pull_request:
     branches:
       - main
@@ -41,7 +42,7 @@ jobs:
         test-path: ["tests/unitary", "tests/unitary/with_extras/model"]
         include:
           - test-path: "tests/unitary"
-            ignore-path: "tests/unitary/with_extras/model"
+            ignore-path: "--ignore tests/unitary/with_extras/model --ignore tests/unitary/with_extras/feature_store"
             name: "unitary"
          - test-path: "tests/unitary/with_extras/model"
             name: "model"
@@ -115,8 +116,7 @@ jobs:
           # Run tests
           python -m pytest -v -p no:warnings --durations=5 \
             -n auto --dist loadfile ${{ matrix.cov-reports }} \
-            ${{ matrix.test-path }} \
-            --ignore "${{ matrix.ignore-path }}"
+            ${{ matrix.test-path }} ${{ matrix.ignore-path }}
 
       - name: "Save coverage files"
         uses: actions/upload-artifact@v3
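The quoting change in the last hunk is the point of this file's diff: the old step quoted a single path into one `--ignore` flag, while the new matrix value carries the flags themselves, so one string can hold any number of ignores. Because `${{ matrix.ignore-path }}` is interpolated unquoted, the shell splits it on whitespace. A minimal sketch of that word splitting (the paths come from the diff above; the assembled command list is illustrative):

```python
import shlex

# New-style matrix value: the flags live inside the string.
ignore_path = (
    "--ignore tests/unitary/with_extras/model "
    "--ignore tests/unitary/with_extras/feature_store"
)

# shlex.split models the shell's whitespace splitting of the
# unquoted ${{ matrix.ignore-path }} substitution.
args = ["python", "-m", "pytest", "tests/unitary"] + shlex.split(ignore_path)
print(args)
```

The old form, `--ignore "${{ matrix.ignore-path }}"`, would instead have passed the whole string as one (nonexistent) path, which is why the flag moved into the matrix value.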

ads/pipeline/ads_pipeline.py

Lines changed: 2 additions & 3 deletions
@@ -1904,7 +1904,7 @@ def __override_configurations(
         # ) as ml_step_schema_file:
         #     ml_step_schema = json.load(ml_step_schema_file)
 
-        # yaml_dict = yaml.load(Pipeline._read_from_file(uri=uri))
+        # yaml_dict = yaml.load(Pipeline._read_from_file(uri=uri), Loader=yaml.FullLoader)
 
         # pipeline_validator = Validator(pipeline_schema)
         # if not pipeline_validator.validate(yaml_dict):
@@ -1996,7 +1996,6 @@ def init(self) -> "Pipeline":
         )
 
 
-
 class DataSciencePipeline(OCIDataScienceMixin, oci.data_science.models.Pipeline):
     @classmethod
     def from_ocid(cls, ocid: str) -> "DataSciencePipeline":
@@ -2277,4 +2276,4 @@ def delete(
             operation_kwargs=operation_kwargs,
             waiter_kwargs=waiter_kwargs,
         )
-        return self.sync()
\ No newline at end of file
+        return self.sync()

ads/secrets/secrets.py

Lines changed: 2 additions & 2 deletions
@@ -1,7 +1,7 @@
 #!/usr/bin/env python
 # -*- coding: utf-8 -*--
 
-# Copyright (c) 2021, 2022 Oracle and/or its affiliates.
+# Copyright (c) 2021, 2023 Oracle and/or its affiliates.
 # Licensed under the Universal Permissive License v 1.0 as shown at https://oss.oracle.com/licenses/upl/
 
 import ads
@@ -273,7 +273,7 @@ def load_secret(
             if format.lower() == "json":
                 vault_info = json.load(vf)
             elif format.lower() in ["yaml", "yml"]:
-                vault_info = yaml.load(vf)
+                vault_info = yaml.load(vf, Loader=yaml.FullLoader)
             if not cls._validate_required_vault_attributes(vault_info):
                 logger.error(
                     f"Missing required Attributes in file {uri}: {cls.required_keys}"
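Both `yaml.load` changes in this commit address the same PyYAML deprecation: since PyYAML 5.1, calling `yaml.load()` without an explicit `Loader` emits a warning, because the legacy default loader could construct arbitrary Python objects from untrusted input. A minimal sketch of the fixed call, assuming PyYAML is installed and using a made-up vault document (the keys are illustrative, not the exact attributes ADS requires):

```python
import yaml

# Hypothetical vault config; a real file would carry actual OCIDs.
doc = "vault_id: ocid1.vault.oc1..example\nkey_id: ocid1.key.oc1..example\n"

# FullLoader resolves standard YAML tags but refuses arbitrary
# Python object construction, unlike the legacy default loader.
vault_info = yaml.load(doc, Loader=yaml.FullLoader)

# For fully untrusted input, yaml.safe_load() is the stricter
# shortcut; for plain mappings like this it yields the same dict.
assert vault_info == yaml.safe_load(doc)
print(vault_info["vault_id"])
```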

docs/source/ads.explanations.rst

Lines changed: 2 additions & 0 deletions
@@ -0,0 +1,2 @@
+ads.explanations package
+========================

docs/source/ads.jobs.rst

Lines changed: 0 additions & 6 deletions
@@ -1,12 +1,6 @@
 ads.jobs package
 ================
 
-.. toctree::
-   :maxdepth: 3
-
-   ads.jobs
-
-
 Subpackages
 -----------
 

docs/source/conf.py

Lines changed: 8 additions & 0 deletions
@@ -173,3 +173,11 @@
 todo_include_todos = True
 
 mathjax_path = "math_jax_3_2_0.js"
+
+# This css will be included in htlm pages to process where we add .. raw:: html for nb cell nice outputs with
+# <div class="nboutput nblast docutils container"> and
+# <div class="output_area rendered_html docutils container">
+# See ads_tuner.rst dataframe output in .. raw:: html sections
+html_css_files = [
+    "nbsphinx-code-cells.css"
+]

docs/source/index.rst

Lines changed: 1 addition & 0 deletions
@@ -97,6 +97,7 @@ Oracle Accelerated Data Science (ADS)
 `https://github.com/oracle/accelerated-data-science <https://github.com/oracle/accelerated-data-science>`_
 
 .. code-block:: python3
+
     >>> import ads
     >>> ads.hello()
 

docs/source/user_guide/big_data_service/file_management.rst

Lines changed: 3 additions & 4 deletions
@@ -71,7 +71,7 @@ Upload
 ------
 
 The `.put() <https://filesystem-spec.readthedocs.io/en/latest/api.html#fsspec.spec.AbstractFileSystem.put>`_ method is used to upload files from local storage to HDFS. The first parameter is the local path of the files to upload. The second parameter is the HDFS path where the files are to be stored.
-`.upload() <https://filesystem-spec.readthedocs.io/en/latest/api.html#fsspec.spec.AbstractFileSystem.upload>`_ is an alias of `.put()`.
+`.upload() <https://filesystem-spec.readthedocs.io/en/latest/api.html#fsspec.spec.AbstractFileSystem.upload>`_ is an alias of ``.put()``.
 .. code-block:: python3
 
     fs.put(
@@ -82,7 +82,7 @@ The `.put() <https://filesystem-spec.readthedocs.io/en/latest/api.html#fsspec.sp
 Ibis
 ====
 
-`Ibis <https://github.com/ibis-project/ibis>`_ is an open-source library by `Cloudera <https://www.cloudera.com/>`_ that provides a Python framework to access data and perform analytical computations from different sources. Ibis allows access to the data ising HDFS. You use the ``ibis.impala.hdfs_connect()`` method to make a connection to HDFS, and it returns a handler. This handler has methods such as ``.ls()`` to list, ``.get()`` to download, ``.put()`` to upload, and ``.rm()`` to delete files. These operations support globbing. Ibis' HDFS connector supports a variety of `additional operations <https://ibis-project.org/docs/dev/backends/Impala/#hdfs-interaction>`_.
+`Ibis <https://github.com/ibis-project/ibis>`_ is an open-source library by `Cloudera <https://www.cloudera.com/>`_ that provides a Python framework to access data and perform analytical computations from different sources. Ibis allows access to the data ising HDFS. You use the ``ibis.impala.hdfs_connect()`` method to make a connection to HDFS, and it returns a handler. This handler has methods such as ``.ls()`` to list, ``.get()`` to download, ``.put()`` to upload, and ``.rm()`` to delete files. These operations support globbing. Ibis' HDFS connector supports a variety of `additional operations <https://ibis-project.org/backends/impala/#hdfs-interaction>`_.
 
 Connect
 -------
@@ -159,7 +159,7 @@ Use the `.put() <https://filesystem-spec.readthedocs.io/en/latest/api.html#fsspe
 Pandas
 ======
 
-Pandas allows access to BDS' HDFS system through :ref: `FSSpec`. This section demonstrates some common operations.
+Pandas allows access to BDS' HDFS system through :ref:`FSSpec`. This section demonstrates some common operations.
 
 Connect
 -------
@@ -259,4 +259,3 @@ The following sample code shows several different PyArrow methods for working wi
     table = pa.Table.from_pandas(df)
     pq.write_to_dataset(table, root_path="/path/on/BDS/HDFS", partition_cols=["dt"],
                         flavor="spark", filesystem=fs)
-

docs/source/user_guide/big_data_service/sql_data_management.rst

Lines changed: 1 addition & 3 deletions
@@ -13,7 +13,7 @@ Ibis
 Connect
 -------
 
-Obtaining a Kerberos ticket, depending on your system configuration, you may need to define the ``ibis.options.impala.temp_db`` and ``ibis.options.impala.temp_hdfs_path`` options. The ``ibis.impala.connect()`` method makes a connection to the `Impala execution backend <https://ibis-project.org/docs/dev/backends/Impala/>`_. The ``.sql()`` allows you to run SQL commands on the data.
+Obtaining a Kerberos ticket, depending on your system configuration, you may need to define the ``ibis.options.impala.temp_db`` and ``ibis.options.impala.temp_hdfs_path`` options. The ``ibis.impala.connect()`` method makes a connection to the `Impala execution backend <https://ibis-project.org/backends/impala/>`_. The ``.sql()`` allows you to run SQL commands on the data.
 
 .. code-block:: python3
 
@@ -167,5 +167,3 @@ It is important to close sessions when you don't need them anymore. This frees u
 .. code-block:: python3
 
     cursor.close()
-
-
