diff --git a/sdk/ai/azure-ai-projects/AGENTS_MIGRATION_GUIDE.md b/sdk/ai/azure-ai-projects/AGENTS_MIGRATION_GUIDE.md index a503988f9e4e..696535277984 100644 --- a/sdk/ai/azure-ai-projects/AGENTS_MIGRATION_GUIDE.md +++ b/sdk/ai/azure-ai-projects/AGENTS_MIGRATION_GUIDE.md @@ -1,5 +1,6 @@ -# Agents migration guide from Hub-based projects to Endpoint-based projects. -This guide describes migration from hub-based to Endpoint-based projects. To create a Endpoint-based project, please use one of the deployment scripts on [foundry samples repository](https://github.com/azure-ai-foundry/foundry-samples/tree/main/samples/microsoft/infrastructure-setup) appropriate for your scenario, also you can use Azure AI Foundry UI. The support of hub-based projects was dropped in `azure-ai-projects` version `1.0.0b11`. In this document, we show the operation implementation of before `1.0.0b11` in **Hub-based** secion, followed by code for `azure-ai-projects` version `1.0.0b11` or later in **Endpoint-based**. +# Agents migration guide from Hub-based projects to Endpoint-based projects + +This guide describes migration from hub-based to Endpoint-based projects. To create an Endpoint-based project, use one of the deployment scripts in the [foundry samples repository](https://github.com/azure-ai-foundry/foundry-samples/tree/main/samples/microsoft/infrastructure-setup) appropriate for your scenario, or use the Azure AI Foundry UI. Support for hub-based projects was dropped in `azure-ai-projects` version `1.0.0b11`. In this document, we show how each operation was implemented before `1.0.0b11` in the **Hub-based** section, followed by code for `azure-ai-projects` version `1.0.0b11` or later in the **Endpoint-based** section. ## Import changes @@ -109,9 +110,11 @@ Files Operations | `project_client.agents.delete_file` | `project_client.agents.files.delete` | ## API changes + 1. Create project. The connection string is replaced by the endpoint.
The project endpoint URL has the form https://\<account-name\>.services.ai.azure.com/api/projects/\<project-name\>. It can be found on your Azure AI Foundry Project overview page. **Hub-based** + ```python project_client = AIProjectClient.from_connection_string( credential=DefaultAzureCredential(), @@ -120,12 +123,15 @@ Files Operations ``` **Endpoint-based** + ```python project_client = AIProjectClient(endpoint=endpoint, credential=DefaultAzureCredential()) ``` + 2. Create an agent. In new versions of the SDK, the agent can be created using the project client or directly via the `AgentsClient` constructor. In the code below, `project_client.agents` is an `AgentsClient` instance, so `project_client.agents` and `agents_client` can be used interchangeably. For simplicity, we will use `project_client.agents`. **Hub-based** + ```python agent = project_client.agents.create_agent( model= "gpt-4o", @@ -133,9 +139,11 @@ Files Operations instructions="You are helpful assistant", ) ``` - **Endpoint-based** - Agent is instantiated using `AIProjectClient ` + **Endpoint-based** + + Agent is instantiated using `AIProjectClient`: + ```python agent = project_client.agents.create_agent( model="gpt-4o", @@ -145,6 +153,7 @@ Files Operations ``` Agent is instantiated using the `AgentsClient` constructor: + ```python from azure.ai.agents import AgentsClient @@ -158,9 +167,11 @@ Files Operations instructions="You are helpful agent", ) ``` + 3. List agents. The new version of the SDK allows more convenient listing of threads, messages, and agents by returning `ItemPaged` and `AsyncItemPaged`. The returned items are split into pages, which can be retrieved by the user one after another. Below we demonstrate this mechanism for agents. The `limit` parameter defines the number of items per page. This example also applies to listing threads, runs, run steps, vector stores, files in a vector store, and messages.
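The `ItemPaged`/`AsyncItemPaged` mechanism can be sketched with a minimal pure-Python stand-in built on the same `(get_next, extract_data)` callback contract the generated SDK code uses. The `FakeItemPaged` class and the fake page data below are illustrative only, not part of `azure-core` or this SDK:

```python
from typing import Iterable, Iterator, Optional, Tuple

# Fake "service" data: pages of agent names keyed by continuation token.
PAGES = {
    None: (["agent-1", "agent-2"], "tok-1"),
    "tok-1": (["agent-3", "agent-4"], "tok-2"),
    "tok-2": (["agent-5"], None),
}

def get_next(continuation_token: Optional[str]) -> Tuple[list, Optional[str]]:
    # In the real SDK this issues an HTTP request using the continuation token.
    return PAGES[continuation_token]

def extract_data(response) -> Tuple[Optional[str], Iterable[str]]:
    # Returns (next continuation token, iterator of items on this page).
    items, next_token = response
    return next_token, iter(items)

class FakeItemPaged:
    """Minimal stand-in for azure.core.paging.ItemPaged (illustrative only)."""

    def __init__(self, get_next, extract_data):
        self._get_next = get_next
        self._extract_data = extract_data

    def by_page(self) -> Iterator[Iterator[str]]:
        token = None
        while True:
            next_token, items = self._extract_data(self._get_next(token))
            yield items
            if next_token is None:
                break
            token = next_token

    def __iter__(self) -> Iterator[str]:
        # Iterating the pager directly flattens all pages into one stream.
        for page in self.by_page():
            yield from page

pager = FakeItemPaged(get_next, extract_data)
print(list(pager))                          # all items across pages
print([list(p) for p in pager.by_page()])   # items grouped per page
```

Iterating the pager directly walks every item; `by_page()` exposes the page boundaries, which is what the `limit` parameter controls in the real list methods.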
**Hub-based** + ```python has_more = True last_id = None @@ -173,6 +184,7 @@ Files Operations ``` **Endpoint-based** + ```python agents = project_client.agents.list_agents(limit=2) # Iterate items by page. Each page will contain at most two items. @@ -190,12 +202,14 @@ Files Operations 4. Delete agent. In `azure-ai-projects` versions prior to 1.0.0b11, all deletion operations returned a deletion status; for example, deleting an agent returned `AgentDeletionStatus`. In 1.0.0b11 and later, these operations do not return a value. **Hub-based** + ```python deletion_status = project_client.agents.delete_agent(agent.id) print(deletion_status.deleted) ``` **Endpoint-based** + ```python project_client.agents.delete_agent(agent.id) ``` @@ -203,16 +217,21 @@ Files Operations 5. Create a thread. **Hub-based** + ```python thread = project_client.agents.create_thread() ``` + **Endpoint-based** + ```python thread = project_client.agents.threads.create() ``` + 6. List threads. **Hub-based** + ```python with project_client: last_id = None @@ -229,6 +248,7 @@ Files Operations ``` **Endpoint-based** + ```python threads = project_client.agents.threads.list(limit=2) # Iterate items by page. Each page will contain at most two items. @@ -246,42 +266,52 @@ Files Operations 7. Delete the thread. In the previous SDK, thread deletion returned a `ThreadDeletionStatus` object; in the new version, it does not return a value. **Hub-based** + ```python delete_status = project_client.agents.delete_thread(thread_id) print(delete_status.deleted) ``` **Endpoint-based** + ```python project_client.agents.threads.delete(thread_id) ``` + 8. Create a message on a thread. **Hub-based** + ```python message = project_client.agents.create_message(thread_id=thread.id, role="user", content="The message text.") ``` **Endpoint-based** + ```python message = project_client.agents.messages.create(thread_id=thread.id, role="user", content="The message text.") ``` + 9. Create and get the run.
**Hub-based** + ```python run = project_client.agents.create_run(thread_id=thread.id, agent_id=agent.id) run = project_client.agents.get_run(thread_id=thread.id, run_id=run.id) ``` **Endpoint-based** + ```python run = project_client.agents.runs.create(thread_id=thread.id, agent_id=agent.id) run = project_client.agents.runs.get(thread_id=thread.id, run_id=run.id) ``` + 10. List runs. **Hub-based** + ```python has_more = True last_id = None @@ -294,6 +324,7 @@ Files Operations ``` **Endpoint-based** + ```python runs = project_client.agents.runs.list(thread.id) for one_run in runs: @@ -303,6 +334,7 @@ Files Operations 11. List run steps. **Hub-based** + ```python has_more = True last_id = None @@ -315,6 +347,7 @@ Files Operations ``` **Endpoint-based** + ```python run_steps = project_client.agents.run_steps.list(thread.id, run.id) for one_run_step in run_steps: @@ -324,6 +357,7 @@ Files Operations 12. Using streams. **Hub-based** + ```python with project_client.agents.create_stream(thread_id=thread.id, agent_id=agent.id) as stream: for event_type, event_data, func_return in stream: @@ -334,6 +368,7 @@ Files Operations ``` **Endpoint-based** + ```python with project_client.agents.runs.stream(thread_id=thread.id, agent_id=agent.id, event_handler=MyEventHandler()) as stream: for event_type, event_data, func_return in stream: @@ -346,6 +381,7 @@ Files Operations 13. List messages. **Hub-based** + ```python messages = project_client.agents.list_messages(thread_id=thread.id) # For brevity, the code below assumes that all messages fit on one page. @@ -356,6 +392,7 @@ Files Operations ``` **Endpoint-based** + ```python messages = project_client.agents.messages.list(thread_id=thread.id) for msg in messages: @@ -363,9 +400,11 @@ Files Operations last_text = msg.text_messages[-1] print(f"{msg.role}: {last_text.text.value}") ``` + 14. Create, list, and delete files are now handled by file operations; again, the delete call in the new SDK version does not return a value.
**Hub-based** + ```python # Create file file = project_client.agents.upload_file_and_poll(file_path="product_info_1.md", purpose=FilePurpose.AGENTS) @@ -379,6 +418,7 @@ Files Operations ``` **Endpoint-based** + ```python # Create file file = project_client.agents.files.upload_and_poll(file_path=asset_file_path, purpose=FilePurpose.AGENTS) @@ -389,9 +429,11 @@ Files Operations # Delete file. project_client.agents.files.delete(file_id=file.id) ``` + 15. Create vector stores, list vector store files, and delete vector stores. **Hub-based** + ```python # Create a vector store with a file and wait for it to be processed vector_store = project_client.agents.create_vector_store_and_poll(file_ids=[file.id], name="sample_vector_store") @@ -421,6 +463,7 @@ Files Operations ``` **Endpoint-based** + ```python # Create a vector store with a file and wait for it to be processed vector_store = project_client.agents.vector_stores.create_and_poll(file_ids=[file.id], name="my_vectorstore") @@ -441,6 +484,7 @@ Files Operations 16. Vector store batch file search. **Hub-based** + ```python # Batch upload files vector_store_file_batch = project_client.agents.create_vector_store_file_batch_and_poll( @@ -460,6 +504,7 @@ Files Operations ``` **Endpoint-based** + ```python # Batch upload files vector_store_file_batch = project_client.agents.vector_store_file_batches.create_and_poll( diff --git a/sdk/ai/azure-ai-projects/CHANGELOG.md b/sdk/ai/azure-ai-projects/CHANGELOG.md index 6b0c9b52be2f..b212c9c4d377 100644 --- a/sdk/ai/azure-ai-projects/CHANGELOG.md +++ b/sdk/ai/azure-ai-projects/CHANGELOG.md @@ -1,5 +1,36 @@ # Release History +## 1.0.0b12 (Unreleased) + +### Breaking Changes + +* These three methods on `AIProjectClient` were removed: `.inference.get_chat_completions_client()`, +`.inference.get_embeddings_client()`, `.inference.get_image_embeddings_client()`.
+For guidance on obtaining an authenticated `azure-ai-inference` client for your AI Foundry Project, +refer to the updated samples in the `samples\inference\azure-ai-inference` directory, for example, +`sample_chat_completions_with_azure_ai_inference_client.py`. Alternatively, use the `.inference.get_azure_openai_client()` method +to get an authenticated `AzureOpenAI` client from the `openai` package, and perform chat completions. See samples +in the folder `samples\inference\azure-openai`. +* Method argument name changes: + * In method `.indexes.create_or_update()`, argument `body` was renamed to `index`. + * In method `.datasets.create_or_update()`, argument `body` was renamed to `dataset_version`. + * In method `.datasets.pending_upload()`, argument `body` was renamed to `pending_upload_request`. + +### Bugs Fixed + +* Fixed the package function `enable_telemetry()` to correctly instrument `azure-ai-agents`. + +### Sample Updates + +* As mentioned above, the samples in the `samples\inference` folder were updated. They were moved into two new +sub-folders: one showing usage of the `AzureOpenAI` client (no changes there), the other showing usage of the +clients from the `azure-ai-inference` package (changes were made there). + +### Other + +* Set dependency on `azure-ai-agents` version `1.0.0` or above, +now that we have a stable release of the Agents package. + ## 1.0.0b11 (2025-05-15) There have been significant updates with the release of version 1.0.0b11, including breaking changes. diff --git a/sdk/ai/azure-ai-projects/README.md b/sdk/ai/azure-ai-projects/README.md index 1dda36fb6505..529ebff94745 100644 --- a/sdk/ai/azure-ai-projects/README.md +++ b/sdk/ai/azure-ai-projects/README.md @@ -9,10 +9,10 @@ resources in your Azure AI Foundry Project. Use it to: * **Enumerate connected Azure resources** in your Foundry project using the `.connections` operations. * **Upload documents and create Datasets** to reference them using the `.datasets` operations.
* **Create and enumerate Search Indexes** using the `.indexes` operations. -* **Get an Azure AI Inference client** for chat completions, text or image embeddings using the `.inference` operations. * **Read a Prompty file or string** and render messages for inference clients, using the `PromptTemplate` class. * **Run Evaluations** to assess the performance of generative AI applications, using the `evaluations` operations. -* **Enable OpenTelemetry tracing** using the `enable_telemetry` function. + +The client library uses version `2025-05-15-preview` of the AI Foundry [data plane REST APIs](https://aka.ms/azsdk/azure-ai-projects/rest-api-reference). > **Note:** There have been significant updates with the release of version 1.0.0b11, including breaking changes. Please see the new code snippets below and the samples folder. Agents are now implemented in a separate package `azure-ai-agents` @@ -95,7 +95,7 @@ project_client = AIProjectClient.from_connection_string( ### Performing Agent operations -The `.agents` property on the `AIProjectsClient` gives you access to an authenticated `AgentsClient` from the `azure-ai-agents` package. Below we show how to create an Agent and delete it. To see what you can do with the `agent` you created, see the [many samples](https://github.com/Azure/azure-sdk-for-python/tree/main/sdk/ai/azure-ai-agents/samples) associated with the `azure-ai-agents` package. +The `.agents` property on the `AIProjectClient` gives you access to an authenticated `AgentsClient` from the `azure-ai-agents` package. Below we show how to create an Agent and delete it. To see what you can do with the Agent you created, see the [many samples](https://github.com/Azure/azure-sdk-for-python/tree/main/sdk/ai/azure-ai-agents/samples) and the [README.md](https://github.com/Azure/azure-sdk-for-python/tree/main/sdk/ai/azure-ai-agents) file of the dependent `azure-ai-agents` package. The code below assumes `model_deployment_name` (a string) is defined.
It's the deployment name of an AI model in your Foundry Project, as shown in the "Models + endpoints" tab, under the "Name" column. @@ -120,9 +124,13 @@ print("Deleted agent") ### Get an authenticated AzureOpenAI client -Your Azure AI Foundry project may have one or more OpenAI models deployed that support chat completions. Use the code below to get an authenticated [AzureOpenAI](https://github.com/openai/openai-python?tab=readme-ov-file#microsoft-azure-openai) from the [openai](https://pypi.org/project/openai/) package, and execute a chat completions call. +Your Azure AI Foundry project may have one or more AI models deployed that support chat completions. +These could be OpenAI models, Microsoft models, or models from other providers. +Use the code below to get an authenticated [AzureOpenAI](https://github.com/openai/openai-python?tab=readme-ov-file#microsoft-azure-openai) +from the [openai](https://pypi.org/project/openai/) package, and execute a chat completions call. -The code below assumes `model_deployment_name` (a string) is defined. It's the deployment name of an AI model in your Foundry Project, or a connected Azure OpenAI resource. As shown in the "Models + endpoints" tab, under the "Name" column. +The code below assumes `model_deployment_name` (a string) is defined. It's the deployment name of an AI model in your +Foundry Project, or a connected Azure OpenAI resource, as shown in the "Models + endpoints" tab, under the "Name" column. Update the `api_version` value with one found in the "Data plane - inference" row [in this table](https://learn.microsoft.com/azure/ai-services/openai/reference#api-specs). @@ -170,34 +174,6 @@ with project_client.inference.get_azure_openai_client( See the "inference" folder in the [package samples][samples] for additional samples. -### Get an authenticated ChatCompletionsClient - -Your Azure AI Foundry project may have one or more AI models deployed that support chat completions.
These could be OpenAI models, Microsoft models, or models from other providers. Use the code below to get an authenticated [ChatCompletionsClient](https://learn.microsoft.com/python/api/azure-ai-inference/azure.ai.inference.chatcompletionsclient) from the [azure-ai-inference](https://pypi.org/project/azure-ai-inference/) package, and execute a chat completions call. - -First, install the package: - -```bash -pip install azure-ai-inference -``` - -Then run the code below. Here we assume `model_deployment_name` (a string) is defined. It's the deployment name of an AI model in your Foundry Project, as shown in the "Models + endpoints" tab, under the "Name" column. - - - -```python -with project_client.inference.get_chat_completions_client() as client: - - response = client.complete( - model=model_deployment_name, messages=[UserMessage(content="How many feet are in a mile?")] - ) - - print(response.choices[0].message.content) -``` - - - -See the "inference" folder in the [package samples][samples] for additional samples, including getting an authenticated [EmbeddingsClient](https://learn.microsoft.com/python/api/azure-ai-inference/azure.ai.inference.embeddingsclient) and [ImageEmbeddingsClient](https://learn.microsoft.com/python/api/azure-ai-inference/azure.ai.inference.imageembeddingsclient). - ### Deployments operations The code below shows some Deployments operations, which allow you to enumerate the AI models deployed to your AI Foundry Projects. These models can be seen in the "Models + endpoints" tab in your AI Foundry Project. Full samples can be found under the "deployment" folder in the [package samples][samples]. 
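As a sketch of how such an enumeration might be wrapped for display, the helper below formats a list of deployments. The `Deployment` dataclass and `summarize_deployments` function are hypothetical (they are not the SDK's models or API); in real code you would pass `project_client.deployments.list()` in place of the stub list:

```python
from dataclasses import dataclass
from typing import Iterable, List

@dataclass
class Deployment:
    # Illustrative fields mirroring what the "Models + endpoints" tab shows;
    # these are NOT the exact attribute names of the SDK's deployment model.
    name: str
    model_publisher: str

def summarize_deployments(deployments: Iterable[Deployment]) -> List[str]:
    """Hypothetical helper: format each deployment as 'name (publisher)'."""
    return [f"{d.name} ({d.model_publisher})" for d in deployments]

# Stubbed data standing in for the service response:
stub = [Deployment("gpt-4o", "OpenAI"), Deployment("Phi-4", "Microsoft")]
print(summarize_deployments(stub))  # ['gpt-4o (OpenAI)', 'Phi-4 (Microsoft)']
```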
@@ -330,7 +306,7 @@ print( index = project_client.indexes.create_or_update( name=index_name, version=index_version, - body=AzureAISearchIndex(connection_name=ai_search_connection_name, index_name=ai_search_index_name), + index=AzureAISearchIndex(connection_name=ai_search_connection_name, index_name=ai_search_index_name), ) print(index) diff --git a/sdk/ai/azure-ai-projects/_metadata.json b/sdk/ai/azure-ai-projects/_metadata.json new file mode 100644 index 000000000000..253921f335be --- /dev/null +++ b/sdk/ai/azure-ai-projects/_metadata.json @@ -0,0 +1,3 @@ +{ + "apiVersion": "2025-05-15-preview" +} \ No newline at end of file diff --git a/sdk/ai/azure-ai-projects/apiview-properties.json b/sdk/ai/azure-ai-projects/apiview-properties.json index 29de53e42619..bc33249851dd 100644 --- a/sdk/ai/azure-ai-projects/apiview-properties.json +++ b/sdk/ai/azure-ai-projects/apiview-properties.json @@ -55,30 +55,30 @@ "azure.ai.projects.aio.operations.EvaluationsOperations.create": "Azure.AI.Projects.Evaluations.create", "azure.ai.projects.operations.EvaluationsOperations.create_agent_evaluation": "Azure.AI.Projects.Evaluations.createAgentEvaluation", "azure.ai.projects.aio.operations.EvaluationsOperations.create_agent_evaluation": "Azure.AI.Projects.Evaluations.createAgentEvaluation", - "azure.ai.projects.operations.DatasetsOperations.list_versions": "Azure.AI.Projects.ServicePatterns.Datasets.listVersions", - "azure.ai.projects.aio.operations.DatasetsOperations.list_versions": "Azure.AI.Projects.ServicePatterns.Datasets.listVersions", - "azure.ai.projects.operations.DatasetsOperations.list": "Azure.AI.Projects.ServicePatterns.Datasets.listLatest", - "azure.ai.projects.aio.operations.DatasetsOperations.list": "Azure.AI.Projects.ServicePatterns.Datasets.listLatest", - "azure.ai.projects.operations.DatasetsOperations.get": "Azure.AI.Projects.ServicePatterns.Datasets.getVersion", - "azure.ai.projects.aio.operations.DatasetsOperations.get": 
"Azure.AI.Projects.ServicePatterns.Datasets.getVersion", - "azure.ai.projects.operations.DatasetsOperations.delete": "Azure.AI.Projects.ServicePatterns.Datasets.deleteVersion", - "azure.ai.projects.aio.operations.DatasetsOperations.delete": "Azure.AI.Projects.ServicePatterns.Datasets.deleteVersion", - "azure.ai.projects.operations.DatasetsOperations.create_or_update": "Azure.AI.Projects.ServicePatterns.Datasets.createOrUpdateVersion", - "azure.ai.projects.aio.operations.DatasetsOperations.create_or_update": "Azure.AI.Projects.ServicePatterns.Datasets.createOrUpdateVersion", + "azure.ai.projects.operations.DatasetsOperations.list_versions": "Azure.AI.Projects.Datasets.listVersions", + "azure.ai.projects.aio.operations.DatasetsOperations.list_versions": "Azure.AI.Projects.Datasets.listVersions", + "azure.ai.projects.operations.DatasetsOperations.list": "Azure.AI.Projects.Datasets.listLatest", + "azure.ai.projects.aio.operations.DatasetsOperations.list": "Azure.AI.Projects.Datasets.listLatest", + "azure.ai.projects.operations.DatasetsOperations.get": "Azure.AI.Projects.Datasets.getVersion", + "azure.ai.projects.aio.operations.DatasetsOperations.get": "Azure.AI.Projects.Datasets.getVersion", + "azure.ai.projects.operations.DatasetsOperations.delete": "Azure.AI.Projects.Datasets.deleteVersion", + "azure.ai.projects.aio.operations.DatasetsOperations.delete": "Azure.AI.Projects.Datasets.deleteVersion", + "azure.ai.projects.operations.DatasetsOperations.create_or_update": "Azure.AI.Projects.Datasets.createOrUpdateVersion", + "azure.ai.projects.aio.operations.DatasetsOperations.create_or_update": "Azure.AI.Projects.Datasets.createOrUpdateVersion", "azure.ai.projects.operations.DatasetsOperations.pending_upload": "Azure.AI.Projects.Datasets.startPendingUploadVersion", "azure.ai.projects.aio.operations.DatasetsOperations.pending_upload": "Azure.AI.Projects.Datasets.startPendingUploadVersion", "azure.ai.projects.operations.DatasetsOperations.get_credentials": 
"Azure.AI.Projects.Datasets.getCredentials", "azure.ai.projects.aio.operations.DatasetsOperations.get_credentials": "Azure.AI.Projects.Datasets.getCredentials", - "azure.ai.projects.operations.IndexesOperations.list_versions": "Azure.AI.Projects.ServicePatterns.Indexes.listVersions", - "azure.ai.projects.aio.operations.IndexesOperations.list_versions": "Azure.AI.Projects.ServicePatterns.Indexes.listVersions", - "azure.ai.projects.operations.IndexesOperations.list": "Azure.AI.Projects.ServicePatterns.Indexes.listLatest", - "azure.ai.projects.aio.operations.IndexesOperations.list": "Azure.AI.Projects.ServicePatterns.Indexes.listLatest", - "azure.ai.projects.operations.IndexesOperations.get": "Azure.AI.Projects.ServicePatterns.Indexes.getVersion", - "azure.ai.projects.aio.operations.IndexesOperations.get": "Azure.AI.Projects.ServicePatterns.Indexes.getVersion", - "azure.ai.projects.operations.IndexesOperations.delete": "Azure.AI.Projects.ServicePatterns.Indexes.deleteVersion", - "azure.ai.projects.aio.operations.IndexesOperations.delete": "Azure.AI.Projects.ServicePatterns.Indexes.deleteVersion", - "azure.ai.projects.operations.IndexesOperations.create_or_update": "Azure.AI.Projects.ServicePatterns.Indexes.createOrUpdateVersion", - "azure.ai.projects.aio.operations.IndexesOperations.create_or_update": "Azure.AI.Projects.ServicePatterns.Indexes.createOrUpdateVersion", + "azure.ai.projects.operations.IndexesOperations.list_versions": "Azure.AI.Projects.Indexes.listVersions", + "azure.ai.projects.aio.operations.IndexesOperations.list_versions": "Azure.AI.Projects.Indexes.listVersions", + "azure.ai.projects.operations.IndexesOperations.list": "Azure.AI.Projects.Indexes.listLatest", + "azure.ai.projects.aio.operations.IndexesOperations.list": "Azure.AI.Projects.Indexes.listLatest", + "azure.ai.projects.operations.IndexesOperations.get": "Azure.AI.Projects.Indexes.getVersion", + "azure.ai.projects.aio.operations.IndexesOperations.get": 
"Azure.AI.Projects.Indexes.getVersion", + "azure.ai.projects.operations.IndexesOperations.delete": "Azure.AI.Projects.Indexes.deleteVersion", + "azure.ai.projects.aio.operations.IndexesOperations.delete": "Azure.AI.Projects.Indexes.deleteVersion", + "azure.ai.projects.operations.IndexesOperations.create_or_update": "Azure.AI.Projects.Indexes.createOrUpdateVersion", + "azure.ai.projects.aio.operations.IndexesOperations.create_or_update": "Azure.AI.Projects.Indexes.createOrUpdateVersion", "azure.ai.projects.operations.DeploymentsOperations.get": "Azure.AI.Projects.Deployments.get", "azure.ai.projects.aio.operations.DeploymentsOperations.get": "Azure.AI.Projects.Deployments.get", "azure.ai.projects.operations.DeploymentsOperations.list": "Azure.AI.Projects.Deployments.list", diff --git a/sdk/ai/azure-ai-projects/assets.json b/sdk/ai/azure-ai-projects/assets.json new file mode 100644 index 000000000000..752d2238c55f --- /dev/null +++ b/sdk/ai/azure-ai-projects/assets.json @@ -0,0 +1,6 @@ +{ + "AssetsRepo": "Azure/azure-sdk-assets", + "AssetsRepoPrefixPath": "python", + "TagPrefix": "python/ai/azure-ai-projects", + "Tag": "python/ai/azure-ai-projects_25a915bc4c" +} diff --git a/sdk/ai/azure-ai-projects/azure/ai/projects/_client.py b/sdk/ai/azure-ai-projects/azure/ai/projects/_client.py index 217cb41b875a..4f134a04a6b9 100644 --- a/sdk/ai/azure-ai-projects/azure/ai/projects/_client.py +++ b/sdk/ai/azure-ai-projects/azure/ai/projects/_client.py @@ -29,7 +29,7 @@ from azure.core.credentials import TokenCredential -class AIProjectClient: # pylint: disable=too-many-instance-attributes +class AIProjectClient: """AIProjectClient. 
:ivar connections: ConnectionsOperations operations diff --git a/sdk/ai/azure-ai-projects/azure/ai/projects/_patch.py b/sdk/ai/azure-ai-projects/azure/ai/projects/_patch.py index 5da57d405f3d..8920a9e160cf 100644 --- a/sdk/ai/azure-ai-projects/azure/ai/projects/_patch.py +++ b/sdk/ai/azure-ai-projects/azure/ai/projects/_patch.py @@ -14,7 +14,6 @@ from ._client import AIProjectClient as AIProjectClientGenerated from .operations import TelemetryOperations, InferenceOperations from ._patch_prompts import PromptTemplate -from ._patch_telemetry import enable_telemetry _console_logging_enabled: bool = os.environ.get("ENABLE_AZURE_AI_PROJECTS_CONSOLE_LOGGING", "False").lower() in ( "true", @@ -133,7 +132,6 @@ def __exit__(self, *exc_details: Any) -> None: __all__: List[str] = [ "AIProjectClient", "PromptTemplate", - "enable_telemetry", ] # Add all objects you want publicly available to users at this package level diff --git a/sdk/ai/azure-ai-projects/azure/ai/projects/_version.py b/sdk/ai/azure-ai-projects/azure/ai/projects/_version.py index 319889e447e0..46f199f51a87 100644 --- a/sdk/ai/azure-ai-projects/azure/ai/projects/_version.py +++ b/sdk/ai/azure-ai-projects/azure/ai/projects/_version.py @@ -6,4 +6,4 @@ # Changes may cause incorrect behavior and will be lost if the code is regenerated. # -------------------------------------------------------------------------- -VERSION = "1.0.0b11" +VERSION = "1.0.0b12" diff --git a/sdk/ai/azure-ai-projects/azure/ai/projects/aio/_client.py b/sdk/ai/azure-ai-projects/azure/ai/projects/aio/_client.py index 7fc978f8c178..52a42cebdeab 100644 --- a/sdk/ai/azure-ai-projects/azure/ai/projects/aio/_client.py +++ b/sdk/ai/azure-ai-projects/azure/ai/projects/aio/_client.py @@ -29,7 +29,7 @@ from azure.core.credentials_async import AsyncTokenCredential -class AIProjectClient: # pylint: disable=too-many-instance-attributes +class AIProjectClient: """AIProjectClient. 
:ivar connections: ConnectionsOperations operations diff --git a/sdk/ai/azure-ai-projects/azure/ai/projects/aio/operations/_operations.py b/sdk/ai/azure-ai-projects/azure/ai/projects/aio/operations/_operations.py index 45f5d3d15a03..71fcc6e72e5f 100644 --- a/sdk/ai/azure-ai-projects/azure/ai/projects/aio/operations/_operations.py +++ b/sdk/ai/azure-ai-projects/azure/ai/projects/aio/operations/_operations.py @@ -9,7 +9,7 @@ from collections.abc import MutableMapping from io import IOBase import json -from typing import Any, AsyncIterable, Callable, Dict, IO, List, Optional, TypeVar, Union, overload +from typing import Any, Callable, Dict, IO, List, Optional, TypeVar, Union, overload import urllib.parse from azure.core import AsyncPipelineClient @@ -221,7 +221,7 @@ def list( connection_type: Optional[Union[str, _models.ConnectionType]] = None, default_connection: Optional[bool] = None, **kwargs: Any - ) -> AsyncIterable["_models.Connection"]: + ) -> AsyncItemPaged["_models.Connection"]: """List all connections in the project, without populating connection credentials. :keyword connection_type: List connections of this specific type. Known values are: @@ -403,7 +403,7 @@ async def get(self, name: str, **kwargs: Any) -> _models.Evaluation: method_added_on="2025-05-15-preview", params_added_on={"2025-05-15-preview": ["api_version", "client_request_id", "accept"]}, ) - def list(self, **kwargs: Any) -> AsyncIterable["_models.Evaluation"]: + def list(self, **kwargs: Any) -> AsyncItemPaged["_models.Evaluation"]: """List evaluation runs. 
        :return: An iterator like instance of Evaluation
@@ -749,7 +749,7 @@ def __init__(self, *args, **kwargs) -> None:
         self._deserialize: Deserializer = input_args.pop(0) if input_args else kwargs.pop("deserializer")

     @distributed_trace
-    def list_versions(self, name: str, **kwargs: Any) -> AsyncIterable["_models.DatasetVersion"]:
+    def list_versions(self, name: str, **kwargs: Any) -> AsyncItemPaged["_models.DatasetVersion"]:
         """List all versions of the given DatasetVersion.

         :param name: The name of the resource. Required.
@@ -834,7 +834,7 @@ async def get_next(next_link=None):
         return AsyncItemPaged(get_next, extract_data)

     @distributed_trace
-    def list(self, **kwargs: Any) -> AsyncIterable["_models.DatasetVersion"]:
+    def list(self, **kwargs: Any) -> AsyncItemPaged["_models.DatasetVersion"]:
         """List the latest version of each DatasetVersion.

         :return: An iterator like instance of DatasetVersion
@@ -1022,7 +1022,7 @@ async def delete(self, name: str, version: str, **kwargs: Any) -> None:
         response = pipeline_response.http_response

-        if response.status_code not in [204]:
+        if response.status_code not in [204, 200]:
             map_error(status_code=response.status_code, response=response, error_map=error_map)
             raise HttpResponseError(response=response)

@@ -1034,7 +1034,7 @@ async def create_or_update(
         self,
         name: str,
         version: str,
-        body: _models.DatasetVersion,
+        dataset_version: _models.DatasetVersion,
         *,
         content_type: str = "application/merge-patch+json",
         **kwargs: Any
@@ -1043,10 +1043,10 @@ async def create_or_update(
         :param name: The name of the resource. Required.
         :type name: str
-        :param version: The specific version id of the DatasetVersion to create or replace. Required.
+        :param version: The specific version id of the DatasetVersion to create or update. Required.
         :type version: str
-        :param body: The definition of the DatasetVersion to create or update. Required.
-        :type body: ~azure.ai.projects.models.DatasetVersion
+        :param dataset_version: The DatasetVersion to create or update. Required.
+        :type dataset_version: ~azure.ai.projects.models.DatasetVersion
         :keyword content_type: Body Parameter content-type. Content type parameter for JSON body.
         Default value is "application/merge-patch+json".
         :paramtype content_type: str
@@ -1057,16 +1057,22 @@ async def create_or_update(
     @overload
     async def create_or_update(
-        self, name: str, version: str, body: JSON, *, content_type: str = "application/merge-patch+json", **kwargs: Any
+        self,
+        name: str,
+        version: str,
+        dataset_version: JSON,
+        *,
+        content_type: str = "application/merge-patch+json",
+        **kwargs: Any
     ) -> _models.DatasetVersion:
         """Create a new or update an existing DatasetVersion with the given version id.

         :param name: The name of the resource. Required.
         :type name: str
-        :param version: The specific version id of the DatasetVersion to create or replace. Required.
+        :param version: The specific version id of the DatasetVersion to create or update. Required.
         :type version: str
-        :param body: The definition of the DatasetVersion to create or update. Required.
-        :type body: JSON
+        :param dataset_version: The DatasetVersion to create or update. Required.
+        :type dataset_version: JSON
         :keyword content_type: Body Parameter content-type. Content type parameter for JSON body.
         Default value is "application/merge-patch+json".
         :paramtype content_type: str
@@ -1080,7 +1086,7 @@ async def create_or_update(
         self,
         name: str,
         version: str,
-        body: IO[bytes],
+        dataset_version: IO[bytes],
         *,
         content_type: str = "application/merge-patch+json",
         **kwargs: Any
@@ -1089,10 +1095,10 @@ async def create_or_update(
         :param name: The name of the resource. Required.
         :type name: str
-        :param version: The specific version id of the DatasetVersion to create or replace. Required.
+        :param version: The specific version id of the DatasetVersion to create or update. Required.
         :type version: str
-        :param body: The definition of the DatasetVersion to create or update. Required.
-        :type body: IO[bytes]
+        :param dataset_version: The DatasetVersion to create or update. Required.
+        :type dataset_version: IO[bytes]
         :keyword content_type: Body Parameter content-type. Content type parameter for binary body.
         Default value is "application/merge-patch+json".
         :paramtype content_type: str
@@ -1103,17 +1109,17 @@ async def create_or_update(
     @distributed_trace_async
     async def create_or_update(
-        self, name: str, version: str, body: Union[_models.DatasetVersion, JSON, IO[bytes]], **kwargs: Any
+        self, name: str, version: str, dataset_version: Union[_models.DatasetVersion, JSON, IO[bytes]], **kwargs: Any
     ) -> _models.DatasetVersion:
         """Create a new or update an existing DatasetVersion with the given version id.

         :param name: The name of the resource. Required.
         :type name: str
-        :param version: The specific version id of the DatasetVersion to create or replace. Required.
+        :param version: The specific version id of the DatasetVersion to create or update. Required.
         :type version: str
-        :param body: The definition of the DatasetVersion to create or update. Is one of the following
-        types: DatasetVersion, JSON, IO[bytes] Required.
-        :type body: ~azure.ai.projects.models.DatasetVersion or JSON or IO[bytes]
+        :param dataset_version: The DatasetVersion to create or update. Is one of the following types:
+        DatasetVersion, JSON, IO[bytes] Required.
+        :type dataset_version: ~azure.ai.projects.models.DatasetVersion or JSON or IO[bytes]
         :return: DatasetVersion. The DatasetVersion is compatible with MutableMapping
         :rtype: ~azure.ai.projects.models.DatasetVersion
         :raises ~azure.core.exceptions.HttpResponseError:
@@ -1134,10 +1140,10 @@ async def create_or_update(
         content_type = content_type or "application/merge-patch+json"
         _content = None
-        if isinstance(body, (IOBase, bytes)):
-            _content = body
+        if isinstance(dataset_version, (IOBase, bytes)):
+            _content = dataset_version
         else:
-            _content = json.dumps(body, cls=SdkJSONEncoder, exclude_readonly=True)  # type: ignore
+            _content = json.dumps(dataset_version, cls=SdkJSONEncoder, exclude_readonly=True)  # type: ignore

         _request = build_datasets_create_or_update_request(
             name=name,
@@ -1184,7 +1190,7 @@ async def pending_upload(
         self,
         name: str,
         version: str,
-        body: _models.PendingUploadRequest,
+        pending_upload_request: _models.PendingUploadRequest,
         *,
         content_type: str = "application/json",
         **kwargs: Any
@@ -1195,8 +1201,8 @@ async def pending_upload(
         :type name: str
         :param version: The specific version id of the DatasetVersion to operate on. Required.
         :type version: str
-        :param body: Parameters for the action. Required.
-        :type body: ~azure.ai.projects.models.PendingUploadRequest
+        :param pending_upload_request: The pending upload request parameters. Required.
+        :type pending_upload_request: ~azure.ai.projects.models.PendingUploadRequest
         :keyword content_type: Body Parameter content-type. Content type parameter for JSON body.
         Default value is "application/json".
         :paramtype content_type: str
@@ -1207,7 +1213,13 @@ async def pending_upload(
     @overload
     async def pending_upload(
-        self, name: str, version: str, body: JSON, *, content_type: str = "application/json", **kwargs: Any
+        self,
+        name: str,
+        version: str,
+        pending_upload_request: JSON,
+        *,
+        content_type: str = "application/json",
+        **kwargs: Any
     ) -> _models.PendingUploadResponse:
         """Start a new or get an existing pending upload of a dataset for a specific version.
@@ -1215,8 +1227,8 @@ async def pending_upload(
         :type name: str
         :param version: The specific version id of the DatasetVersion to operate on. Required.
         :type version: str
-        :param body: Parameters for the action. Required.
-        :type body: JSON
+        :param pending_upload_request: The pending upload request parameters. Required.
+        :type pending_upload_request: JSON
         :keyword content_type: Body Parameter content-type. Content type parameter for JSON body.
         Default value is "application/json".
         :paramtype content_type: str
@@ -1227,7 +1239,13 @@ async def pending_upload(
     @overload
     async def pending_upload(
-        self, name: str, version: str, body: IO[bytes], *, content_type: str = "application/json", **kwargs: Any
+        self,
+        name: str,
+        version: str,
+        pending_upload_request: IO[bytes],
+        *,
+        content_type: str = "application/json",
+        **kwargs: Any
     ) -> _models.PendingUploadResponse:
         """Start a new or get an existing pending upload of a dataset for a specific version.
@@ -1235,8 +1253,8 @@ async def pending_upload(
         :type name: str
         :param version: The specific version id of the DatasetVersion to operate on. Required.
         :type version: str
-        :param body: Parameters for the action. Required.
-        :type body: IO[bytes]
+        :param pending_upload_request: The pending upload request parameters. Required.
+        :type pending_upload_request: IO[bytes]
         :keyword content_type: Body Parameter content-type. Content type parameter for binary body.
         Default value is "application/json".
         :paramtype content_type: str
@@ -1247,7 +1265,11 @@ async def pending_upload(
     @distributed_trace_async
     async def pending_upload(
-        self, name: str, version: str, body: Union[_models.PendingUploadRequest, JSON, IO[bytes]], **kwargs: Any
+        self,
+        name: str,
+        version: str,
+        pending_upload_request: Union[_models.PendingUploadRequest, JSON, IO[bytes]],
+        **kwargs: Any
     ) -> _models.PendingUploadResponse:
         """Start a new or get an existing pending upload of a dataset for a specific version.
@@ -1255,9 +1277,10 @@ async def pending_upload(
         :type name: str
         :param version: The specific version id of the DatasetVersion to operate on. Required.
         :type version: str
-        :param body: Parameters for the action. Is one of the following types: PendingUploadRequest,
-        JSON, IO[bytes] Required.
-        :type body: ~azure.ai.projects.models.PendingUploadRequest or JSON or IO[bytes]
+        :param pending_upload_request: The pending upload request parameters. Is one of the following
+        types: PendingUploadRequest, JSON, IO[bytes] Required.
+        :type pending_upload_request: ~azure.ai.projects.models.PendingUploadRequest or JSON or
+        IO[bytes]
         :return: PendingUploadResponse. The PendingUploadResponse is compatible with MutableMapping
         :rtype: ~azure.ai.projects.models.PendingUploadResponse
         :raises ~azure.core.exceptions.HttpResponseError:
@@ -1278,10 +1301,10 @@ async def pending_upload(
         content_type = content_type or "application/json"
         _content = None
-        if isinstance(body, (IOBase, bytes)):
-            _content = body
+        if isinstance(pending_upload_request, (IOBase, bytes)):
+            _content = pending_upload_request
         else:
-            _content = json.dumps(body, cls=SdkJSONEncoder, exclude_readonly=True)  # type: ignore
+            _content = json.dumps(pending_upload_request, cls=SdkJSONEncoder, exclude_readonly=True)  # type: ignore

         _request = build_datasets_pending_upload_request(
             name=name,
@@ -1405,7 +1428,7 @@ def __init__(self, *args, **kwargs) -> None:
         self._deserialize: Deserializer = input_args.pop(0) if input_args else kwargs.pop("deserializer")

     @distributed_trace
-    def list_versions(self, name: str, **kwargs: Any) -> AsyncIterable["_models.Index"]:
+    def list_versions(self, name: str, **kwargs: Any) -> AsyncItemPaged["_models.Index"]:
         """List all versions of the given Index.

         :param name: The name of the resource. Required.
@@ -1490,7 +1513,7 @@ async def get_next(next_link=None):
         return AsyncItemPaged(get_next, extract_data)

     @distributed_trace
-    def list(self, **kwargs: Any) -> AsyncIterable["_models.Index"]:
+    def list(self, **kwargs: Any) -> AsyncItemPaged["_models.Index"]:
         """List the latest version of each Index.

         :return: An iterator like instance of Index
@@ -1690,7 +1713,7 @@ async def create_or_update(
         self,
         name: str,
         version: str,
-        body: _models.Index,
+        index: _models.Index,
         *,
         content_type: str = "application/merge-patch+json",
         **kwargs: Any
@@ -1699,10 +1722,10 @@ async def create_or_update(
         :param name: The name of the resource. Required.
         :type name: str
-        :param version: The specific version id of the Index to create or replace. Required.
+        :param version: The specific version id of the Index to create or update. Required.
         :type version: str
-        :param body: The definition of the Index to create or update. Required.
-        :type body: ~azure.ai.projects.models.Index
+        :param index: The Index to create or update. Required.
+        :type index: ~azure.ai.projects.models.Index
         :keyword content_type: Body Parameter content-type. Content type parameter for JSON body.
         Default value is "application/merge-patch+json".
         :paramtype content_type: str
@@ -1713,16 +1736,16 @@ async def create_or_update(
     @overload
     async def create_or_update(
-        self, name: str, version: str, body: JSON, *, content_type: str = "application/merge-patch+json", **kwargs: Any
+        self, name: str, version: str, index: JSON, *, content_type: str = "application/merge-patch+json", **kwargs: Any
     ) -> _models.Index:
         """Create a new or update an existing Index with the given version id.

         :param name: The name of the resource. Required.
         :type name: str
-        :param version: The specific version id of the Index to create or replace. Required.
+        :param version: The specific version id of the Index to create or update. Required.
         :type version: str
-        :param body: The definition of the Index to create or update. Required.
-        :type body: JSON
+        :param index: The Index to create or update. Required.
+        :type index: JSON
         :keyword content_type: Body Parameter content-type. Content type parameter for JSON body.
         Default value is "application/merge-patch+json".
         :paramtype content_type: str
@@ -1736,7 +1759,7 @@ async def create_or_update(
         self,
         name: str,
         version: str,
-        body: IO[bytes],
+        index: IO[bytes],
         *,
         content_type: str = "application/merge-patch+json",
         **kwargs: Any
@@ -1745,10 +1768,10 @@ async def create_or_update(
         :param name: The name of the resource. Required.
         :type name: str
-        :param version: The specific version id of the Index to create or replace. Required.
+        :param version: The specific version id of the Index to create or update. Required.
         :type version: str
-        :param body: The definition of the Index to create or update. Required.
-        :type body: IO[bytes]
+        :param index: The Index to create or update. Required.
+        :type index: IO[bytes]
         :keyword content_type: Body Parameter content-type. Content type parameter for binary body.
         Default value is "application/merge-patch+json".
         :paramtype content_type: str
@@ -1759,17 +1782,17 @@ async def create_or_update(
     @distributed_trace_async
     async def create_or_update(
-        self, name: str, version: str, body: Union[_models.Index, JSON, IO[bytes]], **kwargs: Any
+        self, name: str, version: str, index: Union[_models.Index, JSON, IO[bytes]], **kwargs: Any
     ) -> _models.Index:
         """Create a new or update an existing Index with the given version id.

         :param name: The name of the resource. Required.
         :type name: str
-        :param version: The specific version id of the Index to create or replace. Required.
+        :param version: The specific version id of the Index to create or update. Required.
         :type version: str
-        :param body: The definition of the Index to create or update. Is one of the following types:
-        Index, JSON, IO[bytes] Required.
-        :type body: ~azure.ai.projects.models.Index or JSON or IO[bytes]
+        :param index: The Index to create or update. Is one of the following types: Index, JSON,
+        IO[bytes] Required.
+        :type index: ~azure.ai.projects.models.Index or JSON or IO[bytes]
         :return: Index. The Index is compatible with MutableMapping
         :rtype: ~azure.ai.projects.models.Index
         :raises ~azure.core.exceptions.HttpResponseError:
@@ -1790,10 +1813,10 @@ async def create_or_update(
         content_type = content_type or "application/merge-patch+json"
         _content = None
-        if isinstance(body, (IOBase, bytes)):
-            _content = body
+        if isinstance(index, (IOBase, bytes)):
+            _content = index
         else:
-            _content = json.dumps(body, cls=SdkJSONEncoder, exclude_readonly=True)  # type: ignore
+            _content = json.dumps(index, cls=SdkJSONEncoder, exclude_readonly=True)  # type: ignore

         _request = build_indexes_create_or_update_request(
             name=name,
@@ -1926,7 +1949,7 @@ def list(
         model_name: Optional[str] = None,
         deployment_type: Optional[Union[str, _models.DeploymentType]] = None,
         **kwargs: Any
-    ) -> AsyncIterable["_models.Deployment"]:
+    ) -> AsyncItemPaged["_models.Deployment"]:
         """List all deployed models in the project.

         :keyword model_publisher: Model publisher to filter models by. Default value is None.
@@ -2110,7 +2133,7 @@ async def get(self, name: str, **kwargs: Any) -> _models.RedTeam:
         method_added_on="2025-05-15-preview",
         params_added_on={"2025-05-15-preview": ["api_version", "client_request_id", "accept"]},
     )
-    def list(self, **kwargs: Any) -> AsyncIterable["_models.RedTeam"]:
+    def list(self, **kwargs: Any) -> AsyncItemPaged["_models.RedTeam"]:
        """List a redteam by name.
        :return: An iterator like instance of RedTeam
diff --git a/sdk/ai/azure-ai-projects/azure/ai/projects/aio/operations/_patch_datasets_async.py b/sdk/ai/azure-ai-projects/azure/ai/projects/aio/operations/_patch_datasets_async.py
index 1e0950397cb1..ea0d2eb1387b 100644
--- a/sdk/ai/azure-ai-projects/azure/ai/projects/aio/operations/_patch_datasets_async.py
+++ b/sdk/ai/azure-ai-projects/azure/ai/projects/aio/operations/_patch_datasets_async.py
@@ -51,7 +51,7 @@ async def _create_dataset_and_get_its_container_client(
         pending_upload_response: PendingUploadResponse = await self.pending_upload(
             name=name,
             version=input_version,
-            body=PendingUploadRequest(
+            pending_upload_request=PendingUploadRequest(
                 pending_upload_type=PendingUploadType.BLOB_REFERENCE,
                 connection_name=connection_name,
             ),
@@ -133,7 +133,7 @@ async def upload_file(
         dataset_version = await self.create_or_update(
             name=name,
             version=output_version,
-            body=FileDatasetVersion(
+            dataset_version=FileDatasetVersion(
                 # See https://learn.microsoft.com/python/api/azure-storage-blob/azure.storage.blob.blobclient?view=azure-python#azure-storage-blob-blobclient-url
                 # Per above doc the ".url" contains SAS token... should this be stripped away?
                 data_uri=data_uri,
@@ -217,7 +217,7 @@ async def upload_folder(
         dataset_version = await self.create_or_update(
             name=name,
             version=output_version,
-            body=FolderDatasetVersion(data_uri=data_uri),
+            dataset_version=FolderDatasetVersion(data_uri=data_uri),
         )

         return dataset_version
diff --git a/sdk/ai/azure-ai-projects/azure/ai/projects/aio/operations/_patch_inference_async.py b/sdk/ai/azure-ai-projects/azure/ai/projects/aio/operations/_patch_inference_async.py
index 22f3301844e6..43425f1f120d 100644
--- a/sdk/ai/azure-ai-projects/azure/ai/projects/aio/operations/_patch_inference_async.py
+++ b/sdk/ai/azure-ai-projects/azure/ai/projects/aio/operations/_patch_inference_async.py
@@ -8,16 +8,14 @@ Follow our quickstart for examples: https://aka.ms/azsdk/python/dpcodegen/python/customize
 """
 import logging
-from typing import Optional, TYPE_CHECKING, Any
-from urllib.parse import urlparse
+from typing import Optional, TYPE_CHECKING
 from azure.core.tracing.decorator_async import distributed_trace_async
-from azure.core.tracing.decorator import distributed_trace
-
 from ...models._models import (
     ApiKeyCredentials,
     EntraIDCredentials,
 )
 from ...models._enums import ConnectionType
+from ...operations._patch_inference import _get_aoai_inference_url

 if TYPE_CHECKING:
     # pylint: disable=unused-import,ungrouped-imports
@@ -40,165 +38,6 @@ class InferenceOperations:
     def __init__(self, outer_instance: "azure.ai.projects.aio.AIProjectClient") -> None:  # type: ignore[name-defined]
         self._outer_instance = outer_instance

-    # TODO: Use a common method for both the sync and async operations
-    @classmethod
-    def _get_inference_url(cls, input_url: str) -> str:
-        """
-        Converts an input URL in the format:
-        https:///
-        to:
-        https:///models
-
-        :param input_url: The input endpoint URL used to construct AIProjectClient.
-        :type input_url: str
-
-        :return: The endpoint URL required to construct inference clients from the azure-ai-inference package.
-        :rtype: str
-        """
-        parsed = urlparse(input_url)
-        if parsed.scheme != "https" or not parsed.netloc:
-            raise ValueError("Invalid endpoint URL format. Must be an https URL with a host.")
-        new_url = f"https://{parsed.netloc}/models"
-        return new_url
-
-    # TODO: Use a common method for both the sync and async operations
-    @classmethod
-    def _get_aoai_inference_url(cls, input_url: str) -> str:
-        """
-        Converts an input URL in the format:
-        https:///
-        to:
-        https://
-
-        :param input_url: The input endpoint URL used to construct AIProjectClient.
-        :type input_url: str
-
-        :return: The endpoint URL required to construct an AzureOpenAI client from the `openai` package.
-        :rtype: str
-        """
-        parsed = urlparse(input_url)
-        if parsed.scheme != "https" or not parsed.netloc:
-            raise ValueError("Invalid endpoint URL format. Must be an https URL with a host.")
-        new_url = f"https://{parsed.netloc}"
-        return new_url
-
-    @distributed_trace
-    def get_chat_completions_client(self, **kwargs: Any) -> "ChatCompletionsClient":  # type: ignore[name-defined]
-        """Get an authenticated asynchronous ChatCompletionsClient (from the package azure-ai-inference) to use with
-        AI models deployed to your AI Foundry Project. Keyword arguments are passed to the constructor of
-        ChatCompletionsClient.
-
-        At least one AI model that supports chat completions must be deployed.
-
-        .. note:: The package `azure-ai-inference` and `aiohttp` must be installed prior to calling this method.
-
-        :return: An authenticated chat completions client.
-        :rtype: ~azure.ai.inference.aio.ChatCompletionsClient
-
-        :raises ~azure.core.exceptions.ModuleNotFoundError: if the `azure-ai-inference` package
-        is not installed.
-        :raises ~azure.core.exceptions.HttpResponseError:
-        """
-
-        try:
-            from azure.ai.inference.aio import ChatCompletionsClient
-        except ModuleNotFoundError as e:
-            raise ModuleNotFoundError(
-                "Azure AI Inference SDK is not installed. Please install it using 'pip install azure-ai-inference'"
-            ) from e
-
-        endpoint = self._get_inference_url(self._outer_instance._config.endpoint)  # pylint: disable=protected-access
-
-        client = ChatCompletionsClient(
-            endpoint=endpoint,
-            credential=self._outer_instance._config.credential,  # pylint: disable=protected-access
-            credential_scopes=self._outer_instance._config.credential_scopes,  # pylint: disable=protected-access,
-            user_agent=kwargs.pop(
-                "user_agent", self._outer_instance._patched_user_agent  # pylint: disable=protected-access
-            ),
-            **kwargs,
-        )
-
-        return client
-
-    @distributed_trace
-    def get_embeddings_client(self, **kwargs: Any) -> "EmbeddingsClient":  # type: ignore[name-defined]
-        """Get an authenticated asynchronous EmbeddingsClient (from the package azure-ai-inference) to use with
-        AI models deployed to your AI Foundry Project. Keyword arguments are passed to the constructor of
-        ChatCompletionsClient.
-
-        At least one AI model that supports text embeddings must be deployed.
-
-        .. note:: The package `azure-ai-inference` and `aiohttp` must be installed prior to calling this method.
-
-        :return: An authenticated Embeddings client.
-        :rtype: ~azure.ai.inference.aio.EmbeddingsClient
-
-        :raises ~azure.core.exceptions.ModuleNotFoundError: if the `azure-ai-inference` package
-        is not installed.
-        :raises ~azure.core.exceptions.HttpResponseError:
-        """
-
-        try:
-            from azure.ai.inference.aio import EmbeddingsClient
-        except ModuleNotFoundError as e:
-            raise ModuleNotFoundError(
-                "Azure AI Inference SDK is not installed. Please install it using 'pip install azure-ai-inference'"
-            ) from e
-
-        endpoint = self._get_inference_url(self._outer_instance._config.endpoint)  # pylint: disable=protected-access
-
-        client = EmbeddingsClient(
-            endpoint=endpoint,
-            credential=self._outer_instance._config.credential,  # pylint: disable=protected-access
-            credential_scopes=self._outer_instance._config.credential_scopes,  # pylint: disable=protected-access,
-            user_agent=kwargs.pop(
-                "user_agent", self._outer_instance._patched_user_agent  # pylint: disable=protected-access
-            ),
-            **kwargs,
-        )
-
-        return client
-
-    @distributed_trace
-    def get_image_embeddings_client(self, **kwargs: Any) -> "ImageEmbeddingsClient":  # type: ignore[name-defined]
-        """Get an authenticated asynchronous ImageEmbeddingsClient (from the package azure-ai-inference) to use with
-        AI models deployed to your AI Foundry Project. Keyword arguments are passed to the constructor of
-        ChatCompletionsClient.
-
-        At least one AI model that supports image embeddings must be deployed.
-
-        .. note:: The package `azure-ai-inference` and `aiohttp` must be installed prior to calling this method.
-
-        :return: An authenticated Image Embeddings client.
-        :rtype: ~azure.ai.inference.aio.ImageEmbeddingsClient
-
-        :raises ~azure.core.exceptions.ModuleNotFoundError: if the `azure-ai-inference` package
-        is not installed.
-        :raises ~azure.core.exceptions.HttpResponseError:
-        """
-
-        try:
-            from azure.ai.inference.aio import ImageEmbeddingsClient
-        except ModuleNotFoundError as e:
-            raise ModuleNotFoundError(
                "Azure AI Inference SDK is not installed. Please install it using 'pip install azure-ai-inference'"
-            ) from e
-
-        endpoint = self._get_inference_url(self._outer_instance._config.endpoint)  # pylint: disable=protected-access
-
-        client = ImageEmbeddingsClient(
-            endpoint=endpoint,
-            credential=self._outer_instance._config.credential,  # pylint: disable=protected-access
-            credential_scopes=self._outer_instance._config.credential_scopes,  # pylint: disable=protected-access,
-            user_agent=kwargs.pop(
-                "user_agent", self._outer_instance._patched_user_agent  # pylint: disable=protected-access
-            ),
-            **kwargs,
-        )
-
-        return client
-
     @distributed_trace_async
     async def get_azure_openai_client(
         self, *, api_version: Optional[str] = None, connection_name: Optional[str] = None, **kwargs
@@ -300,7 +139,7 @@ async def get_azure_openai_client(
                 "azure.identity package not installed. Please install it using 'pip install azure.identity'"
             ) from e

-        azure_endpoint = self._get_aoai_inference_url(
+        azure_endpoint = _get_aoai_inference_url(
             self._outer_instance._config.endpoint  # pylint: disable=protected-access
         )
diff --git a/sdk/ai/azure-ai-projects/azure/ai/projects/aio/operations/_patch_telemetry_async.py b/sdk/ai/azure-ai-projects/azure/ai/projects/aio/operations/_patch_telemetry_async.py
index 70637c9eb4de..9cf626627262 100644
--- a/sdk/ai/azure-ai-projects/azure/ai/projects/aio/operations/_patch_telemetry_async.py
+++ b/sdk/ai/azure-ai-projects/azure/ai/projects/aio/operations/_patch_telemetry_async.py
@@ -33,6 +33,7 @@ class TelemetryOperations:
     def __init__(self, outer_instance: "azure.ai.projects.aio.AIProjectClient") -> None:  # type: ignore[name-defined]
         self._outer_instance = outer_instance

+    # TODO: Give a more detailed name to this method.
     @distributed_trace_async
     async def get_connection_string(self) -> str:
         """Get the Application Insights connection string associated with the Project's Application Insights resource.
@@ -44,6 +45,7 @@ async def get_connection_string(self) -> str:
         """
         if not self._connection_string:
+            # TODO
             # TODO: Two REST APIs calls can be replaced by one if we have had REST API for get_with_credentials(connection_type=ConnectionType.APPLICATION_INSIGHTS)
             # Returns an empty Iterable if no connections exist.
             connections: AsyncIterable[Connection] = self._outer_instance.connections.list(
diff --git a/sdk/ai/azure-ai-projects/azure/ai/projects/operations/_operations.py b/sdk/ai/azure-ai-projects/azure/ai/projects/operations/_operations.py
index de607445c337..0992676f7796 100644
--- a/sdk/ai/azure-ai-projects/azure/ai/projects/operations/_operations.py
+++ b/sdk/ai/azure-ai-projects/azure/ai/projects/operations/_operations.py
@@ -1,4 +1,4 @@
-# pylint: disable=too-many-lines
+# pylint: disable=line-too-long,useless-suppression,too-many-lines
 # coding=utf-8
 # --------------------------------------------------------------------------
 # Copyright (c) Microsoft Corporation. All rights reserved.
@@ -9,7 +9,7 @@
 from collections.abc import MutableMapping
 from io import IOBase
 import json
-from typing import Any, Callable, Dict, IO, Iterable, List, Optional, TypeVar, Union, overload
+from typing import Any, Callable, Dict, IO, List, Optional, TypeVar, Union, overload
 import urllib.parse

 from azure.core import PipelineClient
@@ -633,7 +633,7 @@ class ConnectionsOperations:
     :attr:`connections` attribute.
""" - def __init__(self, *args, **kwargs): + def __init__(self, *args, **kwargs) -> None: input_args = list(args) self._client: PipelineClient = input_args.pop(0) if input_args else kwargs.pop("client") self._config: AIProjectClientConfiguration = input_args.pop(0) if input_args else kwargs.pop("config") @@ -777,7 +777,7 @@ def list( connection_type: Optional[Union[str, _models.ConnectionType]] = None, default_connection: Optional[bool] = None, **kwargs: Any - ) -> Iterable["_models.Connection"]: + ) -> ItemPaged["_models.Connection"]: """List all connections in the project, without populating connection credentials. :keyword connection_type: List connections of this specific type. Known values are: @@ -878,7 +878,7 @@ class EvaluationsOperations: :attr:`evaluations` attribute. """ - def __init__(self, *args, **kwargs): + def __init__(self, *args, **kwargs) -> None: input_args = list(args) self._client: PipelineClient = input_args.pop(0) if input_args else kwargs.pop("client") self._config: AIProjectClientConfiguration = input_args.pop(0) if input_args else kwargs.pop("config") @@ -959,7 +959,7 @@ def get(self, name: str, **kwargs: Any) -> _models.Evaluation: method_added_on="2025-05-15-preview", params_added_on={"2025-05-15-preview": ["api_version", "client_request_id", "accept"]}, ) - def list(self, **kwargs: Any) -> Iterable["_models.Evaluation"]: + def list(self, **kwargs: Any) -> ItemPaged["_models.Evaluation"]: """List evaluation runs. :return: An iterator like instance of Evaluation @@ -1295,7 +1295,7 @@ class DatasetsOperations: :attr:`datasets` attribute. 
""" - def __init__(self, *args, **kwargs): + def __init__(self, *args, **kwargs) -> None: input_args = list(args) self._client: PipelineClient = input_args.pop(0) if input_args else kwargs.pop("client") self._config: AIProjectClientConfiguration = input_args.pop(0) if input_args else kwargs.pop("config") @@ -1303,7 +1303,7 @@ def __init__(self, *args, **kwargs): self._deserialize: Deserializer = input_args.pop(0) if input_args else kwargs.pop("deserializer") @distributed_trace - def list_versions(self, name: str, **kwargs: Any) -> Iterable["_models.DatasetVersion"]: + def list_versions(self, name: str, **kwargs: Any) -> ItemPaged["_models.DatasetVersion"]: """List all versions of the given DatasetVersion. :param name: The name of the resource. Required. @@ -1388,7 +1388,7 @@ def get_next(next_link=None): return ItemPaged(get_next, extract_data) @distributed_trace - def list(self, **kwargs: Any) -> Iterable["_models.DatasetVersion"]: + def list(self, **kwargs: Any) -> ItemPaged["_models.DatasetVersion"]: """List the latest version of each DatasetVersion. :return: An iterator like instance of DatasetVersion @@ -1576,7 +1576,7 @@ def delete(self, name: str, version: str, **kwargs: Any) -> None: # pylint: dis response = pipeline_response.http_response - if response.status_code not in [204]: + if response.status_code not in [204, 200]: map_error(status_code=response.status_code, response=response, error_map=error_map) raise HttpResponseError(response=response) @@ -1588,7 +1588,7 @@ def create_or_update( self, name: str, version: str, - body: _models.DatasetVersion, + dataset_version: _models.DatasetVersion, *, content_type: str = "application/merge-patch+json", **kwargs: Any @@ -1597,10 +1597,10 @@ def create_or_update( :param name: The name of the resource. Required. :type name: str - :param version: The specific version id of the DatasetVersion to create or replace. Required. + :param version: The specific version id of the DatasetVersion to create or update. 
Required. :type version: str - :param body: The definition of the DatasetVersion to create or update. Required. - :type body: ~azure.ai.projects.models.DatasetVersion + :param dataset_version: The DatasetVersion to create or update. Required. + :type dataset_version: ~azure.ai.projects.models.DatasetVersion :keyword content_type: Body Parameter content-type. Content type parameter for JSON body. Default value is "application/merge-patch+json". :paramtype content_type: str @@ -1611,16 +1611,22 @@ def create_or_update( @overload def create_or_update( - self, name: str, version: str, body: JSON, *, content_type: str = "application/merge-patch+json", **kwargs: Any + self, + name: str, + version: str, + dataset_version: JSON, + *, + content_type: str = "application/merge-patch+json", + **kwargs: Any ) -> _models.DatasetVersion: """Create a new or update an existing DatasetVersion with the given version id. :param name: The name of the resource. Required. :type name: str - :param version: The specific version id of the DatasetVersion to create or replace. Required. + :param version: The specific version id of the DatasetVersion to create or update. Required. :type version: str - :param body: The definition of the DatasetVersion to create or update. Required. - :type body: JSON + :param dataset_version: The DatasetVersion to create or update. Required. + :type dataset_version: JSON :keyword content_type: Body Parameter content-type. Content type parameter for JSON body. Default value is "application/merge-patch+json". :paramtype content_type: str @@ -1634,7 +1640,7 @@ def create_or_update( self, name: str, version: str, - body: IO[bytes], + dataset_version: IO[bytes], *, content_type: str = "application/merge-patch+json", **kwargs: Any @@ -1643,10 +1649,10 @@ def create_or_update( :param name: The name of the resource. Required. :type name: str - :param version: The specific version id of the DatasetVersion to create or replace. Required. 
+ :param version: The specific version id of the DatasetVersion to create or update. Required. :type version: str - :param body: The definition of the DatasetVersion to create or update. Required. - :type body: IO[bytes] + :param dataset_version: The DatasetVersion to create or update. Required. + :type dataset_version: IO[bytes] :keyword content_type: Body Parameter content-type. Content type parameter for binary body. Default value is "application/merge-patch+json". :paramtype content_type: str @@ -1657,17 +1663,17 @@ def create_or_update( @distributed_trace def create_or_update( - self, name: str, version: str, body: Union[_models.DatasetVersion, JSON, IO[bytes]], **kwargs: Any + self, name: str, version: str, dataset_version: Union[_models.DatasetVersion, JSON, IO[bytes]], **kwargs: Any ) -> _models.DatasetVersion: """Create a new or update an existing DatasetVersion with the given version id. :param name: The name of the resource. Required. :type name: str - :param version: The specific version id of the DatasetVersion to create or replace. Required. + :param version: The specific version id of the DatasetVersion to create or update. Required. :type version: str - :param body: The definition of the DatasetVersion to create or update. Is one of the following - types: DatasetVersion, JSON, IO[bytes] Required. - :type body: ~azure.ai.projects.models.DatasetVersion or JSON or IO[bytes] + :param dataset_version: The DatasetVersion to create or update. Is one of the following types: + DatasetVersion, JSON, IO[bytes] Required. + :type dataset_version: ~azure.ai.projects.models.DatasetVersion or JSON or IO[bytes] :return: DatasetVersion. 
The DatasetVersion is compatible with MutableMapping :rtype: ~azure.ai.projects.models.DatasetVersion :raises ~azure.core.exceptions.HttpResponseError: @@ -1688,10 +1694,10 @@ def create_or_update( content_type = content_type or "application/merge-patch+json" _content = None - if isinstance(body, (IOBase, bytes)): - _content = body + if isinstance(dataset_version, (IOBase, bytes)): + _content = dataset_version else: - _content = json.dumps(body, cls=SdkJSONEncoder, exclude_readonly=True) # type: ignore + _content = json.dumps(dataset_version, cls=SdkJSONEncoder, exclude_readonly=True) # type: ignore _request = build_datasets_create_or_update_request( name=name, @@ -1738,7 +1744,7 @@ def pending_upload( self, name: str, version: str, - body: _models.PendingUploadRequest, + pending_upload_request: _models.PendingUploadRequest, *, content_type: str = "application/json", **kwargs: Any @@ -1749,8 +1755,8 @@ def pending_upload( :type name: str :param version: The specific version id of the DatasetVersion to operate on. Required. :type version: str - :param body: Parameters for the action. Required. - :type body: ~azure.ai.projects.models.PendingUploadRequest + :param pending_upload_request: The pending upload request parameters. Required. + :type pending_upload_request: ~azure.ai.projects.models.PendingUploadRequest :keyword content_type: Body Parameter content-type. Content type parameter for JSON body. Default value is "application/json". :paramtype content_type: str @@ -1761,7 +1767,13 @@ def pending_upload( @overload def pending_upload( - self, name: str, version: str, body: JSON, *, content_type: str = "application/json", **kwargs: Any + self, + name: str, + version: str, + pending_upload_request: JSON, + *, + content_type: str = "application/json", + **kwargs: Any ) -> _models.PendingUploadResponse: """Start a new or get an existing pending upload of a dataset for a specific version. 
@@ -1769,8 +1781,8 @@ def pending_upload( :type name: str :param version: The specific version id of the DatasetVersion to operate on. Required. :type version: str - :param body: Parameters for the action. Required. - :type body: JSON + :param pending_upload_request: The pending upload request parameters. Required. + :type pending_upload_request: JSON :keyword content_type: Body Parameter content-type. Content type parameter for JSON body. Default value is "application/json". :paramtype content_type: str @@ -1781,7 +1793,13 @@ def pending_upload( @overload def pending_upload( - self, name: str, version: str, body: IO[bytes], *, content_type: str = "application/json", **kwargs: Any + self, + name: str, + version: str, + pending_upload_request: IO[bytes], + *, + content_type: str = "application/json", + **kwargs: Any ) -> _models.PendingUploadResponse: """Start a new or get an existing pending upload of a dataset for a specific version. @@ -1789,8 +1807,8 @@ def pending_upload( :type name: str :param version: The specific version id of the DatasetVersion to operate on. Required. :type version: str - :param body: Parameters for the action. Required. - :type body: IO[bytes] + :param pending_upload_request: The pending upload request parameters. Required. + :type pending_upload_request: IO[bytes] :keyword content_type: Body Parameter content-type. Content type parameter for binary body. Default value is "application/json". :paramtype content_type: str @@ -1801,7 +1819,11 @@ def pending_upload( @distributed_trace def pending_upload( - self, name: str, version: str, body: Union[_models.PendingUploadRequest, JSON, IO[bytes]], **kwargs: Any + self, + name: str, + version: str, + pending_upload_request: Union[_models.PendingUploadRequest, JSON, IO[bytes]], + **kwargs: Any ) -> _models.PendingUploadResponse: """Start a new or get an existing pending upload of a dataset for a specific version. 
@@ -1809,9 +1831,10 @@ def pending_upload( :type name: str :param version: The specific version id of the DatasetVersion to operate on. Required. :type version: str - :param body: Parameters for the action. Is one of the following types: PendingUploadRequest, - JSON, IO[bytes] Required. - :type body: ~azure.ai.projects.models.PendingUploadRequest or JSON or IO[bytes] + :param pending_upload_request: The pending upload request parameters. Is one of the following + types: PendingUploadRequest, JSON, IO[bytes] Required. + :type pending_upload_request: ~azure.ai.projects.models.PendingUploadRequest or JSON or + IO[bytes] :return: PendingUploadResponse. The PendingUploadResponse is compatible with MutableMapping :rtype: ~azure.ai.projects.models.PendingUploadResponse :raises ~azure.core.exceptions.HttpResponseError: @@ -1832,10 +1855,10 @@ def pending_upload( content_type = content_type or "application/json" _content = None - if isinstance(body, (IOBase, bytes)): - _content = body + if isinstance(pending_upload_request, (IOBase, bytes)): + _content = pending_upload_request else: - _content = json.dumps(body, cls=SdkJSONEncoder, exclude_readonly=True) # type: ignore + _content = json.dumps(pending_upload_request, cls=SdkJSONEncoder, exclude_readonly=True) # type: ignore _request = build_datasets_pending_upload_request( name=name, @@ -1951,7 +1974,7 @@ class IndexesOperations: :attr:`indexes` attribute. 
""" - def __init__(self, *args, **kwargs): + def __init__(self, *args, **kwargs) -> None: input_args = list(args) self._client: PipelineClient = input_args.pop(0) if input_args else kwargs.pop("client") self._config: AIProjectClientConfiguration = input_args.pop(0) if input_args else kwargs.pop("config") @@ -1959,7 +1982,7 @@ def __init__(self, *args, **kwargs): self._deserialize: Deserializer = input_args.pop(0) if input_args else kwargs.pop("deserializer") @distributed_trace - def list_versions(self, name: str, **kwargs: Any) -> Iterable["_models.Index"]: + def list_versions(self, name: str, **kwargs: Any) -> ItemPaged["_models.Index"]: """List all versions of the given Index. :param name: The name of the resource. Required. @@ -2044,7 +2067,7 @@ def get_next(next_link=None): return ItemPaged(get_next, extract_data) @distributed_trace - def list(self, **kwargs: Any) -> Iterable["_models.Index"]: + def list(self, **kwargs: Any) -> ItemPaged["_models.Index"]: """List the latest version of each Index. :return: An iterator like instance of Index @@ -2244,7 +2267,7 @@ def create_or_update( self, name: str, version: str, - body: _models.Index, + index: _models.Index, *, content_type: str = "application/merge-patch+json", **kwargs: Any @@ -2253,10 +2276,10 @@ def create_or_update( :param name: The name of the resource. Required. :type name: str - :param version: The specific version id of the Index to create or replace. Required. + :param version: The specific version id of the Index to create or update. Required. :type version: str - :param body: The definition of the Index to create or update. Required. - :type body: ~azure.ai.projects.models.Index + :param index: The Index to create or update. Required. + :type index: ~azure.ai.projects.models.Index :keyword content_type: Body Parameter content-type. Content type parameter for JSON body. Default value is "application/merge-patch+json". 
:paramtype content_type: str @@ -2267,16 +2290,16 @@ def create_or_update( @overload def create_or_update( - self, name: str, version: str, body: JSON, *, content_type: str = "application/merge-patch+json", **kwargs: Any + self, name: str, version: str, index: JSON, *, content_type: str = "application/merge-patch+json", **kwargs: Any ) -> _models.Index: """Create a new or update an existing Index with the given version id. :param name: The name of the resource. Required. :type name: str - :param version: The specific version id of the Index to create or replace. Required. + :param version: The specific version id of the Index to create or update. Required. :type version: str - :param body: The definition of the Index to create or update. Required. - :type body: JSON + :param index: The Index to create or update. Required. + :type index: JSON :keyword content_type: Body Parameter content-type. Content type parameter for JSON body. Default value is "application/merge-patch+json". :paramtype content_type: str @@ -2290,7 +2313,7 @@ def create_or_update( self, name: str, version: str, - body: IO[bytes], + index: IO[bytes], *, content_type: str = "application/merge-patch+json", **kwargs: Any @@ -2299,10 +2322,10 @@ def create_or_update( :param name: The name of the resource. Required. :type name: str - :param version: The specific version id of the Index to create or replace. Required. + :param version: The specific version id of the Index to create or update. Required. :type version: str - :param body: The definition of the Index to create or update. Required. - :type body: IO[bytes] + :param index: The Index to create or update. Required. + :type index: IO[bytes] :keyword content_type: Body Parameter content-type. Content type parameter for binary body. Default value is "application/merge-patch+json". 
:paramtype content_type: str @@ -2313,17 +2336,17 @@ def create_or_update( @distributed_trace def create_or_update( - self, name: str, version: str, body: Union[_models.Index, JSON, IO[bytes]], **kwargs: Any + self, name: str, version: str, index: Union[_models.Index, JSON, IO[bytes]], **kwargs: Any ) -> _models.Index: """Create a new or update an existing Index with the given version id. :param name: The name of the resource. Required. :type name: str - :param version: The specific version id of the Index to create or replace. Required. + :param version: The specific version id of the Index to create or update. Required. :type version: str - :param body: The definition of the Index to create or update. Is one of the following types: - Index, JSON, IO[bytes] Required. - :type body: ~azure.ai.projects.models.Index or JSON or IO[bytes] + :param index: The Index to create or update. Is one of the following types: Index, JSON, + IO[bytes] Required. + :type index: ~azure.ai.projects.models.Index or JSON or IO[bytes] :return: Index. The Index is compatible with MutableMapping :rtype: ~azure.ai.projects.models.Index :raises ~azure.core.exceptions.HttpResponseError: @@ -2344,10 +2367,10 @@ def create_or_update( content_type = content_type or "application/merge-patch+json" _content = None - if isinstance(body, (IOBase, bytes)): - _content = body + if isinstance(index, (IOBase, bytes)): + _content = index else: - _content = json.dumps(body, cls=SdkJSONEncoder, exclude_readonly=True) # type: ignore + _content = json.dumps(index, cls=SdkJSONEncoder, exclude_readonly=True) # type: ignore _request = build_indexes_create_or_update_request( name=name, @@ -2400,7 +2423,7 @@ class DeploymentsOperations: :attr:`deployments` attribute. 
""" - def __init__(self, *args, **kwargs): + def __init__(self, *args, **kwargs) -> None: input_args = list(args) self._client: PipelineClient = input_args.pop(0) if input_args else kwargs.pop("client") self._config: AIProjectClientConfiguration = input_args.pop(0) if input_args else kwargs.pop("config") @@ -2480,7 +2503,7 @@ def list( model_name: Optional[str] = None, deployment_type: Optional[Union[str, _models.DeploymentType]] = None, **kwargs: Any - ) -> Iterable["_models.Deployment"]: + ) -> ItemPaged["_models.Deployment"]: """List all deployed models in the project. :keyword model_publisher: Model publisher to filter models by. Default value is None. @@ -2583,7 +2606,7 @@ class RedTeamsOperations: :attr:`red_teams` attribute. """ - def __init__(self, *args, **kwargs): + def __init__(self, *args, **kwargs) -> None: input_args = list(args) self._client: PipelineClient = input_args.pop(0) if input_args else kwargs.pop("client") self._config: AIProjectClientConfiguration = input_args.pop(0) if input_args else kwargs.pop("config") @@ -2664,7 +2687,7 @@ def get(self, name: str, **kwargs: Any) -> _models.RedTeam: method_added_on="2025-05-15-preview", params_added_on={"2025-05-15-preview": ["api_version", "client_request_id", "accept"]}, ) - def list(self, **kwargs: Any) -> Iterable["_models.RedTeam"]: + def list(self, **kwargs: Any) -> ItemPaged["_models.RedTeam"]: """List a redteam by name. 
:return: An iterator like instance of RedTeam diff --git a/sdk/ai/azure-ai-projects/azure/ai/projects/operations/_patch_datasets.py b/sdk/ai/azure-ai-projects/azure/ai/projects/operations/_patch_datasets.py index 4cacc6e286dc..22b8b941b21b 100644 --- a/sdk/ai/azure-ai-projects/azure/ai/projects/operations/_patch_datasets.py +++ b/sdk/ai/azure-ai-projects/azure/ai/projects/operations/_patch_datasets.py @@ -50,7 +50,7 @@ def _create_dataset_and_get_its_container_client( pending_upload_response: PendingUploadResponse = self.pending_upload( name=name, version=input_version, - body=PendingUploadRequest( + pending_upload_request=PendingUploadRequest( pending_upload_type=PendingUploadType.BLOB_REFERENCE, connection_name=connection_name, ), @@ -130,9 +130,8 @@ def upload_file( dataset_version = self.create_or_update( name=name, version=output_version, - body=FileDatasetVersion( + dataset_version=FileDatasetVersion( # See https://learn.microsoft.com/python/api/azure-storage-blob/azure.storage.blob.blobclient?view=azure-python#azure-storage-blob-blobclient-url - # Per above doc the ".url" contains SAS token... should this be stripped away? 
data_uri=data_uri, ), ) @@ -216,7 +215,7 @@ def upload_folder( dataset_version = self.create_or_update( name=name, version=output_version, - body=FolderDatasetVersion(data_uri=data_uri), + dataset_version=FolderDatasetVersion(data_uri=data_uri), ) return dataset_version diff --git a/sdk/ai/azure-ai-projects/azure/ai/projects/operations/_patch_inference.py b/sdk/ai/azure-ai-projects/azure/ai/projects/operations/_patch_inference.py index 4e5739d7114d..b5b0117c2847 100644 --- a/sdk/ai/azure-ai-projects/azure/ai/projects/operations/_patch_inference.py +++ b/sdk/ai/azure-ai-projects/azure/ai/projects/operations/_patch_inference.py @@ -8,7 +8,7 @@ Follow our quickstart for examples: https://aka.ms/azsdk/python/dpcodegen/python/customize """ import logging -from typing import Optional, TYPE_CHECKING, Any +from typing import Optional, TYPE_CHECKING from urllib.parse import urlparse from azure.core.tracing.decorator import distributed_trace from ..models._models import ApiKeyCredentials, EntraIDCredentials @@ -17,11 +17,30 @@ if TYPE_CHECKING: # pylint: disable=unused-import,ungrouped-imports from openai import AzureOpenAI - from azure.ai.inference import ChatCompletionsClient, EmbeddingsClient, ImageEmbeddingsClient logger = logging.getLogger(__name__) +def _get_aoai_inference_url(input_url: str) -> str: + """ + Converts an input URL in the format: + https:/// + to: + https:// + + :param input_url: The input endpoint URL used to construct AIProjectClient. + :type input_url: str + + :return: The endpoint URL required to construct an AzureOpenAI client from the `openai` package. + :rtype: str + """ + parsed = urlparse(input_url) + if parsed.scheme != "https" or not parsed.netloc: + raise ValueError("Invalid endpoint URL format. Must be an https URL with a host.") + new_url = f"https://{parsed.netloc}" + return new_url + + class InferenceOperations: """ .. 
warning:: @@ -35,163 +54,6 @@ class InferenceOperations: def __init__(self, outer_instance: "azure.ai.projects.AIProjectClient") -> None: # type: ignore[name-defined] self._outer_instance = outer_instance - @classmethod - def _get_inference_url(cls, input_url: str) -> str: - """ - Converts an input URL in the format: - https:/// - to: - https:///models - - :param input_url: The input endpoint URL used to construct AIProjectClient. - :type input_url: str - - :return: The endpoint URL required to construct inference clients from the `azure-ai-inference` package. - :rtype: str - """ - parsed = urlparse(input_url) - if parsed.scheme != "https" or not parsed.netloc: - raise ValueError("Invalid endpoint URL format. Must be an https URL with a host.") - new_url = f"https://{parsed.netloc}/models" - return new_url - - @classmethod - def _get_aoai_inference_url(cls, input_url: str) -> str: - """ - Converts an input URL in the format: - https:/// - to: - https:// - - :param input_url: The input endpoint URL used to construct AIProjectClient. - :type input_url: str - - :return: The endpoint URL required to construct an AzureOpenAI client from the `openai` package. - :rtype: str - """ - parsed = urlparse(input_url) - if parsed.scheme != "https" or not parsed.netloc: - raise ValueError("Invalid endpoint URL format. Must be an https URL with a host.") - new_url = f"https://{parsed.netloc}" - return new_url - - @distributed_trace - def get_chat_completions_client(self, **kwargs: Any) -> "ChatCompletionsClient": # type: ignore[name-defined] - """Get an authenticated ChatCompletionsClient (from the package azure-ai-inference) to use with - AI models deployed to your AI Foundry Project. Keyword arguments are passed to the constructor of - ChatCompletionsClient. - - At least one AI model that supports chat completions must be deployed. - - .. note:: The package `azure-ai-inference` must be installed prior to calling this method. - - :return: An authenticated chat completions client. 
- :rtype: ~azure.ai.inference.ChatCompletionsClient - - :raises ~azure.core.exceptions.ModuleNotFoundError: if the `azure-ai-inference` package - is not installed. - :raises ~azure.core.exceptions.HttpResponseError: - """ - - try: - from azure.ai.inference import ChatCompletionsClient - except ModuleNotFoundError as e: - raise ModuleNotFoundError( - "Azure AI Inference SDK is not installed. Please install it using 'pip install azure-ai-inference'" - ) from e - - endpoint = self._get_inference_url(self._outer_instance._config.endpoint) # pylint: disable=protected-access - - client = ChatCompletionsClient( - endpoint=endpoint, - credential=self._outer_instance._config.credential, # pylint: disable=protected-access - credential_scopes=self._outer_instance._config.credential_scopes, # pylint: disable=protected-access - user_agent=kwargs.pop( - "user_agent", self._outer_instance._patched_user_agent # pylint: disable=protected-access - ), - **kwargs, - ) - - return client - - @distributed_trace - def get_embeddings_client(self, **kwargs: Any) -> "EmbeddingsClient": # type: ignore[name-defined] - """Get an authenticated EmbeddingsClient (from the package azure-ai-inference) to use with - AI models deployed to your AI Foundry Project. Keyword arguments are passed to the constructor of - ChatCompletionsClient. - - At least one AI model that supports text embeddings must be deployed. - - .. note:: The package `azure-ai-inference` must be installed prior to calling this method. - - :return: An authenticated Embeddings client. - :rtype: ~azure.ai.inference.EmbeddingsClient - - :raises ~azure.core.exceptions.ModuleNotFoundError: if the `azure-ai-inference` package - is not installed. - :raises ~azure.core.exceptions.HttpResponseError: - """ - - try: - from azure.ai.inference import EmbeddingsClient - except ModuleNotFoundError as e: - raise ModuleNotFoundError( - "Azure AI Inference SDK is not installed. 
Please install it using 'pip install azure-ai-inference'" - ) from e - - endpoint = self._get_inference_url(self._outer_instance._config.endpoint) # pylint: disable=protected-access - - client = EmbeddingsClient( - endpoint=endpoint, - credential=self._outer_instance._config.credential, # pylint: disable=protected-access - credential_scopes=self._outer_instance._config.credential_scopes, # pylint: disable=protected-access, - user_agent=kwargs.pop( - "user_agent", self._outer_instance._patched_user_agent # pylint: disable=protected-access - ), - **kwargs, - ) - - return client - - @distributed_trace - def get_image_embeddings_client(self, **kwargs: Any) -> "ImageEmbeddingsClient": # type: ignore[name-defined] - """Get an authenticated ImageEmbeddingsClient (from the package azure-ai-inference) to use with - AI models deployed to your AI Foundry Project. Keyword arguments are passed to the constructor of - ChatCompletionsClient. - - At least one AI model that supports image embeddings must be deployed. - - .. note:: The package `azure-ai-inference` must be installed prior to calling this method. - - :return: An authenticated Image Embeddings client. - :rtype: ~azure.ai.inference.ImageEmbeddingsClient - - :raises ~azure.core.exceptions.ModuleNotFoundError: if the `azure-ai-inference` package - is not installed. - :raises ~azure.core.exceptions.HttpResponseError: - """ - - try: - from azure.ai.inference import ImageEmbeddingsClient - except ModuleNotFoundError as e: - raise ModuleNotFoundError( - "Azure AI Inference SDK is not installed. 
Please install it using 'pip install azure-ai-inference'" - ) from e - - endpoint = self._get_inference_url(self._outer_instance._config.endpoint) # pylint: disable=protected-access - - client = ImageEmbeddingsClient( - endpoint=endpoint, - credential=self._outer_instance._config.credential, # pylint: disable=protected-access - credential_scopes=self._outer_instance._config.credential_scopes, # pylint: disable=protected-access, - user_agent=kwargs.pop( - "user_agent", self._outer_instance._patched_user_agent # pylint: disable=protected-access - ), - **kwargs, - ) - - return client - @distributed_trace def get_azure_openai_client( self, *, api_version: Optional[str] = None, connection_name: Optional[str] = None, **kwargs @@ -291,7 +153,7 @@ def get_azure_openai_client( "azure.identity package not installed. Please install it using 'pip install azure.identity'" ) from e - azure_endpoint = self._get_aoai_inference_url( + azure_endpoint = _get_aoai_inference_url( self._outer_instance._config.endpoint # pylint: disable=protected-access ) diff --git a/sdk/ai/azure-ai-projects/azure/ai/projects/operations/_patch_telemetry.py b/sdk/ai/azure-ai-projects/azure/ai/projects/operations/_patch_telemetry.py index 2d95fc99cb6a..1dfce736fe5d 100644 --- a/sdk/ai/azure-ai-projects/azure/ai/projects/operations/_patch_telemetry.py +++ b/sdk/ai/azure-ai-projects/azure/ai/projects/operations/_patch_telemetry.py @@ -29,6 +29,7 @@ class TelemetryOperations: def __init__(self, outer_instance: "azure.ai.projects.AIProjectClient") -> None: # type: ignore[name-defined] self._outer_instance = outer_instance + # TODO: Give a more detailed name to this method. @distributed_trace def get_connection_string(self) -> str: """Get the Application Insights connection string associated with the Project's Application Insights resource. 
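The module-level `_get_aoai_inference_url` helper introduced in this diff derives the Azure OpenAI endpoint by keeping only the scheme and host of the project endpoint URL and dropping the `/api/projects/<name>` path. A rough, standalone sketch of that transformation (illustrative re-implementation, not the packaged function; the sample endpoint below is hypothetical):

```python
from urllib.parse import urlparse


def get_aoai_inference_url(input_url: str) -> str:
    # Keep only "https://<host>"; drop any path such as /api/projects/<name>.
    parsed = urlparse(input_url)
    if parsed.scheme != "https" or not parsed.netloc:
        raise ValueError("Invalid endpoint URL format. Must be an https URL with a host.")
    return f"https://{parsed.netloc}"


project_endpoint = "https://my-resource.services.ai.azure.com/api/projects/my-project"
print(get_aoai_inference_url(project_endpoint))
# https://my-resource.services.ai.azure.com
```

This mirrors the validation in the diff: a non-`https` scheme or a missing host raises `ValueError` rather than returning a malformed endpoint.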
diff --git a/sdk/ai/azure-ai-projects/azure_ai_projects_tests.template.env b/sdk/ai/azure-ai-projects/azure_ai_projects_tests.template.env new file mode 100644 index 000000000000..c34a3dda4a7f --- /dev/null +++ b/sdk/ai/azure-ai-projects/azure_ai_projects_tests.template.env @@ -0,0 +1,15 @@ +# +# Environment variables that define secrets required for running tests. +# +# All values should be empty by default in this template. +# +# To run tests locally on your device: +# 1. Rename the file to azure_ai_projects_tests.env +# 2. Fill in the values for the environment variables below (do not commit these changes to the repository!) +# 3. Run the test (`pytest`) +# + +# Project endpoint has the format: +# `https://.services.ai.azure.com/api/projects/` +AZURE_AI_PROJECTS_TESTS_PROJECT_ENDPOINT= + diff --git a/sdk/ai/azure-ai-projects/cspell.json b/sdk/ai/azure-ai-projects/cspell.json index 71bd8a696481..ed393add5d13 100644 --- a/sdk/ai/azure-ai-projects/cspell.json +++ b/sdk/ai/azure-ai-projects/cspell.json @@ -12,6 +12,7 @@ "getconnectionwithcredentials", "quantitive", "balapvbyostoragecanary", + "fspath", ], "ignorePaths": [ ] diff --git a/sdk/ai/azure-ai-projects/generated_tests/conftest.py b/sdk/ai/azure-ai-projects/generated_tests/conftest.py deleted file mode 100644 index dd8e527abab1..000000000000 --- a/sdk/ai/azure-ai-projects/generated_tests/conftest.py +++ /dev/null @@ -1,35 +0,0 @@ -# coding=utf-8 -# -------------------------------------------------------------------------- -# Copyright (c) Microsoft Corporation. All rights reserved. -# Licensed under the MIT License. See License.txt in the project root for license information. -# Code generated by Microsoft (R) Python Code Generator. -# Changes may cause incorrect behavior and will be lost if the code is regenerated. 
-# -------------------------------------------------------------------------- -import os -import pytest -from dotenv import load_dotenv -from devtools_testutils import ( - test_proxy, - add_general_regex_sanitizer, - add_body_key_sanitizer, - add_header_regex_sanitizer, -) - -load_dotenv() - - -# For security, please avoid record sensitive identity information in recordings -@pytest.fixture(scope="session", autouse=True) -def add_sanitizers(test_proxy): - aiproject_subscription_id = os.environ.get("AIPROJECT_SUBSCRIPTION_ID", "00000000-0000-0000-0000-000000000000") - aiproject_tenant_id = os.environ.get("AIPROJECT_TENANT_ID", "00000000-0000-0000-0000-000000000000") - aiproject_client_id = os.environ.get("AIPROJECT_CLIENT_ID", "00000000-0000-0000-0000-000000000000") - aiproject_client_secret = os.environ.get("AIPROJECT_CLIENT_SECRET", "00000000-0000-0000-0000-000000000000") - add_general_regex_sanitizer(regex=aiproject_subscription_id, value="00000000-0000-0000-0000-000000000000") - add_general_regex_sanitizer(regex=aiproject_tenant_id, value="00000000-0000-0000-0000-000000000000") - add_general_regex_sanitizer(regex=aiproject_client_id, value="00000000-0000-0000-0000-000000000000") - add_general_regex_sanitizer(regex=aiproject_client_secret, value="00000000-0000-0000-0000-000000000000") - - add_header_regex_sanitizer(key="Set-Cookie", value="[set-cookie;]") - add_header_regex_sanitizer(key="Cookie", value="cookie;") - add_body_key_sanitizer(json_path="$..access_token", value="access_token") diff --git a/sdk/ai/azure-ai-projects/generated_tests/test_ai_project_connections_operations.py b/sdk/ai/azure-ai-projects/generated_tests/test_ai_project_connections_operations.py deleted file mode 100644 index d93e0e240cca..000000000000 --- a/sdk/ai/azure-ai-projects/generated_tests/test_ai_project_connections_operations.py +++ /dev/null @@ -1,22 +0,0 @@ -# coding=utf-8 -# -------------------------------------------------------------------------- -# Copyright (c) Microsoft 
Corporation. All rights reserved. -# Licensed under the MIT License. See License.txt in the project root for license information. -# Code generated by Microsoft (R) Python Code Generator. -# Changes may cause incorrect behavior and will be lost if the code is regenerated. -# -------------------------------------------------------------------------- -import pytest -from devtools_testutils import recorded_by_proxy -from testpreparer import AIProjectClientTestBase, AIProjectPreparer - - -@pytest.mark.skip("you may need to update the auto-generated test case before run it") -class TestAIProjectConnectionsOperations(AIProjectClientTestBase): - @AIProjectPreparer() - @recorded_by_proxy - def test_connections_list(self, aiproject_endpoint): - client = self.create_client(endpoint=aiproject_endpoint) - response = client.connections.list() - result = [r for r in response] - # please add some check logic here by yourself - # ... diff --git a/sdk/ai/azure-ai-projects/generated_tests/test_ai_project_connections_operations_async.py b/sdk/ai/azure-ai-projects/generated_tests/test_ai_project_connections_operations_async.py deleted file mode 100644 index cc08499be0ee..000000000000 --- a/sdk/ai/azure-ai-projects/generated_tests/test_ai_project_connections_operations_async.py +++ /dev/null @@ -1,23 +0,0 @@ -# coding=utf-8 -# -------------------------------------------------------------------------- -# Copyright (c) Microsoft Corporation. All rights reserved. -# Licensed under the MIT License. See License.txt in the project root for license information. -# Code generated by Microsoft (R) Python Code Generator. -# Changes may cause incorrect behavior and will be lost if the code is regenerated. 
-# -------------------------------------------------------------------------- -import pytest -from devtools_testutils.aio import recorded_by_proxy_async -from testpreparer import AIProjectPreparer -from testpreparer_async import AIProjectClientTestBaseAsync - - -@pytest.mark.skip("you may need to update the auto-generated test case before run it") -class TestAIProjectConnectionsOperationsAsync(AIProjectClientTestBaseAsync): - @AIProjectPreparer() - @recorded_by_proxy_async - async def test_connections_list(self, aiproject_endpoint): - client = self.create_async_client(endpoint=aiproject_endpoint) - response = client.connections.list() - result = [r async for r in response] - # please add some check logic here by yourself - # ... diff --git a/sdk/ai/azure-ai-projects/generated_tests/test_ai_project_datasets_operations.py b/sdk/ai/azure-ai-projects/generated_tests/test_ai_project_datasets_operations.py deleted file mode 100644 index bdd6a44c053b..000000000000 --- a/sdk/ai/azure-ai-projects/generated_tests/test_ai_project_datasets_operations.py +++ /dev/null @@ -1,105 +0,0 @@ -# coding=utf-8 -# -------------------------------------------------------------------------- -# Copyright (c) Microsoft Corporation. All rights reserved. -# Licensed under the MIT License. See License.txt in the project root for license information. -# Code generated by Microsoft (R) Python Code Generator. -# Changes may cause incorrect behavior and will be lost if the code is regenerated. 
-# -------------------------------------------------------------------------- -import pytest -from devtools_testutils import recorded_by_proxy -from testpreparer import AIProjectClientTestBase, AIProjectPreparer - - -@pytest.mark.skip("you may need to update the auto-generated test case before run it") -class TestAIProjectDatasetsOperations(AIProjectClientTestBase): - @AIProjectPreparer() - @recorded_by_proxy - def test_datasets_list_versions(self, aiproject_endpoint): - client = self.create_client(endpoint=aiproject_endpoint) - response = client.datasets.list_versions( - name="str", - ) - result = [r for r in response] - # please add some check logic here by yourself - # ... - - @AIProjectPreparer() - @recorded_by_proxy - def test_datasets_list(self, aiproject_endpoint): - client = self.create_client(endpoint=aiproject_endpoint) - response = client.datasets.list() - result = [r for r in response] - # please add some check logic here by yourself - # ... - - @AIProjectPreparer() - @recorded_by_proxy - def test_datasets_get(self, aiproject_endpoint): - client = self.create_client(endpoint=aiproject_endpoint) - response = client.datasets.get( - name="str", - version="str", - ) - - # please add some check logic here by yourself - # ... - - @AIProjectPreparer() - @recorded_by_proxy - def test_datasets_delete(self, aiproject_endpoint): - client = self.create_client(endpoint=aiproject_endpoint) - response = client.datasets.delete( - name="str", - version="str", - ) - - # please add some check logic here by yourself - # ... 
- - @AIProjectPreparer() - @recorded_by_proxy - def test_datasets_create_or_update(self, aiproject_endpoint): - client = self.create_client(endpoint=aiproject_endpoint) - response = client.datasets.create_or_update( - name="str", - version="str", - body={ - "dataUri": "str", - "name": "str", - "type": "uri_file", - "version": "str", - "connectionName": "str", - "description": "str", - "id": "str", - "isReference": bool, - "tags": {"str": "str"}, - }, - ) - - # please add some check logic here by yourself - # ... - - @AIProjectPreparer() - @recorded_by_proxy - def test_datasets_pending_upload(self, aiproject_endpoint): - client = self.create_client(endpoint=aiproject_endpoint) - response = client.datasets.pending_upload( - name="str", - version="str", - body={"pendingUploadType": "str", "connectionName": "str", "pendingUploadId": "str"}, - ) - - # please add some check logic here by yourself - # ... - - @AIProjectPreparer() - @recorded_by_proxy - def test_datasets_get_credentials(self, aiproject_endpoint): - client = self.create_client(endpoint=aiproject_endpoint) - response = client.datasets.get_credentials( - name="str", - version="str", - ) - - # please add some check logic here by yourself - # ... diff --git a/sdk/ai/azure-ai-projects/generated_tests/test_ai_project_datasets_operations_async.py b/sdk/ai/azure-ai-projects/generated_tests/test_ai_project_datasets_operations_async.py deleted file mode 100644 index 6db1ecba7504..000000000000 --- a/sdk/ai/azure-ai-projects/generated_tests/test_ai_project_datasets_operations_async.py +++ /dev/null @@ -1,106 +0,0 @@ -# coding=utf-8 -# -------------------------------------------------------------------------- -# Copyright (c) Microsoft Corporation. All rights reserved. -# Licensed under the MIT License. See License.txt in the project root for license information. -# Code generated by Microsoft (R) Python Code Generator. -# Changes may cause incorrect behavior and will be lost if the code is regenerated. 
-# -------------------------------------------------------------------------- -import pytest -from devtools_testutils.aio import recorded_by_proxy_async -from testpreparer import AIProjectPreparer -from testpreparer_async import AIProjectClientTestBaseAsync - - -@pytest.mark.skip("you may need to update the auto-generated test case before run it") -class TestAIProjectDatasetsOperationsAsync(AIProjectClientTestBaseAsync): - @AIProjectPreparer() - @recorded_by_proxy_async - async def test_datasets_list_versions(self, aiproject_endpoint): - client = self.create_async_client(endpoint=aiproject_endpoint) - response = client.datasets.list_versions( - name="str", - ) - result = [r async for r in response] - # please add some check logic here by yourself - # ... - - @AIProjectPreparer() - @recorded_by_proxy_async - async def test_datasets_list(self, aiproject_endpoint): - client = self.create_async_client(endpoint=aiproject_endpoint) - response = client.datasets.list() - result = [r async for r in response] - # please add some check logic here by yourself - # ... - - @AIProjectPreparer() - @recorded_by_proxy_async - async def test_datasets_get(self, aiproject_endpoint): - client = self.create_async_client(endpoint=aiproject_endpoint) - response = await client.datasets.get( - name="str", - version="str", - ) - - # please add some check logic here by yourself - # ... - - @AIProjectPreparer() - @recorded_by_proxy_async - async def test_datasets_delete(self, aiproject_endpoint): - client = self.create_async_client(endpoint=aiproject_endpoint) - response = await client.datasets.delete( - name="str", - version="str", - ) - - # please add some check logic here by yourself - # ... 
- - @AIProjectPreparer() - @recorded_by_proxy_async - async def test_datasets_create_or_update(self, aiproject_endpoint): - client = self.create_async_client(endpoint=aiproject_endpoint) - response = await client.datasets.create_or_update( - name="str", - version="str", - body={ - "dataUri": "str", - "name": "str", - "type": "uri_file", - "version": "str", - "connectionName": "str", - "description": "str", - "id": "str", - "isReference": bool, - "tags": {"str": "str"}, - }, - ) - - # please add some check logic here by yourself - # ... - - @AIProjectPreparer() - @recorded_by_proxy_async - async def test_datasets_pending_upload(self, aiproject_endpoint): - client = self.create_async_client(endpoint=aiproject_endpoint) - response = await client.datasets.pending_upload( - name="str", - version="str", - body={"pendingUploadType": "str", "connectionName": "str", "pendingUploadId": "str"}, - ) - - # please add some check logic here by yourself - # ... - - @AIProjectPreparer() - @recorded_by_proxy_async - async def test_datasets_get_credentials(self, aiproject_endpoint): - client = self.create_async_client(endpoint=aiproject_endpoint) - response = await client.datasets.get_credentials( - name="str", - version="str", - ) - - # please add some check logic here by yourself - # ... diff --git a/sdk/ai/azure-ai-projects/generated_tests/test_ai_project_deployments_operations.py b/sdk/ai/azure-ai-projects/generated_tests/test_ai_project_deployments_operations.py deleted file mode 100644 index b0e1e586d866..000000000000 --- a/sdk/ai/azure-ai-projects/generated_tests/test_ai_project_deployments_operations.py +++ /dev/null @@ -1,33 +0,0 @@ -# coding=utf-8 -# -------------------------------------------------------------------------- -# Copyright (c) Microsoft Corporation. All rights reserved. -# Licensed under the MIT License. See License.txt in the project root for license information. -# Code generated by Microsoft (R) Python Code Generator. 
-# Changes may cause incorrect behavior and will be lost if the code is regenerated. -# -------------------------------------------------------------------------- -import pytest -from devtools_testutils import recorded_by_proxy -from testpreparer import AIProjectClientTestBase, AIProjectPreparer - - -@pytest.mark.skip("you may need to update the auto-generated test case before run it") -class TestAIProjectDeploymentsOperations(AIProjectClientTestBase): - @AIProjectPreparer() - @recorded_by_proxy - def test_deployments_get(self, aiproject_endpoint): - client = self.create_client(endpoint=aiproject_endpoint) - response = client.deployments.get( - name="str", - ) - - # please add some check logic here by yourself - # ... - - @AIProjectPreparer() - @recorded_by_proxy - def test_deployments_list(self, aiproject_endpoint): - client = self.create_client(endpoint=aiproject_endpoint) - response = client.deployments.list() - result = [r for r in response] - # please add some check logic here by yourself - # ... diff --git a/sdk/ai/azure-ai-projects/generated_tests/test_ai_project_deployments_operations_async.py b/sdk/ai/azure-ai-projects/generated_tests/test_ai_project_deployments_operations_async.py deleted file mode 100644 index 3958d83eab29..000000000000 --- a/sdk/ai/azure-ai-projects/generated_tests/test_ai_project_deployments_operations_async.py +++ /dev/null @@ -1,34 +0,0 @@ -# coding=utf-8 -# -------------------------------------------------------------------------- -# Copyright (c) Microsoft Corporation. All rights reserved. -# Licensed under the MIT License. See License.txt in the project root for license information. -# Code generated by Microsoft (R) Python Code Generator. -# Changes may cause incorrect behavior and will be lost if the code is regenerated. 
-# -------------------------------------------------------------------------- -import pytest -from devtools_testutils.aio import recorded_by_proxy_async -from testpreparer import AIProjectPreparer -from testpreparer_async import AIProjectClientTestBaseAsync - - -@pytest.mark.skip("you may need to update the auto-generated test case before run it") -class TestAIProjectDeploymentsOperationsAsync(AIProjectClientTestBaseAsync): - @AIProjectPreparer() - @recorded_by_proxy_async - async def test_deployments_get(self, aiproject_endpoint): - client = self.create_async_client(endpoint=aiproject_endpoint) - response = await client.deployments.get( - name="str", - ) - - # please add some check logic here by yourself - # ... - - @AIProjectPreparer() - @recorded_by_proxy_async - async def test_deployments_list(self, aiproject_endpoint): - client = self.create_async_client(endpoint=aiproject_endpoint) - response = client.deployments.list() - result = [r async for r in response] - # please add some check logic here by yourself - # ... diff --git a/sdk/ai/azure-ai-projects/generated_tests/test_ai_project_evaluation_results_operations.py b/sdk/ai/azure-ai-projects/generated_tests/test_ai_project_evaluation_results_operations.py deleted file mode 100644 index b68d4d88d17a..000000000000 --- a/sdk/ai/azure-ai-projects/generated_tests/test_ai_project_evaluation_results_operations.py +++ /dev/null @@ -1,125 +0,0 @@ -# coding=utf-8 -# -------------------------------------------------------------------------- -# Copyright (c) Microsoft Corporation. All rights reserved. -# Licensed under the MIT License. See License.txt in the project root for license information. -# Code generated by Microsoft (R) Python Code Generator. -# Changes may cause incorrect behavior and will be lost if the code is regenerated. 
-# -------------------------------------------------------------------------- -import pytest -from devtools_testutils import recorded_by_proxy -from testpreparer import AIProjectClientTestBase, AIProjectPreparer - - -@pytest.mark.skip("you may need to update the auto-generated test case before run it") -class TestAIProjectEvaluationResultsOperations(AIProjectClientTestBase): - @AIProjectPreparer() - @recorded_by_proxy - def test_evaluation_results_list_versions(self, aiproject_endpoint): - client = self.create_client(endpoint=aiproject_endpoint) - response = client.evaluation_results.list_versions( - name="str", - ) - result = [r for r in response] - # please add some check logic here by yourself - # ... - - @AIProjectPreparer() - @recorded_by_proxy - def test_evaluation_results_list_latest(self, aiproject_endpoint): - client = self.create_client(endpoint=aiproject_endpoint) - response = client.evaluation_results.list_latest() - result = [r for r in response] - # please add some check logic here by yourself - # ... - - @AIProjectPreparer() - @recorded_by_proxy - def test_evaluation_results_get_version(self, aiproject_endpoint): - client = self.create_client(endpoint=aiproject_endpoint) - response = client.evaluation_results.get_version( - name="str", - version="str", - ) - - # please add some check logic here by yourself - # ... - - @AIProjectPreparer() - @recorded_by_proxy - def test_evaluation_results_delete_version(self, aiproject_endpoint): - client = self.create_client(endpoint=aiproject_endpoint) - response = client.evaluation_results.delete_version( - name="str", - version="str", - ) - - # please add some check logic here by yourself - # ... 
- - @AIProjectPreparer() - @recorded_by_proxy - def test_evaluation_results_create(self, aiproject_endpoint): - client = self.create_client(endpoint=aiproject_endpoint) - response = client.evaluation_results.create( - name="str", - body={ - "name": "str", - "version": "str", - "BlobUri": "str", - "DatasetFamily": "str", - "DatasetName": "str", - "Metrics": {"str": 0.0}, - "ModelAssetId": "str", - "ModelName": "str", - "ModelVersion": "str", - "ResultType": "str", - "description": "str", - "id": "str", - "stage": "str", - "tags": {"str": "str"}, - }, - ) - - # please add some check logic here by yourself - # ... - - @AIProjectPreparer() - @recorded_by_proxy - def test_evaluation_results_create_version(self, aiproject_endpoint): - client = self.create_client(endpoint=aiproject_endpoint) - response = client.evaluation_results.create_version( - name="str", - version="str", - body={ - "name": "str", - "version": "str", - "BlobUri": "str", - "DatasetFamily": "str", - "DatasetName": "str", - "Metrics": {"str": 0.0}, - "ModelAssetId": "str", - "ModelName": "str", - "ModelVersion": "str", - "ResultType": "str", - "description": "str", - "id": "str", - "stage": "str", - "tags": {"str": "str"}, - }, - ) - - # please add some check logic here by yourself - # ... - - @AIProjectPreparer() - @recorded_by_proxy - def test_evaluation_results_start_pending_upload(self, aiproject_endpoint): - client = self.create_client(endpoint=aiproject_endpoint) - response = client.evaluation_results.start_pending_upload( - name="str", - version="str", - body={"pendingUploadType": "str", "connectionName": "str", "pendingUploadId": "str"}, - ) - - # please add some check logic here by yourself - # ... 
diff --git a/sdk/ai/azure-ai-projects/generated_tests/test_ai_project_evaluation_results_operations_async.py b/sdk/ai/azure-ai-projects/generated_tests/test_ai_project_evaluation_results_operations_async.py deleted file mode 100644 index b90df81464cd..000000000000 --- a/sdk/ai/azure-ai-projects/generated_tests/test_ai_project_evaluation_results_operations_async.py +++ /dev/null @@ -1,126 +0,0 @@ -# coding=utf-8 -# -------------------------------------------------------------------------- -# Copyright (c) Microsoft Corporation. All rights reserved. -# Licensed under the MIT License. See License.txt in the project root for license information. -# Code generated by Microsoft (R) Python Code Generator. -# Changes may cause incorrect behavior and will be lost if the code is regenerated. -# -------------------------------------------------------------------------- -import pytest -from devtools_testutils.aio import recorded_by_proxy_async -from testpreparer import AIProjectPreparer -from testpreparer_async import AIProjectClientTestBaseAsync - - -@pytest.mark.skip("you may need to update the auto-generated test case before run it") -class TestAIProjectEvaluationResultsOperationsAsync(AIProjectClientTestBaseAsync): - @AIProjectPreparer() - @recorded_by_proxy_async - async def test_evaluation_results_list_versions(self, aiproject_endpoint): - client = self.create_async_client(endpoint=aiproject_endpoint) - response = client.evaluation_results.list_versions( - name="str", - ) - result = [r async for r in response] - # please add some check logic here by yourself - # ... - - @AIProjectPreparer() - @recorded_by_proxy_async - async def test_evaluation_results_list_latest(self, aiproject_endpoint): - client = self.create_async_client(endpoint=aiproject_endpoint) - response = client.evaluation_results.list_latest() - result = [r async for r in response] - # please add some check logic here by yourself - # ... 
- - @AIProjectPreparer() - @recorded_by_proxy_async - async def test_evaluation_results_get_version(self, aiproject_endpoint): - client = self.create_async_client(endpoint=aiproject_endpoint) - response = await client.evaluation_results.get_version( - name="str", - version="str", - ) - - # please add some check logic here by yourself - # ... - - @AIProjectPreparer() - @recorded_by_proxy_async - async def test_evaluation_results_delete_version(self, aiproject_endpoint): - client = self.create_async_client(endpoint=aiproject_endpoint) - response = await client.evaluation_results.delete_version( - name="str", - version="str", - ) - - # please add some check logic here by yourself - # ... - - @AIProjectPreparer() - @recorded_by_proxy_async - async def test_evaluation_results_create(self, aiproject_endpoint): - client = self.create_async_client(endpoint=aiproject_endpoint) - response = await client.evaluation_results.create( - name="str", - body={ - "name": "str", - "version": "str", - "BlobUri": "str", - "DatasetFamily": "str", - "DatasetName": "str", - "Metrics": {"str": 0.0}, - "ModelAssetId": "str", - "ModelName": "str", - "ModelVersion": "str", - "ResultType": "str", - "description": "str", - "id": "str", - "stage": "str", - "tags": {"str": "str"}, - }, - ) - - # please add some check logic here by yourself - # ... 
- - @AIProjectPreparer() - @recorded_by_proxy_async - async def test_evaluation_results_create_version(self, aiproject_endpoint): - client = self.create_async_client(endpoint=aiproject_endpoint) - response = await client.evaluation_results.create_version( - name="str", - version="str", - body={ - "name": "str", - "version": "str", - "BlobUri": "str", - "DatasetFamily": "str", - "DatasetName": "str", - "Metrics": {"str": 0.0}, - "ModelAssetId": "str", - "ModelName": "str", - "ModelVersion": "str", - "ResultType": "str", - "description": "str", - "id": "str", - "stage": "str", - "tags": {"str": "str"}, - }, - ) - - # please add some check logic here by yourself - # ... - - @AIProjectPreparer() - @recorded_by_proxy_async - async def test_evaluation_results_start_pending_upload(self, aiproject_endpoint): - client = self.create_async_client(endpoint=aiproject_endpoint) - response = await client.evaluation_results.start_pending_upload( - name="str", - version="str", - body={"pendingUploadType": "str", "connectionName": "str", "pendingUploadId": "str"}, - ) - - # please add some check logic here by yourself - # ... diff --git a/sdk/ai/azure-ai-projects/generated_tests/test_ai_project_evaluations_operations.py b/sdk/ai/azure-ai-projects/generated_tests/test_ai_project_evaluations_operations.py deleted file mode 100644 index e07aa0e02b47..000000000000 --- a/sdk/ai/azure-ai-projects/generated_tests/test_ai_project_evaluations_operations.py +++ /dev/null @@ -1,71 +0,0 @@ -# coding=utf-8 -# -------------------------------------------------------------------------- -# Copyright (c) Microsoft Corporation. All rights reserved. -# Licensed under the MIT License. See License.txt in the project root for license information. -# Code generated by Microsoft (R) Python Code Generator. -# Changes may cause incorrect behavior and will be lost if the code is regenerated. 
-# -------------------------------------------------------------------------- -import pytest -from devtools_testutils import recorded_by_proxy -from testpreparer import AIProjectClientTestBase, AIProjectPreparer - - -@pytest.mark.skip("you may need to update the auto-generated test case before run it") -class TestAIProjectEvaluationsOperations(AIProjectClientTestBase): - @AIProjectPreparer() - @recorded_by_proxy - def test_evaluations_get(self, aiproject_endpoint): - client = self.create_client(endpoint=aiproject_endpoint) - response = client.evaluations.get( - name="str", - ) - - # please add some check logic here by yourself - # ... - - @AIProjectPreparer() - @recorded_by_proxy - def test_evaluations_list(self, aiproject_endpoint): - client = self.create_client(endpoint=aiproject_endpoint) - response = client.evaluations.list() - result = [r for r in response] - # please add some check logic here by yourself - # ... - - @AIProjectPreparer() - @recorded_by_proxy - def test_evaluations_create(self, aiproject_endpoint): - client = self.create_client(endpoint=aiproject_endpoint) - response = client.evaluations.create( - evaluation={ - "data": "input_data", - "evaluators": {"str": {"id": "str", "dataMapping": {"str": "str"}, "initParams": {"str": {}}}}, - "id": "str", - "description": "str", - "displayName": "str", - "properties": {"str": "str"}, - "status": "str", - "tags": {"str": "str"}, - }, - ) - - # please add some check logic here by yourself - # ... 
- - @AIProjectPreparer() - @recorded_by_proxy - def test_evaluations_create_agent_evaluation(self, aiproject_endpoint): - client = self.create_client(endpoint=aiproject_endpoint) - response = client.evaluations.create_agent_evaluation( - evaluation={ - "appInsightsConnectionString": "str", - "evaluators": {"str": {"id": "str", "dataMapping": {"str": "str"}, "initParams": {"str": {}}}}, - "runId": "str", - "redactionConfiguration": {"redactScoreProperties": bool}, - "samplingConfiguration": {"maxRequestRate": 0.0, "name": "str", "samplingPercent": 0.0}, - "threadId": "str", - }, - ) - - # please add some check logic here by yourself - # ... diff --git a/sdk/ai/azure-ai-projects/generated_tests/test_ai_project_evaluations_operations_async.py b/sdk/ai/azure-ai-projects/generated_tests/test_ai_project_evaluations_operations_async.py deleted file mode 100644 index 07f22bd9e58a..000000000000 --- a/sdk/ai/azure-ai-projects/generated_tests/test_ai_project_evaluations_operations_async.py +++ /dev/null @@ -1,72 +0,0 @@ -# coding=utf-8 -# -------------------------------------------------------------------------- -# Copyright (c) Microsoft Corporation. All rights reserved. -# Licensed under the MIT License. See License.txt in the project root for license information. -# Code generated by Microsoft (R) Python Code Generator. -# Changes may cause incorrect behavior and will be lost if the code is regenerated. 
-# -------------------------------------------------------------------------- -import pytest -from devtools_testutils.aio import recorded_by_proxy_async -from testpreparer import AIProjectPreparer -from testpreparer_async import AIProjectClientTestBaseAsync - - -@pytest.mark.skip("you may need to update the auto-generated test case before run it") -class TestAIProjectEvaluationsOperationsAsync(AIProjectClientTestBaseAsync): - @AIProjectPreparer() - @recorded_by_proxy_async - async def test_evaluations_get(self, aiproject_endpoint): - client = self.create_async_client(endpoint=aiproject_endpoint) - response = await client.evaluations.get( - name="str", - ) - - # please add some check logic here by yourself - # ... - - @AIProjectPreparer() - @recorded_by_proxy_async - async def test_evaluations_list(self, aiproject_endpoint): - client = self.create_async_client(endpoint=aiproject_endpoint) - response = client.evaluations.list() - result = [r async for r in response] - # please add some check logic here by yourself - # ... - - @AIProjectPreparer() - @recorded_by_proxy_async - async def test_evaluations_create(self, aiproject_endpoint): - client = self.create_async_client(endpoint=aiproject_endpoint) - response = await client.evaluations.create( - evaluation={ - "data": "input_data", - "evaluators": {"str": {"id": "str", "dataMapping": {"str": "str"}, "initParams": {"str": {}}}}, - "id": "str", - "description": "str", - "displayName": "str", - "properties": {"str": "str"}, - "status": "str", - "tags": {"str": "str"}, - }, - ) - - # please add some check logic here by yourself - # ... 
- - @AIProjectPreparer() - @recorded_by_proxy_async - async def test_evaluations_create_agent_evaluation(self, aiproject_endpoint): - client = self.create_async_client(endpoint=aiproject_endpoint) - response = await client.evaluations.create_agent_evaluation( - evaluation={ - "appInsightsConnectionString": "str", - "evaluators": {"str": {"id": "str", "dataMapping": {"str": "str"}, "initParams": {"str": {}}}}, - "runId": "str", - "redactionConfiguration": {"redactScoreProperties": bool}, - "samplingConfiguration": {"maxRequestRate": 0.0, "name": "str", "samplingPercent": 0.0}, - "threadId": "str", - }, - ) - - # please add some check logic here by yourself - # ... diff --git a/sdk/ai/azure-ai-projects/generated_tests/test_ai_project_indexes_operations.py b/sdk/ai/azure-ai-projects/generated_tests/test_ai_project_indexes_operations.py deleted file mode 100644 index 82f33d5188bd..000000000000 --- a/sdk/ai/azure-ai-projects/generated_tests/test_ai_project_indexes_operations.py +++ /dev/null @@ -1,87 +0,0 @@ -# coding=utf-8 -# -------------------------------------------------------------------------- -# Copyright (c) Microsoft Corporation. All rights reserved. -# Licensed under the MIT License. See License.txt in the project root for license information. -# Code generated by Microsoft (R) Python Code Generator. -# Changes may cause incorrect behavior and will be lost if the code is regenerated. 
-# -------------------------------------------------------------------------- -import pytest -from devtools_testutils import recorded_by_proxy -from testpreparer import AIProjectClientTestBase, AIProjectPreparer - - -@pytest.mark.skip("you may need to update the auto-generated test case before run it") -class TestAIProjectIndexesOperations(AIProjectClientTestBase): - @AIProjectPreparer() - @recorded_by_proxy - def test_indexes_list_versions(self, aiproject_endpoint): - client = self.create_client(endpoint=aiproject_endpoint) - response = client.indexes.list_versions( - name="str", - ) - result = [r for r in response] - # please add some check logic here by yourself - # ... - - @AIProjectPreparer() - @recorded_by_proxy - def test_indexes_list(self, aiproject_endpoint): - client = self.create_client(endpoint=aiproject_endpoint) - response = client.indexes.list() - result = [r for r in response] - # please add some check logic here by yourself - # ... - - @AIProjectPreparer() - @recorded_by_proxy - def test_indexes_get(self, aiproject_endpoint): - client = self.create_client(endpoint=aiproject_endpoint) - response = client.indexes.get( - name="str", - version="str", - ) - - # please add some check logic here by yourself - # ... - - @AIProjectPreparer() - @recorded_by_proxy - def test_indexes_delete(self, aiproject_endpoint): - client = self.create_client(endpoint=aiproject_endpoint) - response = client.indexes.delete( - name="str", - version="str", - ) - - # please add some check logic here by yourself - # ... 
- - @AIProjectPreparer() - @recorded_by_proxy - def test_indexes_create_or_update(self, aiproject_endpoint): - client = self.create_client(endpoint=aiproject_endpoint) - response = client.indexes.create_or_update( - name="str", - version="str", - body={ - "connectionName": "str", - "indexName": "str", - "name": "str", - "type": "AzureSearch", - "version": "str", - "description": "str", - "fieldMapping": { - "contentFields": ["str"], - "filepathField": "str", - "metadataFields": ["str"], - "titleField": "str", - "urlField": "str", - "vectorFields": ["str"], - }, - "id": "str", - "tags": {"str": "str"}, - }, - ) - - # please add some check logic here by yourself - # ... diff --git a/sdk/ai/azure-ai-projects/generated_tests/test_ai_project_indexes_operations_async.py b/sdk/ai/azure-ai-projects/generated_tests/test_ai_project_indexes_operations_async.py deleted file mode 100644 index 53812b80aa1d..000000000000 --- a/sdk/ai/azure-ai-projects/generated_tests/test_ai_project_indexes_operations_async.py +++ /dev/null @@ -1,88 +0,0 @@ -# coding=utf-8 -# -------------------------------------------------------------------------- -# Copyright (c) Microsoft Corporation. All rights reserved. -# Licensed under the MIT License. See License.txt in the project root for license information. -# Code generated by Microsoft (R) Python Code Generator. -# Changes may cause incorrect behavior and will be lost if the code is regenerated. 
-# -------------------------------------------------------------------------- -import pytest -from devtools_testutils.aio import recorded_by_proxy_async -from testpreparer import AIProjectPreparer -from testpreparer_async import AIProjectClientTestBaseAsync - - -@pytest.mark.skip("you may need to update the auto-generated test case before run it") -class TestAIProjectIndexesOperationsAsync(AIProjectClientTestBaseAsync): - @AIProjectPreparer() - @recorded_by_proxy_async - async def test_indexes_list_versions(self, aiproject_endpoint): - client = self.create_async_client(endpoint=aiproject_endpoint) - response = client.indexes.list_versions( - name="str", - ) - result = [r async for r in response] - # please add some check logic here by yourself - # ... - - @AIProjectPreparer() - @recorded_by_proxy_async - async def test_indexes_list(self, aiproject_endpoint): - client = self.create_async_client(endpoint=aiproject_endpoint) - response = client.indexes.list() - result = [r async for r in response] - # please add some check logic here by yourself - # ... - - @AIProjectPreparer() - @recorded_by_proxy_async - async def test_indexes_get(self, aiproject_endpoint): - client = self.create_async_client(endpoint=aiproject_endpoint) - response = await client.indexes.get( - name="str", - version="str", - ) - - # please add some check logic here by yourself - # ... - - @AIProjectPreparer() - @recorded_by_proxy_async - async def test_indexes_delete(self, aiproject_endpoint): - client = self.create_async_client(endpoint=aiproject_endpoint) - response = await client.indexes.delete( - name="str", - version="str", - ) - - # please add some check logic here by yourself - # ... 
- - @AIProjectPreparer() - @recorded_by_proxy_async - async def test_indexes_create_or_update(self, aiproject_endpoint): - client = self.create_async_client(endpoint=aiproject_endpoint) - response = await client.indexes.create_or_update( - name="str", - version="str", - body={ - "connectionName": "str", - "indexName": "str", - "name": "str", - "type": "AzureSearch", - "version": "str", - "description": "str", - "fieldMapping": { - "contentFields": ["str"], - "filepathField": "str", - "metadataFields": ["str"], - "titleField": "str", - "urlField": "str", - "vectorFields": ["str"], - }, - "id": "str", - "tags": {"str": "str"}, - }, - ) - - # please add some check logic here by yourself - # ... diff --git a/sdk/ai/azure-ai-projects/generated_tests/test_ai_project_red_teams_operations.py b/sdk/ai/azure-ai-projects/generated_tests/test_ai_project_red_teams_operations.py deleted file mode 100644 index 8cb4893cbb4c..000000000000 --- a/sdk/ai/azure-ai-projects/generated_tests/test_ai_project_red_teams_operations.py +++ /dev/null @@ -1,56 +0,0 @@ -# coding=utf-8 -# -------------------------------------------------------------------------- -# Copyright (c) Microsoft Corporation. All rights reserved. -# Licensed under the MIT License. See License.txt in the project root for license information. -# Code generated by Microsoft (R) Python Code Generator. -# Changes may cause incorrect behavior and will be lost if the code is regenerated. 
-# -------------------------------------------------------------------------- -import pytest -from devtools_testutils import recorded_by_proxy -from testpreparer import AIProjectClientTestBase, AIProjectPreparer - - -@pytest.mark.skip("you may need to update the auto-generated test case before run it") -class TestAIProjectRedTeamsOperations(AIProjectClientTestBase): - @AIProjectPreparer() - @recorded_by_proxy - def test_red_teams_get(self, aiproject_endpoint): - client = self.create_client(endpoint=aiproject_endpoint) - response = client.red_teams.get( - name="str", - ) - - # please add some check logic here by yourself - # ... - - @AIProjectPreparer() - @recorded_by_proxy - def test_red_teams_list(self, aiproject_endpoint): - client = self.create_client(endpoint=aiproject_endpoint) - response = client.red_teams.list() - result = [r for r in response] - # please add some check logic here by yourself - # ... - - @AIProjectPreparer() - @recorded_by_proxy - def test_red_teams_create(self, aiproject_endpoint): - client = self.create_client(endpoint=aiproject_endpoint) - response = client.red_teams.create( - red_team={ - "id": "str", - "target": "target_config", - "applicationScenario": "str", - "attackStrategies": ["str"], - "displayName": "str", - "numTurns": 0, - "properties": {"str": "str"}, - "riskCategories": ["str"], - "simulationOnly": bool, - "status": "str", - "tags": {"str": "str"}, - }, - ) - - # please add some check logic here by yourself - # ... diff --git a/sdk/ai/azure-ai-projects/generated_tests/test_ai_project_red_teams_operations_async.py b/sdk/ai/azure-ai-projects/generated_tests/test_ai_project_red_teams_operations_async.py deleted file mode 100644 index dc93a4d14181..000000000000 --- a/sdk/ai/azure-ai-projects/generated_tests/test_ai_project_red_teams_operations_async.py +++ /dev/null @@ -1,57 +0,0 @@ -# coding=utf-8 -# -------------------------------------------------------------------------- -# Copyright (c) Microsoft Corporation. 
All rights reserved. -# Licensed under the MIT License. See License.txt in the project root for license information. -# Code generated by Microsoft (R) Python Code Generator. -# Changes may cause incorrect behavior and will be lost if the code is regenerated. -# -------------------------------------------------------------------------- -import pytest -from devtools_testutils.aio import recorded_by_proxy_async -from testpreparer import AIProjectPreparer -from testpreparer_async import AIProjectClientTestBaseAsync - - -@pytest.mark.skip("you may need to update the auto-generated test case before run it") -class TestAIProjectRedTeamsOperationsAsync(AIProjectClientTestBaseAsync): - @AIProjectPreparer() - @recorded_by_proxy_async - async def test_red_teams_get(self, aiproject_endpoint): - client = self.create_async_client(endpoint=aiproject_endpoint) - response = await client.red_teams.get( - name="str", - ) - - # please add some check logic here by yourself - # ... - - @AIProjectPreparer() - @recorded_by_proxy_async - async def test_red_teams_list(self, aiproject_endpoint): - client = self.create_async_client(endpoint=aiproject_endpoint) - response = client.red_teams.list() - result = [r async for r in response] - # please add some check logic here by yourself - # ... - - @AIProjectPreparer() - @recorded_by_proxy_async - async def test_red_teams_create(self, aiproject_endpoint): - client = self.create_async_client(endpoint=aiproject_endpoint) - response = await client.red_teams.create( - red_team={ - "id": "str", - "target": "target_config", - "applicationScenario": "str", - "attackStrategies": ["str"], - "displayName": "str", - "numTurns": 0, - "properties": {"str": "str"}, - "riskCategories": ["str"], - "simulationOnly": bool, - "status": "str", - "tags": {"str": "str"}, - }, - ) - - # please add some check logic here by yourself - # ... 
diff --git a/sdk/ai/azure-ai-projects/generated_tests/testpreparer.py b/sdk/ai/azure-ai-projects/generated_tests/testpreparer.py deleted file mode 100644 index 69c9aaa6e8d1..000000000000 --- a/sdk/ai/azure-ai-projects/generated_tests/testpreparer.py +++ /dev/null @@ -1,26 +0,0 @@ -# coding=utf-8 -# -------------------------------------------------------------------------- -# Copyright (c) Microsoft Corporation. All rights reserved. -# Licensed under the MIT License. See License.txt in the project root for license information. -# Code generated by Microsoft (R) Python Code Generator. -# Changes may cause incorrect behavior and will be lost if the code is regenerated. -# -------------------------------------------------------------------------- -from azure.ai.projects import AIProjectClient -from devtools_testutils import AzureRecordedTestCase, PowerShellPreparer -import functools - - -class AIProjectClientTestBase(AzureRecordedTestCase): - - def create_client(self, endpoint): - credential = self.get_credential(AIProjectClient) - return self.create_client_from_credential( - AIProjectClient, - credential=credential, - endpoint=endpoint, - ) - - -AIProjectPreparer = functools.partial( - PowerShellPreparer, "aiproject", aiproject_endpoint="https://fake_aiproject_endpoint.com" -) diff --git a/sdk/ai/azure-ai-projects/generated_tests/testpreparer_async.py b/sdk/ai/azure-ai-projects/generated_tests/testpreparer_async.py deleted file mode 100644 index 56353f9fdd65..000000000000 --- a/sdk/ai/azure-ai-projects/generated_tests/testpreparer_async.py +++ /dev/null @@ -1,20 +0,0 @@ -# coding=utf-8 -# -------------------------------------------------------------------------- -# Copyright (c) Microsoft Corporation. All rights reserved. -# Licensed under the MIT License. See License.txt in the project root for license information. -# Code generated by Microsoft (R) Python Code Generator. -# Changes may cause incorrect behavior and will be lost if the code is regenerated. 
-# -------------------------------------------------------------------------- -from azure.ai.projects.aio import AIProjectClient -from devtools_testutils import AzureRecordedTestCase - - -class AIProjectClientTestBaseAsync(AzureRecordedTestCase): - - def create_async_client(self, endpoint): - credential = self.get_credential(AIProjectClient, is_async=True) - return self.create_client_from_credential( - AIProjectClient, - credential=credential, - endpoint=endpoint, - ) diff --git a/sdk/ai/azure-ai-projects/samples/agents/README.md b/sdk/ai/azure-ai-projects/samples/agents/README.md new file mode 100644 index 000000000000..b03e553bb835 --- /dev/null +++ b/sdk/ai/azure-ai-projects/samples/agents/README.md @@ -0,0 +1,8 @@ +# Agents samples + +This directory intentionally contains only one sample. + +The full set of Agent samples can be found in the [samples folder](https://github.com/Azure/azure-sdk-for-python/tree/main/sdk/ai/azure-ai-agents/samples) of the `azure-ai-agents` package. + +See also `azure-ai-agents` package [README.md](https://github.com/Azure/azure-sdk-for-python/tree/main/sdk/ai/azure-ai-agents). 
+ diff --git a/sdk/ai/azure-ai-projects/samples/indexes/sample_indexes.py b/sdk/ai/azure-ai-projects/samples/indexes/sample_indexes.py index 50196b0a3095..c8a77e18d301 100644 --- a/sdk/ai/azure-ai-projects/samples/indexes/sample_indexes.py +++ b/sdk/ai/azure-ai-projects/samples/indexes/sample_indexes.py @@ -47,7 +47,7 @@ index = project_client.indexes.create_or_update( name=index_name, version=index_version, - body=AzureAISearchIndex(connection_name=ai_search_connection_name, index_name=ai_search_index_name), + index=AzureAISearchIndex(connection_name=ai_search_connection_name, index_name=ai_search_index_name), ) print(index) diff --git a/sdk/ai/azure-ai-projects/samples/indexes/sample_indexes_async.py b/sdk/ai/azure-ai-projects/samples/indexes/sample_indexes_async.py index aa1bc8f6f46f..36d2d1a3aca4 100644 --- a/sdk/ai/azure-ai-projects/samples/indexes/sample_indexes_async.py +++ b/sdk/ai/azure-ai-projects/samples/indexes/sample_indexes_async.py @@ -50,7 +50,7 @@ async def main() -> None: index = await project_client.indexes.create_or_update( name=index_name, version=index_version, - body=AzureAISearchIndex(connection_name=ai_search_connection_name, index_name=ai_search_index_name), + index=AzureAISearchIndex(connection_name=ai_search_connection_name, index_name=ai_search_index_name), ) print(index) diff --git a/sdk/ai/azure-ai-projects/samples/inference/async_samples/sample_text_embeddings_with_azure_ai_inference_client_async.py b/sdk/ai/azure-ai-projects/samples/inference/async_samples/sample_text_embeddings_with_azure_ai_inference_client_async.py deleted file mode 100644 index 2b550ba02687..000000000000 --- a/sdk/ai/azure-ai-projects/samples/inference/async_samples/sample_text_embeddings_with_azure_ai_inference_client_async.py +++ /dev/null @@ -1,56 +0,0 @@ -# ------------------------------------ -# Copyright (c) Microsoft Corporation. -# Licensed under the MIT License. 
-# ------------------------------------ - -""" -DESCRIPTION: - Given an AIProjectClient, this sample demonstrates how to get an authenticated - async EmbeddingsClient from the azure.ai.inference package, and perform one text - embeddings operation. For more information on the azure.ai.inference package see - https://pypi.org/project/azure-ai-inference/. - -USAGE: - python sample_text_embeddings_with_azure_ai_inference_client_async.py - - Before running the sample: - - pip install azure-ai-projects azure-ai-inference aiohttp azure-identity - - Set these environment variables with your own values: - 1) PROJECT_ENDPOINT - The Azure AI Project endpoint, as found in the overview page of your - Azure AI Foundry project. - 2) MODEL_DEPLOYMENT_NAME - The AI model deployment name, as found in your AI Foundry project. -""" - -import os -import asyncio -from azure.identity.aio import DefaultAzureCredential -from azure.ai.projects.aio import AIProjectClient - - -async def main(): - - endpoint = os.environ["PROJECT_ENDPOINT"] - model_deployment_name = os.environ["MODEL_DEPLOYMENT_NAME"] - - async with DefaultAzureCredential() as credential: - - async with AIProjectClient(endpoint=endpoint, credential=credential) as project_client: - - async with project_client.inference.get_embeddings_client() as client: - - response = await client.embed( - model=model_deployment_name, input=["first phrase", "second phrase", "third phrase"] - ) - - for item in response.data: - length = len(item.embedding) - print( - f"data[{item.index}]: length={length}, [{item.embedding[0]}, {item.embedding[1]}, " - f"..., {item.embedding[length-2]}, {item.embedding[length-1]}]" - ) - - -if __name__ == "__main__": - asyncio.run(main()) diff --git a/sdk/ai/azure-ai-projects/samples/inference/async_samples/sample1.png b/sdk/ai/azure-ai-projects/samples/inference/azure-ai-inference/async_samples/sample1.png similarity index 100% rename from sdk/ai/azure-ai-projects/samples/inference/async_samples/sample1.png 
rename to sdk/ai/azure-ai-projects/samples/inference/azure-ai-inference/async_samples/sample1.png diff --git a/sdk/ai/azure-ai-projects/samples/inference/async_samples/sample_chat_completions_with_azure_ai_inference_client_async.py b/sdk/ai/azure-ai-projects/samples/inference/azure-ai-inference/async_samples/sample_chat_completions_with_azure_ai_inference_client_async.py similarity index 54% rename from sdk/ai/azure-ai-projects/samples/inference/async_samples/sample_chat_completions_with_azure_ai_inference_client_async.py rename to sdk/ai/azure-ai-projects/samples/inference/azure-ai-inference/async_samples/sample_chat_completions_with_azure_ai_inference_client_async.py index 55d88d8904c5..d6771ef17c2f 100644 --- a/sdk/ai/azure-ai-projects/samples/inference/async_samples/sample_chat_completions_with_azure_ai_inference_client_async.py +++ b/sdk/ai/azure-ai-projects/samples/inference/azure-ai-inference/async_samples/sample_chat_completions_with_azure_ai_inference_client_async.py @@ -1,3 +1,4 @@ +# pylint: disable=line-too-long,useless-suppression # ------------------------------------ # Copyright (c) Microsoft Corporation. # Licensed under the MIT License. @@ -5,7 +6,7 @@ """ DESCRIPTION: - Given an AIProjectClient, this sample demonstrates how to get an authenticated + Given an AI Foundry Project endpoint, this sample demonstrates how to get an authenticated async ChatCompletionsClient from the azure.ai.inference package, and perform one chat completions operation. For more information on the azure.ai.inference package see https://pypi.org/project/azure-ai-inference/. 
@@ -25,8 +26,9 @@ import os import asyncio +from urllib.parse import urlparse from azure.identity.aio import DefaultAzureCredential -from azure.ai.projects.aio import AIProjectClient +from azure.ai.inference.aio import ChatCompletionsClient from azure.ai.inference.models import UserMessage @@ -35,16 +37,23 @@ async def main(): endpoint = os.environ["PROJECT_ENDPOINT"] model_deployment_name = os.environ["MODEL_DEPLOYMENT_NAME"] - async with DefaultAzureCredential() as credential: + # Project endpoint has the form: https://<your-ai-services-account-name>.services.ai.azure.com/api/projects/<your-project-name> + # Inference endpoint has the form: https://<your-ai-services-account-name>.services.ai.azure.com/models + # Strip the "/api/projects/" part and replace with "/models": + inference_endpoint = f"https://{urlparse(endpoint).netloc}/models" - async with AIProjectClient(endpoint=endpoint, credential=credential) as project_client: + async with DefaultAzureCredential() as credential: - async with project_client.inference.get_chat_completions_client() as client: + async with ChatCompletionsClient( + endpoint=inference_endpoint, + credential=credential, + credential_scopes=["https://ai.azure.com/.default"], + ) as client: - response = await client.complete( - model=model_deployment_name, messages=[UserMessage(content="How many feet are in a mile?")] - ) - print(response.choices[0].message.content) + response = await client.complete( + model=model_deployment_name, messages=[UserMessage(content="How many feet are in a mile?")] + ) + print(response.choices[0].message.content) if __name__ == "__main__": diff --git a/sdk/ai/azure-ai-projects/samples/inference/async_samples/sample_image_embeddings_with_azure_ai_inference_client_async.py b/sdk/ai/azure-ai-projects/samples/inference/azure-ai-inference/async_samples/sample_image_embeddings_with_azure_ai_inference_client_async.py similarity index 54% rename from sdk/ai/azure-ai-projects/samples/inference/async_samples/sample_image_embeddings_with_azure_ai_inference_client_async.py rename to
sdk/ai/azure-ai-projects/samples/inference/azure-ai-inference/async_samples/sample_image_embeddings_with_azure_ai_inference_client_async.py index b13d72f70a8f..da0e9cd56830 100644 --- a/sdk/ai/azure-ai-projects/samples/inference/async_samples/sample_image_embeddings_with_azure_ai_inference_client_async.py +++ b/sdk/ai/azure-ai-projects/samples/inference/azure-ai-inference/async_samples/sample_image_embeddings_with_azure_ai_inference_client_async.py @@ -6,7 +6,7 @@ """ DESCRIPTION: - Given an AIProjectClient, this sample demonstrates how to get an authenticated + Given an AI Foundry Project endpoint, this sample demonstrates how to get an authenticated async ImageEmbeddingsClient from the azure.ai.inference package, and perform one image embeddings operation. For more information on the azure.ai.inference package see https://pypi.org/project/azure-ai-inference/. @@ -26,8 +26,9 @@ import os import asyncio +from urllib.parse import urlparse from azure.identity.aio import DefaultAzureCredential -from azure.ai.projects.aio import AIProjectClient +from azure.ai.inference.aio import ImageEmbeddingsClient from azure.ai.inference.models import ImageEmbeddingInput @@ -36,28 +37,35 @@ async def main(): endpoint = os.environ["PROJECT_ENDPOINT"] model_deployment_name = os.environ["MODEL_DEPLOYMENT_NAME"] + # Project endpoint has the form: https://<your-ai-services-account-name>.services.ai.azure.com/api/projects/<your-project-name> + # Inference endpoint has the form: https://<your-ai-services-account-name>.services.ai.azure.com/models + # Strip the "/api/projects/" part and replace with "/models": + inference_endpoint = f"https://{urlparse(endpoint).netloc}/models" + # Construct the path to the image file used in this sample data_folder = os.environ.get("DATA_FOLDER", os.path.dirname(os.path.abspath(__file__))) image_file = os.path.join(data_folder, "sample1.png") async with DefaultAzureCredential() as credential: - async with AIProjectClient(endpoint=endpoint, credential=credential) as project_client: - - async with
project_client.inference.get_image_embeddings_client() as client: - - response = await client.embed( - model=model_deployment_name, - input=[ImageEmbeddingInput.load(image_file=image_file, image_format="png")], + async with ImageEmbeddingsClient( + endpoint=inference_endpoint, + credential=credential, + credential_scopes=["https://ai.azure.com/.default"], + ) as client: + + response = await client.embed( + model=model_deployment_name, + input=[ImageEmbeddingInput.load(image_file=image_file, image_format="png")], + ) + + for item in response.data: + length = len(item.embedding) + print( + f"data[{item.index}]: length={length}, [{item.embedding[0]}, {item.embedding[1]}, " + f"..., {item.embedding[length-2]}, {item.embedding[length-1]}]" ) - for item in response.data: - length = len(item.embedding) - print( - f"data[{item.index}]: length={length}, [{item.embedding[0]}, {item.embedding[1]}, " - f"..., {item.embedding[length-2]}, {item.embedding[length-1]}]" - ) - if __name__ == "__main__": asyncio.run(main()) diff --git a/sdk/ai/azure-ai-projects/samples/inference/azure-ai-inference/async_samples/sample_text_embeddings_with_azure_ai_inference_client_async.py b/sdk/ai/azure-ai-projects/samples/inference/azure-ai-inference/async_samples/sample_text_embeddings_with_azure_ai_inference_client_async.py new file mode 100644 index 000000000000..56bfa7bc52d1 --- /dev/null +++ b/sdk/ai/azure-ai-projects/samples/inference/azure-ai-inference/async_samples/sample_text_embeddings_with_azure_ai_inference_client_async.py @@ -0,0 +1,65 @@ +# pylint: disable=line-too-long,useless-suppression +# ------------------------------------ +# Copyright (c) Microsoft Corporation. +# Licensed under the MIT License. +# ------------------------------------ + +""" +DESCRIPTION: + Given an AI Foundry Project endpoint, this sample demonstrates how to get an authenticated + async EmbeddingsClient from the azure.ai.inference package, and perform one text + embeddings operation. 
For more information on the azure.ai.inference package see + https://pypi.org/project/azure-ai-inference/. + +USAGE: + python sample_text_embeddings_with_azure_ai_inference_client_async.py + + Before running the sample: + + pip install azure-ai-projects azure-ai-inference aiohttp azure-identity + + Set these environment variables with your own values: + 1) PROJECT_ENDPOINT - The Azure AI Project endpoint, as found in the overview page of your + Azure AI Foundry project. + 2) MODEL_DEPLOYMENT_NAME - The AI model deployment name, as found in your AI Foundry project. +""" + +import os +import asyncio +from urllib.parse import urlparse +from azure.identity.aio import DefaultAzureCredential +from azure.ai.inference.aio import EmbeddingsClient + + +async def main(): + + endpoint = os.environ["PROJECT_ENDPOINT"] + model_deployment_name = os.environ["MODEL_DEPLOYMENT_NAME"] + + # Project endpoint has the form: https://<your-ai-services-account-name>.services.ai.azure.com/api/projects/<your-project-name> + # Inference endpoint has the form: https://<your-ai-services-account-name>.services.ai.azure.com/models + # Strip the "/api/projects/" part and replace with "/models": + inference_endpoint = f"https://{urlparse(endpoint).netloc}/models" + + async with DefaultAzureCredential() as credential: + + async with EmbeddingsClient( + endpoint=inference_endpoint, + credential=credential, + credential_scopes=["https://ai.azure.com/.default"], + ) as client: + + response = await client.embed( + model=model_deployment_name, input=["first phrase", "second phrase", "third phrase"] + ) + + for item in response.data: + length = len(item.embedding) + print( + f"data[{item.index}]: length={length}, [{item.embedding[0]}, {item.embedding[1]}, " + f"..., {item.embedding[length-2]}, {item.embedding[length-1]}]" + ) + + +if __name__ == "__main__": + asyncio.run(main()) diff --git a/sdk/ai/azure-ai-projects/azure/ai/projects/_patch_telemetry.py b/sdk/ai/azure-ai-projects/samples/inference/azure-ai-inference/azure_ai_inference_telemetry_helper.py similarity index 82% rename
from sdk/ai/azure-ai-projects/azure/ai/projects/_patch_telemetry.py rename to sdk/ai/azure-ai-projects/samples/inference/azure-ai-inference/azure_ai_inference_telemetry_helper.py index 4d67af1a22f3..c42de0e5011b 100644 --- a/sdk/ai/azure-ai-projects/azure/ai/projects/_patch_telemetry.py +++ b/sdk/ai/azure-ai-projects/samples/inference/azure-ai-inference/azure_ai_inference_telemetry_helper.py @@ -4,10 +4,7 @@ # Licensed under the MIT License. # ------------------------------------ # pylint: disable=line-too-long,R,no-member -"""Customize generated code here. -Follow our quickstart for examples: https://aka.ms/azsdk/python/dpcodegen/python/customize -""" import io import logging import sys @@ -16,18 +13,10 @@ logger = logging.getLogger(__name__) -# TODO: what about `set AZURE_TRACING_GEN_AI_CONTENT_RECORDING_ENABLED=true`? -def enable_telemetry( +def azure_ai_inference_telemetry_helper( *, destination: Union[TextIO, str, None] = None, **kwargs # pylint: disable=unused-argument ) -> None: - """Enables telemetry collection with OpenTelemetry for Azure AI clients and popular GenAI libraries. - - Following instrumentations are enabled (when corresponding packages are installed): - - - Azure AI Agents (`azure-ai-agents`) - - Azure AI Inference (`azure-ai-inference`) - - OpenAI (`opentelemetry-instrumentation-openai-v2`) - - Langchain (`opentelemetry-instrumentation-langchain`) + """Enables telemetry collection with OpenTelemetry for Azure AI Inference client (azure-ai-inference). The recording of prompt and completion messages is disabled by default. To enable it, set the `AZURE_TRACING_GEN_AI_CONTENT_RECORDING_ENABLED` environment variable to `true`. 
@@ -71,38 +60,8 @@ def enable_telemetry( "Could not call `AIInferenceInstrumentor().instrument()` since `azure-ai-inference` is not installed" ) - try: - from azure.ai.agents.tracing import AIAgentsInstrumentor # pylint: disable=import-error,no-name-in-module - - agents_instrumentor = AIAgentsInstrumentor() - if not agents_instrumentor.is_instrumented(): - agents_instrumentor.instrument() - except Exception as exc: # pylint: disable=broad-exception-caught - logger.warning("Could not call `AIAgentsInstrumentor().instrument()`", exc_info=exc) - - try: - from opentelemetry.instrumentation.openai_v2 import OpenAIInstrumentor # type: ignore - - OpenAIInstrumentor().instrument() - except ModuleNotFoundError: - logger.warning( - "Could not call `OpenAIInstrumentor().instrument()` since " - + "`opentelemetry-instrumentation-openai-v2` is not installed" - ) - - try: - from opentelemetry.instrumentation.langchain import LangchainInstrumentor # type: ignore - - print("Calling LangchainInstrumentor().instrument()") - LangchainInstrumentor().instrument() - except ModuleNotFoundError: - logger.warning( - "Could not call LangchainInstrumentor().instrument()` since " - + "`opentelemetry-instrumentation-langchain` is not installed" - ) - -# Internal helper functions to enable OpenTelemetry, used by both sync and async clients +# Helper functions to enable OpenTelemetry, used by both sync and async clients def _get_trace_exporter(destination: Union[TextIO, str, None]) -> Any: if isinstance(destination, str): # `destination` is the OTLP endpoint diff --git a/sdk/ai/azure-ai-projects/samples/inference/sample1.png b/sdk/ai/azure-ai-projects/samples/inference/azure-ai-inference/sample1.png similarity index 100% rename from sdk/ai/azure-ai-projects/samples/inference/sample1.png rename to sdk/ai/azure-ai-projects/samples/inference/azure-ai-inference/sample1.png diff --git a/sdk/ai/azure-ai-projects/samples/inference/sample1.prompty 
b/sdk/ai/azure-ai-projects/samples/inference/azure-ai-inference/sample1.prompty similarity index 100% rename from sdk/ai/azure-ai-projects/samples/inference/sample1.prompty rename to sdk/ai/azure-ai-projects/samples/inference/azure-ai-inference/sample1.prompty diff --git a/sdk/ai/azure-ai-projects/samples/inference/sample_chat_completions_with_azure_ai_inference_client.py b/sdk/ai/azure-ai-projects/samples/inference/azure-ai-inference/sample_chat_completions_with_azure_ai_inference_client.py similarity index 53% rename from sdk/ai/azure-ai-projects/samples/inference/sample_chat_completions_with_azure_ai_inference_client.py rename to sdk/ai/azure-ai-projects/samples/inference/azure-ai-inference/sample_chat_completions_with_azure_ai_inference_client.py index 14e6d834a6c1..1cf3360d5f92 100644 --- a/sdk/ai/azure-ai-projects/samples/inference/sample_chat_completions_with_azure_ai_inference_client.py +++ b/sdk/ai/azure-ai-projects/samples/inference/azure-ai-inference/sample_chat_completions_with_azure_ai_inference_client.py @@ -1,3 +1,4 @@ +# pylint: disable=line-too-long,useless-suppression # ------------------------------------ # Copyright (c) Microsoft Corporation. # Licensed under the MIT License. @@ -5,7 +6,7 @@ """ DESCRIPTION: - Given an AIProjectClient, this sample demonstrates how to get an authenticated + Given an AI Foundry Project endpoint, this sample demonstrates how to get an authenticated ChatCompletionsClient from the azure.ai.inference package and perform one chat completion operation. For more information on the azure.ai.inference package see https://pypi.org/project/azure-ai-inference/. 
@@ -24,23 +25,29 @@ """ import os +from urllib.parse import urlparse from azure.identity import DefaultAzureCredential -from azure.ai.projects import AIProjectClient +from azure.ai.inference import ChatCompletionsClient from azure.ai.inference.models import UserMessage endpoint = os.environ["PROJECT_ENDPOINT"] model_deployment_name = os.environ["MODEL_DEPLOYMENT_NAME"] -with DefaultAzureCredential(exclude_interactive_browser_credential=False) as credential: +# Project endpoint has the form: https://<your-ai-services-account-name>.services.ai.azure.com/api/projects/<your-project-name> +# Inference endpoint has the form: https://<your-ai-services-account-name>.services.ai.azure.com/models +# Strip the "/api/projects/" part and replace with "/models": +inference_endpoint = f"https://{urlparse(endpoint).netloc}/models" - with AIProjectClient(endpoint=endpoint, credential=credential) as project_client: +with DefaultAzureCredential(exclude_interactive_browser_credential=False) as credential: - # [START inference_sample] - with project_client.inference.get_chat_completions_client() as client: + with ChatCompletionsClient( + endpoint=inference_endpoint, + credential=credential, + credential_scopes=["https://ai.azure.com/.default"], + ) as client: - response = client.complete( - model=model_deployment_name, messages=[UserMessage(content="How many feet are in a mile?")] - ) + response = client.complete( + model=model_deployment_name, messages=[UserMessage(content="How many feet are in a mile?")] + ) - print(response.choices[0].message.content) - # [END inference_sample] + print(response.choices[0].message.content) diff --git a/sdk/ai/azure-ai-projects/samples/inference/sample_chat_completions_with_azure_ai_inference_client_and_azure_monitor_tracing.py b/sdk/ai/azure-ai-projects/samples/inference/azure-ai-inference/sample_chat_completions_with_azure_ai_inference_client_and_azure_monitor_tracing.py similarity index 50% rename from sdk/ai/azure-ai-projects/samples/inference/sample_chat_completions_with_azure_ai_inference_client_and_azure_monitor_tracing.py
rename to sdk/ai/azure-ai-projects/samples/inference/azure-ai-inference/sample_chat_completions_with_azure_ai_inference_client_and_azure_monitor_tracing.py index 626ab0a2c9f3..530e78c9541f 100644 --- a/sdk/ai/azure-ai-projects/samples/inference/sample_chat_completions_with_azure_ai_inference_client_and_azure_monitor_tracing.py +++ b/sdk/ai/azure-ai-projects/samples/inference/azure-ai-inference/sample_chat_completions_with_azure_ai_inference_client_and_azure_monitor_tracing.py @@ -1,3 +1,4 @@ +# pylint: disable=line-too-long,useless-suppression # ------------------------------------ # Copyright (c) Microsoft Corporation. # Licensed under the MIT License. @@ -5,14 +6,15 @@ """ DESCRIPTION: - Given an AIProjectClient, this sample demonstrates how to get an authenticated + Given an AIProjectClient, this sample demonstrates how to get an authenticated ChatCompletionsClient from the azure.ai.inference package and perform one chat completion - operation. The client is already instrumented to upload traces to Azure Monitor. View the results + operation. + The client is instrumented to upload OpenTelemetry traces to Azure Monitor. View the uploaded traces in the "Tracing" tab in your Azure AI Foundry project page. For more information on the azure.ai.inference package see https://pypi.org/project/azure-ai-inference/. 
USAGE: - sample_chat_completions_with_azure_ai_inference_client_and_azure_monitor_tracing.py + python sample_chat_completions_with_azure_ai_inference_client_and_azure_monitor_tracing.py Before running the sample: @@ -27,32 +29,39 @@ """ import os +from urllib.parse import urlparse from azure.identity import DefaultAzureCredential -from azure.ai.projects import AIProjectClient, enable_telemetry +from azure.ai.projects import AIProjectClient +from azure.ai.inference import ChatCompletionsClient from azure.ai.inference.models import UserMessage from azure.monitor.opentelemetry import configure_azure_monitor +from opentelemetry import trace -# Enable additional instrumentations for openai and langchain -# which are not included by Azure Monitor out of the box -enable_telemetry() +file_name = os.path.basename(__file__) +tracer = trace.get_tracer(__name__) endpoint = os.environ["PROJECT_ENDPOINT"] model_deployment_name = os.environ["MODEL_DEPLOYMENT_NAME"] -with DefaultAzureCredential(exclude_interactive_browser_credential=False) as credential: +with tracer.start_as_current_span(file_name): - with AIProjectClient(endpoint=endpoint, credential=credential) as project_client: + with DefaultAzureCredential(exclude_interactive_browser_credential=False) as credential: - # Enable Azure Monitor tracing - application_insights_connection_string = project_client.telemetry.get_connection_string() - if not application_insights_connection_string: - print("Application Insights was not enabled for this project.") - print("Enable it via the 'Tracing' tab in your AI Foundry project page.") - exit() + with AIProjectClient(endpoint=endpoint, credential=credential) as project_client: - configure_azure_monitor(connection_string=application_insights_connection_string) + application_insights_connection_string = project_client.telemetry.get_connection_string() + configure_azure_monitor(connection_string=application_insights_connection_string) - with 
project_client.inference.get_chat_completions_client() as client: + # Project endpoint has the form: https://<your-ai-services-account-name>.services.ai.azure.com/api/projects/<your-project-name> + # Inference endpoint has the form: https://<your-ai-services-account-name>.services.ai.azure.com/models + # Strip the "/api/projects/" part and replace with "/models": + inference_endpoint = f"https://{urlparse(endpoint).netloc}/models" + + with ChatCompletionsClient( + endpoint=inference_endpoint, + credential=credential, + credential_scopes=["https://ai.azure.com/.default"], + ) as client: response = client.complete( model=model_deployment_name, messages=[UserMessage(content="How many feet are in a mile?")] diff --git a/sdk/ai/azure-ai-projects/samples/inference/azure-ai-inference/sample_chat_completions_with_azure_ai_inference_client_and_console_tracing.py b/sdk/ai/azure-ai-projects/samples/inference/azure-ai-inference/sample_chat_completions_with_azure_ai_inference_client_and_console_tracing.py new file mode 100644 index 000000000000..a460d2480ab6 --- /dev/null +++ b/sdk/ai/azure-ai-projects/samples/inference/azure-ai-inference/sample_chat_completions_with_azure_ai_inference_client_and_console_tracing.py @@ -0,0 +1,82 @@ +# pylint: disable=line-too-long,useless-suppression +# ------------------------------------ +# Copyright (c) Microsoft Corporation. +# Licensed under the MIT License. +# ------------------------------------ + +""" +DESCRIPTION: + Given an AI Foundry Project endpoint, this sample demonstrates how to get an authenticated + ChatCompletionsClient from the azure.ai.inference package and perform one chat completion + operation. It also shows how to turn on local console tracing. + For more information on the azure.ai.inference package see https://pypi.org/project/azure-ai-inference/.
+ +USAGE: + python sample_chat_completions_with_azure_ai_inference_client_and_console_tracing.py + + Before running the sample: + + pip install azure-ai-inference azure-identity opentelemetry.sdk + + Set these environment variables with your own values: + 1) PROJECT_ENDPOINT - The Azure AI Project endpoint, as found in the overview page of your + Azure AI Foundry project. + 2) MODEL_DEPLOYMENT_NAME - The AI model deployment name, as found in your AI Foundry project. + +ALTERNATIVE USAGE: + If you want to export telemetry to OTLP endpoint (such as Aspire dashboard + https://learn.microsoft.com/dotnet/aspire/fundamentals/dashboard/standalone?tabs=bash) + instead of to the console, also install: + + pip install opentelemetry-exporter-otlp-proto-grpc + + And also define: + 3) OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT - Optional. Set to `true` to trace the content of chat + messages, which may contain personal data. False by default. +""" + +import os +from azure.core.settings import settings +from urllib.parse import urlparse +from azure.identity import DefaultAzureCredential +from azure.ai.inference import ChatCompletionsClient +from azure.ai.inference.tracing import AIInferenceInstrumentor +from azure.ai.inference.models import UserMessage +from opentelemetry import trace +from opentelemetry.sdk.trace import TracerProvider +from opentelemetry.sdk.trace.export import SimpleSpanProcessor, ConsoleSpanExporter + +settings.tracing_implementation = "opentelemetry" + +span_exporter = ConsoleSpanExporter() +tracer_provider = TracerProvider() +tracer_provider.add_span_processor(SimpleSpanProcessor(span_exporter)) +trace.set_tracer_provider(tracer_provider) +tracer = trace.get_tracer(__name__) +file_name = os.path.basename(__file__) + +AIInferenceInstrumentor().instrument() + +endpoint = os.environ["PROJECT_ENDPOINT"] +model_deployment_name = os.environ["MODEL_DEPLOYMENT_NAME"] + +with tracer.start_as_current_span(file_name): + + with 
DefaultAzureCredential(exclude_interactive_browser_credential=False) as credential: + + # Project endpoint has the form: https://<your-ai-services-account-name>.services.ai.azure.com/api/projects/<your-project-name> + # Inference endpoint has the form: https://<your-ai-services-account-name>.services.ai.azure.com/models + # Strip the "/api/projects/" part and replace with "/models": + inference_endpoint = f"https://{urlparse(endpoint).netloc}/models" + + with ChatCompletionsClient( + endpoint=inference_endpoint, + credential=credential, + credential_scopes=["https://ai.azure.com/.default"], + ) as client: + + response = client.complete( + model=model_deployment_name, messages=[UserMessage(content="How many feet are in a mile?")] + ) + + print(response.choices[0].message.content) diff --git a/sdk/ai/azure-ai-projects/samples/inference/azure-ai-inference/sample_chat_completions_with_azure_ai_inference_client_and_prompt_string.py b/sdk/ai/azure-ai-projects/samples/inference/azure-ai-inference/sample_chat_completions_with_azure_ai_inference_client_and_prompt_string.py new file mode 100644 index 000000000000..9b4bc277d796 --- /dev/null +++ b/sdk/ai/azure-ai-projects/samples/inference/azure-ai-inference/sample_chat_completions_with_azure_ai_inference_client_and_prompt_string.py @@ -0,0 +1,89 @@ +# pylint: disable=line-too-long,useless-suppression +# ------------------------------------ +# Copyright (c) Microsoft Corporation. +# Licensed under the MIT License. +# ------------------------------------ + +""" +DESCRIPTION: + Given an AI Foundry Project endpoint, this sample demonstrates how to + * Get an authenticated ChatCompletionsClient from the azure.ai.inference package + * Define a Mustache template, and render the template with provided parameters to create a list of chat messages. + * Perform one chat completion operation. + Package azure.ai.inference required. For more information see https://pypi.org/project/azure-ai-inference/. + Package prompty required. For more information see https://pypi.org/project/prompty/.
+ +USAGE: + python sample_chat_completions_with_azure_ai_inference_client_and_prompt_string.py + + Before running the sample: + + pip install azure-ai-projects azure-ai-inference azure-identity prompty + + Set these environment variables with your own values: + 1) PROJECT_ENDPOINT - The Azure AI Project endpoint, as found in the overview page of your + Azure AI Foundry project. + 2) MODEL_DEPLOYMENT_NAME - The AI model deployment name, as found in your AI Foundry project. +""" + +import os +from urllib.parse import urlparse +from azure.identity import DefaultAzureCredential +from azure.ai.projects import PromptTemplate +from azure.ai.inference import ChatCompletionsClient + +endpoint = os.environ["PROJECT_ENDPOINT"] +model_deployment_name = os.environ["MODEL_DEPLOYMENT_NAME"] + +# Project endpoint has the form: https://<your-ai-services-account-name>.services.ai.azure.com/api/projects/<your-project-name> +# Inference endpoint has the form: https://<your-ai-services-account-name>.services.ai.azure.com/models +# Strip the "/api/projects/" part and replace with "/models": +inference_endpoint = f"https://{urlparse(endpoint).netloc}/models" + +with DefaultAzureCredential(exclude_interactive_browser_credential=False) as credential: + + prompt_template_str = """ + system: + You are an AI assistant in a hotel. You help guests with their requests and provide information about the hotel and its services. + + # context + {{#rules}} + {{rule}} + {{/rules}} + + {{#chat_history}} + {{role}}: + {{content}} + {{/chat_history}} + + user: + {{input}} + """ + + prompt_template = PromptTemplate.from_string(api="chat", prompt_template=prompt_template_str) + + input = "When I arrived, can I still have breakfast?" + + rules = [ + {"rule": "The check-in time is 3pm"}, + {"rule": "The check-out time is 11am"}, + {"rule": "Breakfast is served from 7am to 10am"}, + ] + + chat_history = [ + {"role": "user", "content": "I'll arrive at 2pm.
What's the check-in and check-out time?"}, + {"role": "system", "content": "The check-in time is 3 PM, and the check-out time is 11 AM."}, + ] + + messages = prompt_template.create_messages(input=input, rules=rules, chat_history=chat_history) + print(messages) + + with ChatCompletionsClient( + endpoint=inference_endpoint, + credential=credential, + credential_scopes=["https://ai.azure.com/.default"], + ) as client: + + response = client.complete(model=model_deployment_name, messages=messages) + + print(response.choices[0].message.content) diff --git a/sdk/ai/azure-ai-projects/samples/inference/sample_chat_completions_with_azure_ai_inference_client_and_prompty_file.py b/sdk/ai/azure-ai-projects/samples/inference/azure-ai-inference/sample_chat_completions_with_azure_ai_inference_client_and_prompty_file.py similarity index 50% rename from sdk/ai/azure-ai-projects/samples/inference/sample_chat_completions_with_azure_ai_inference_client_and_prompty_file.py rename to sdk/ai/azure-ai-projects/samples/inference/azure-ai-inference/sample_chat_completions_with_azure_ai_inference_client_and_prompty_file.py index 03dc13c53db2..5a18eec03e2d 100644 --- a/sdk/ai/azure-ai-projects/samples/inference/sample_chat_completions_with_azure_ai_inference_client_and_prompty_file.py +++ b/sdk/ai/azure-ai-projects/samples/inference/azure-ai-inference/sample_chat_completions_with_azure_ai_inference_client_and_prompty_file.py @@ -1,3 +1,4 @@ +# pylint: disable=line-too-long,useless-suppression # ------------------------------------ # Copyright (c) Microsoft Corporation. # Licensed under the MIT License. @@ -5,7 +6,7 @@ """ DESCRIPTION: - Given an AIProjectClient, this sample demonstrates how to + Given an AI Foundry Project endpoint, this sample demonstrates how to * Get an authenticated ChatCompletionsClient from the azure.ai.inference package * Load a Prompty file and render a template with provided parameters to create a list of chat messages. * Perform one chat completion operation. 
@@ -27,36 +28,49 @@
 """
 
 import os
+from urllib.parse import urlparse
 from azure.identity import DefaultAzureCredential
-from azure.ai.projects import AIProjectClient, PromptTemplate
+from azure.ai.projects import PromptTemplate
+from azure.ai.inference import ChatCompletionsClient
 
 endpoint = os.environ["PROJECT_ENDPOINT"]
 model_deployment_name = os.environ["MODEL_DEPLOYMENT_NAME"]
 
+# Project endpoint has the form: https://<your-ai-services-account-name>.services.ai.azure.com/api/projects/<your-project-name>
+# Inference endpoint has the form: https://<your-ai-services-account-name>.services.ai.azure.com/models
+# Strip the "/api/projects/<your-project-name>" part and replace with "/models":
+inference_endpoint = f"https://{urlparse(endpoint).netloc}/models"
+
 # Construct the path to the Prompty file used in this sample
 data_folder = os.environ.get("DATA_FOLDER", os.path.dirname(os.path.abspath(__file__)))
 prompty_file = os.path.join(data_folder, "sample1.prompty")
 
 with DefaultAzureCredential(exclude_interactive_browser_credential=False) as credential:
 
-    with AIProjectClient(endpoint=endpoint, credential=credential) as project_client:
+    prompt_template = PromptTemplate.from_prompty(file_path=prompty_file)
+
+    input = "When I arrived, can I still have breakfast?"
+
+    rules = [
+        {"rule": "The check-in time is 3pm"},
+        {"rule": "The check-out time is 11am"},
+        {"rule": "Breakfast is served from 7am to 10am"},
+    ]
+
+    chat_history = [
+        {"role": "user", "content": "I'll arrive at 2pm. 
What's the check-in and check-out time?"}, + {"role": "system", "content": "The check-in time is 3 PM, and the check-out time is 11 AM."}, + ] - with project_client.inference.get_chat_completions_client() as client: + messages = prompt_template.create_messages(input=input, rules=rules, chat_history=chat_history) + print(messages) - prompt_template = PromptTemplate.from_prompty(file_path=prompty_file) + with ChatCompletionsClient( + endpoint=inference_endpoint, + credential=credential, + credential_scopes=["https://ai.azure.com/.default"], + ) as client: - input = "When I arrived, can I still have breakfast?" - rules = [ - {"rule": "The check-in time is 3pm"}, - {"rule": "The check-out time is 11am"}, - {"rule": "Breakfast is served from 7am to 10am"}, - ] - chat_history = [ - {"role": "user", "content": "I'll arrive at 2pm. What's the check-in and check-out time?"}, - {"role": "system", "content": "The check-in time is 3 PM, and the check-out time is 11 AM."}, - ] - messages = prompt_template.create_messages(input=input, rules=rules, chat_history=chat_history) - print(messages) - response = client.complete(model=model_deployment_name, messages=messages) + response = client.complete(model=model_deployment_name, messages=messages) - print(response.choices[0].message.content) + print(response.choices[0].message.content) diff --git a/sdk/ai/azure-ai-projects/samples/inference/sample_image_embeddings_with_azure_ai_inference_client.py b/sdk/ai/azure-ai-projects/samples/inference/azure-ai-inference/sample_image_embeddings_with_azure_ai_inference_client.py similarity index 54% rename from sdk/ai/azure-ai-projects/samples/inference/sample_image_embeddings_with_azure_ai_inference_client.py rename to sdk/ai/azure-ai-projects/samples/inference/azure-ai-inference/sample_image_embeddings_with_azure_ai_inference_client.py index 50e642e33c9e..8c65c7890a15 100644 --- a/sdk/ai/azure-ai-projects/samples/inference/sample_image_embeddings_with_azure_ai_inference_client.py +++ 
b/sdk/ai/azure-ai-projects/samples/inference/azure-ai-inference/sample_image_embeddings_with_azure_ai_inference_client.py
@@ -1,3 +1,4 @@
+# pylint: disable=line-too-long,useless-suppression
 # ------------------------------------
 # Copyright (c) Microsoft Corporation.
 # Licensed under the MIT License.
@@ -5,7 +6,7 @@
 
 """
 DESCRIPTION:
-    Given an AIProjectClient, this sample demonstrates how to get an authenticated
+    Given an AI Foundry Project endpoint, this sample demonstrates how to get an authenticated
     ImageEmbeddingsClient from the azure.ai.inference package, and perform one image embeddings
     operation. For more information on the azure.ai.inference package see
     https://pypi.org/project/azure-ai-inference/.
@@ -25,31 +26,39 @@
 """
 
 import os
+from urllib.parse import urlparse
 from azure.identity import DefaultAzureCredential
-from azure.ai.projects import AIProjectClient
+from azure.ai.inference import ImageEmbeddingsClient
 from azure.ai.inference.models import ImageEmbeddingInput
 
 endpoint = os.environ["PROJECT_ENDPOINT"]
 model_deployment_name = os.environ["MODEL_DEPLOYMENT_NAME"]
 
+# Project endpoint has the form: https://<your-ai-services-account-name>.services.ai.azure.com/api/projects/<your-project-name>
+# Inference endpoint has the form: https://<your-ai-services-account-name>.services.ai.azure.com/models
+# Strip the "/api/projects/<your-project-name>" part and replace with "/models":
+inference_endpoint = f"https://{urlparse(endpoint).netloc}/models"
+
 # Construct the path to the image file used in this sample
 data_folder = os.environ.get("DATA_FOLDER", os.path.dirname(os.path.abspath(__file__)))
 image_file = os.path.join(data_folder, "sample1.png")
 
 with DefaultAzureCredential(exclude_interactive_browser_credential=False) as credential:
 
-    with AIProjectClient(endpoint=endpoint, credential=credential) as project_client:
+    with ImageEmbeddingsClient(
+        endpoint=inference_endpoint,
+        credential=credential,
+        credential_scopes=["https://ai.azure.com/.default"],
+    ) as client:
 
-        with project_client.inference.get_image_embeddings_client() as client:
+        response = 
client.embed( + model=model_deployment_name, + input=[ImageEmbeddingInput.load(image_file=image_file, image_format="png")], + ) - response = client.embed( - model=model_deployment_name, - input=[ImageEmbeddingInput.load(image_file=image_file, image_format="png")], + for item in response.data: + length = len(item.embedding) + print( + f"data[{item.index}]: length={length}, [{item.embedding[0]}, {item.embedding[1]}, " + f"..., {item.embedding[length-2]}, {item.embedding[length-1]}]" ) - - for item in response.data: - length = len(item.embedding) - print( - f"data[{item.index}]: length={length}, [{item.embedding[0]}, {item.embedding[1]}, " - f"..., {item.embedding[length-2]}, {item.embedding[length-1]}]" - ) diff --git a/sdk/ai/azure-ai-projects/samples/inference/azure-ai-inference/sample_text_embeddings_with_azure_ai_inference_client.py b/sdk/ai/azure-ai-projects/samples/inference/azure-ai-inference/sample_text_embeddings_with_azure_ai_inference_client.py new file mode 100644 index 000000000000..779a895852bc --- /dev/null +++ b/sdk/ai/azure-ai-projects/samples/inference/azure-ai-inference/sample_text_embeddings_with_azure_ai_inference_client.py @@ -0,0 +1,55 @@ +# pylint: disable=line-too-long,useless-suppression +# ------------------------------------ +# Copyright (c) Microsoft Corporation. +# Licensed under the MIT License. +# ------------------------------------ + +""" +DESCRIPTION: + Given an AI Foundry Project endpoint, this sample demonstrates how to get an authenticated + EmbeddingsClient from the azure.ai.inference package, and perform one text embeddings + operation. For more information on the azure.ai.inference package see + https://pypi.org/project/azure-ai-inference/. 
+
+USAGE:
+    python sample_text_embeddings_with_azure_ai_inference_client.py
+
+    Before running the sample:
+
+    pip install azure-ai-projects azure-ai-inference azure-identity
+
+    Set these environment variables with your own values:
+    1) PROJECT_ENDPOINT - The Azure AI Project endpoint, as found in the overview page of your
+       Azure AI Foundry project.
+    2) MODEL_DEPLOYMENT_NAME - The AI model deployment name, as found in your AI Foundry project.
+"""
+
+import os
+from urllib.parse import urlparse
+from azure.identity import DefaultAzureCredential
+from azure.ai.inference import EmbeddingsClient
+
+endpoint = os.environ["PROJECT_ENDPOINT"]
+model_deployment_name = os.environ["MODEL_DEPLOYMENT_NAME"]
+
+# Project endpoint has the form: https://<your-ai-services-account-name>.services.ai.azure.com/api/projects/<your-project-name>
+# Inference endpoint has the form: https://<your-ai-services-account-name>.services.ai.azure.com/models
+# Strip the "/api/projects/<your-project-name>" part and replace with "/models":
+inference_endpoint = f"https://{urlparse(endpoint).netloc}/models"
+
+with DefaultAzureCredential(exclude_interactive_browser_credential=False) as credential:
+
+    with EmbeddingsClient(
+        endpoint=inference_endpoint,
+        credential=credential,
+        credential_scopes=["https://ai.azure.com/.default"],
+    ) as client:
+
+        response = client.embed(model=model_deployment_name, input=["first phrase", "second phrase", "third phrase"])
+
+        for item in response.data:
+            length = len(item.embedding)
+            print(
+                f"data[{item.index}]: length={length}, [{item.embedding[0]}, {item.embedding[1]}, "
+                f"..., {item.embedding[length-2]}, {item.embedding[length-1]}]"
+            )
diff --git a/sdk/ai/azure-ai-projects/samples/inference/async_samples/sample_chat_completions_with_azure_openai_client_async.py b/sdk/ai/azure-ai-projects/samples/inference/azure-openai/async_samples/sample_chat_completions_with_azure_openai_client_async.py
similarity index 97%
rename from sdk/ai/azure-ai-projects/samples/inference/async_samples/sample_chat_completions_with_azure_openai_client_async.py
rename to 
sdk/ai/azure-ai-projects/samples/inference/azure-openai/async_samples/sample_chat_completions_with_azure_openai_client_async.py index f04b27b79b94..5c88c125412d 100644 --- a/sdk/ai/azure-ai-projects/samples/inference/async_samples/sample_chat_completions_with_azure_openai_client_async.py +++ b/sdk/ai/azure-ai-projects/samples/inference/azure-openai/async_samples/sample_chat_completions_with_azure_openai_client_async.py @@ -1,3 +1,4 @@ +# pylint: disable=line-too-long,useless-suppression # ------------------------------------ # Copyright (c) Microsoft Corporation. # Licensed under the MIT License. @@ -5,7 +6,7 @@ """ DESCRIPTION: - Given an AIProjectClient, this sample demonstrates how to get an authenticated + Given an AIProjectClient, this sample demonstrates how to get an authenticated AsyncAzureOpenAI client from the openai package, and perform one chat completions operation. diff --git a/sdk/ai/azure-ai-projects/samples/inference/sample_chat_completions_with_azure_openai_client.py b/sdk/ai/azure-ai-projects/samples/inference/azure-openai/sample_chat_completions_with_azure_openai_client.py similarity index 97% rename from sdk/ai/azure-ai-projects/samples/inference/sample_chat_completions_with_azure_openai_client.py rename to sdk/ai/azure-ai-projects/samples/inference/azure-openai/sample_chat_completions_with_azure_openai_client.py index 2a43b20b8a32..f3efc1c1cdcc 100644 --- a/sdk/ai/azure-ai-projects/samples/inference/sample_chat_completions_with_azure_openai_client.py +++ b/sdk/ai/azure-ai-projects/samples/inference/azure-openai/sample_chat_completions_with_azure_openai_client.py @@ -1,3 +1,4 @@ +# pylint: disable=line-too-long,useless-suppression # ------------------------------------ # Copyright (c) Microsoft Corporation. # Licensed under the MIT License. 
@@ -5,7 +6,7 @@ """ DESCRIPTION: - Given an AIProjectClient, this sample demonstrates how to get an authenticated + Given an AIProjectClient, this sample demonstrates how to get an authenticated AzureOpenAI client from the openai package, and perform one chat completion operation. USAGE: diff --git a/sdk/ai/azure-ai-projects/samples/inference/sample_chat_completions_with_azure_openai_client_and_azure_monitor_tracing.py b/sdk/ai/azure-ai-projects/samples/inference/azure-openai/sample_chat_completions_with_azure_openai_client_and_azure_monitor_tracing.py similarity index 50% rename from sdk/ai/azure-ai-projects/samples/inference/sample_chat_completions_with_azure_openai_client_and_azure_monitor_tracing.py rename to sdk/ai/azure-ai-projects/samples/inference/azure-openai/sample_chat_completions_with_azure_openai_client_and_azure_monitor_tracing.py index fd3a52e60e85..800a587da5de 100644 --- a/sdk/ai/azure-ai-projects/samples/inference/sample_chat_completions_with_azure_openai_client_and_azure_monitor_tracing.py +++ b/sdk/ai/azure-ai-projects/samples/inference/azure-openai/sample_chat_completions_with_azure_openai_client_and_azure_monitor_tracing.py @@ -5,9 +5,9 @@ """ DESCRIPTION: - Given an AIProjectClient, this sample demonstrates how to get an authenticated + Given an AIProjectClient, this sample demonstrates how to get an authenticated AzureOpenAI client from the openai package, and perform one chat completion operation. - The client is already instrumented to upload traces to Azure Monitor. View the results + The client is instrumented to upload OpenTelemetry traces to Azure Monitor. View the uploaded traces in the "Tracing" tab in your Azure AI Foundry project page. 
@@ -16,7 +16,7 @@ Before running the sample: - pip install azure-ai-projects openai azure-monitor-opentelemetry opentelemetry-instrumentation-openai-v2 + pip install azure-ai-projects openai azure-monitor-opentelemetry opentelemetry-instrumentation-openai-v2 httpx Set these environment variables with your own values: 1) PROJECT_ENDPOINT - The Azure AI Project endpoint, as found in the overview page of your @@ -30,40 +30,40 @@ """ import os -from azure.ai.projects import AIProjectClient, enable_telemetry +from azure.ai.projects import AIProjectClient from azure.identity import DefaultAzureCredential +from opentelemetry import trace from azure.monitor.opentelemetry import configure_azure_monitor -# Enable additional instrumentations for openai and langchain -# which are not included by Azure Monitor out of the box -enable_telemetry() +from opentelemetry.instrumentation.openai_v2 import OpenAIInstrumentor + +OpenAIInstrumentor().instrument() + +file_name = os.path.basename(__file__) +tracer = trace.get_tracer(__name__) endpoint = os.environ["PROJECT_ENDPOINT"] model_deployment_name = os.environ["MODEL_DEPLOYMENT_NAME"] -with DefaultAzureCredential(exclude_interactive_browser_credential=False) as credential: +with tracer.start_as_current_span(file_name): - with AIProjectClient(endpoint=endpoint, credential=credential) as project_client: + with DefaultAzureCredential(exclude_interactive_browser_credential=False) as credential: - # Enable Azure Monitor tracing - application_insights_connection_string = project_client.telemetry.get_connection_string() - if not application_insights_connection_string: - print("Application Insights was not enabled for this project.") - print("Enable it via the 'Tracing' tab in your AI Foundry project page.") - exit() + with AIProjectClient(endpoint=endpoint, credential=credential) as project_client: - configure_azure_monitor(connection_string=application_insights_connection_string) + application_insights_connection_string = 
project_client.telemetry.get_connection_string() + configure_azure_monitor(connection_string=application_insights_connection_string) - with project_client.inference.get_azure_openai_client(api_version="2024-10-21") as client: + with project_client.inference.get_azure_openai_client(api_version="2024-10-21") as client: - response = client.chat.completions.create( - model=model_deployment_name, - messages=[ - { - "role": "user", - "content": "How many feet are in a mile?", - }, - ], - ) + response = client.chat.completions.create( + model=model_deployment_name, + messages=[ + { + "role": "user", + "content": "How many feet are in a mile?", + }, + ], + ) - print(response.choices[0].message.content) + print(response.choices[0].message.content) diff --git a/sdk/ai/azure-ai-projects/samples/inference/sample_chat_completions_with_azure_openai_client_and_console_tracing.py b/sdk/ai/azure-ai-projects/samples/inference/azure-openai/sample_chat_completions_with_azure_openai_client_and_console_tracing.py similarity index 53% rename from sdk/ai/azure-ai-projects/samples/inference/sample_chat_completions_with_azure_openai_client_and_console_tracing.py rename to sdk/ai/azure-ai-projects/samples/inference/azure-openai/sample_chat_completions_with_azure_openai_client_and_console_tracing.py index e137fdc701db..be5f4c9249ea 100644 --- a/sdk/ai/azure-ai-projects/samples/inference/sample_chat_completions_with_azure_openai_client_and_console_tracing.py +++ b/sdk/ai/azure-ai-projects/samples/inference/azure-openai/sample_chat_completions_with_azure_openai_client_and_console_tracing.py @@ -5,7 +5,7 @@ """ DESCRIPTION: - Given an AIProjectClient, this sample demonstrates how to get an authenticated + Given an AIProjectClient, this sample demonstrates how to get an authenticated AzureOpenAI client from the openai package, and perform one chat completion operation. The client is already instrumented with console OpenTelemetry tracing. 
@@ -14,7 +14,7 @@ Before running the sample: - pip install azure-ai-projects openai opentelemetry-sdk opentelemetry-instrumentation-openai-v2 + pip install azure-ai-projects openai opentelemetry-sdk opentelemetry-instrumentation-openai-v2 httpx Set these environment variables with your own values: 1) PROJECT_ENDPOINT - The Azure AI Project endpoint, as found in the overview page of your @@ -37,32 +37,44 @@ """ import os -import sys -from azure.ai.projects import AIProjectClient, enable_telemetry +from azure.core.settings import settings +from azure.ai.projects import AIProjectClient from azure.identity import DefaultAzureCredential +from opentelemetry import trace +from opentelemetry.instrumentation.openai_v2 import OpenAIInstrumentor +from opentelemetry.sdk.trace.export import SimpleSpanProcessor, ConsoleSpanExporter +from opentelemetry.sdk.trace import TracerProvider -# Enable console tracing. -# or, if you have local OTLP endpoint running, change it to -# enable_telemetry(destination="http://localhost:4317") -enable_telemetry(destination=sys.stdout) +settings.tracing_implementation = "opentelemetry" + +span_exporter = ConsoleSpanExporter() +tracer_provider = TracerProvider() +tracer_provider.add_span_processor(SimpleSpanProcessor(span_exporter)) +trace.set_tracer_provider(tracer_provider) +tracer = trace.get_tracer(__name__) +file_name = os.path.basename(__file__) + +OpenAIInstrumentor().instrument() endpoint = os.environ["PROJECT_ENDPOINT"] model_deployment_name = os.environ["MODEL_DEPLOYMENT_NAME"] -with DefaultAzureCredential(exclude_interactive_browser_credential=False) as credential: +with tracer.start_as_current_span(file_name): + + with DefaultAzureCredential(exclude_interactive_browser_credential=False) as credential: - with AIProjectClient(endpoint=endpoint, credential=credential) as project_client: + with AIProjectClient(endpoint=endpoint, credential=credential) as project_client: - with 
project_client.inference.get_azure_openai_client(api_version="2024-10-21") as client: + with project_client.inference.get_azure_openai_client(api_version="2024-10-21") as client: - response = client.chat.completions.create( - model=model_deployment_name, - messages=[ - { - "role": "user", - "content": "How many feet are in a mile?", - }, - ], - ) + response = client.chat.completions.create( + model=model_deployment_name, + messages=[ + { + "role": "user", + "content": "How many feet are in a mile?", + }, + ], + ) - print(response.choices[0].message.content) + print(response.choices[0].message.content) diff --git a/sdk/ai/azure-ai-projects/samples/inference/sample_chat_completions_with_azure_ai_inference_client_and_console_tracing.py b/sdk/ai/azure-ai-projects/samples/inference/sample_chat_completions_with_azure_ai_inference_client_and_console_tracing.py deleted file mode 100644 index 708878cba79d..000000000000 --- a/sdk/ai/azure-ai-projects/samples/inference/sample_chat_completions_with_azure_ai_inference_client_and_console_tracing.py +++ /dev/null @@ -1,61 +0,0 @@ -# ------------------------------------ -# Copyright (c) Microsoft Corporation. -# Licensed under the MIT License. -# ------------------------------------ - -""" -DESCRIPTION: - Given an AIProjectClient, this sample demonstrates how to get an authenticated - ChatCompletionsClient from the azure.ai.inference package and perform one chat completion - operation. It also shows how to turn on local tracing. - For more information on the azure.ai.inference package see https://pypi.org/project/azure-ai-inference/. - -USAGE: - sample_chat_completions_with_azure_ai_inference_client_and_console_tracing.py - - Before running the sample: - - pip install azure-ai-projects azure-ai-inference azure-identity - - Set these environment variables with your own values: - 1) PROJECT_ENDPOINT - The Azure AI Project endpoint, as found in the overview page of your - Azure AI Foundry project. 
- 2) MODEL_DEPLOYMENT_NAME - The AI model deployment name, as found in your AI Foundry project. - -ALTERNATIVE USAGE: - If you want to export telemetry to OTLP endpoint (such as Aspire dashboard - https://learn.microsoft.com/dotnet/aspire/fundamentals/dashboard/standalone?tabs=bash) - instead of to the console, also install: - - pip install opentelemetry-exporter-otlp-proto-grpc - - And also define: - 3) OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT - Optional. Set to `true` to trace the content of chat - messages, which may contain personal data. False by default. -""" - -import os -import sys -from azure.identity import DefaultAzureCredential -from azure.ai.projects import AIProjectClient, enable_telemetry -from azure.ai.inference.models import UserMessage - -# Enable console tracing. -# or, if you have local OTLP endpoint running, change it to -# enable_telemetry(destination="http://localhost:4317") -enable_telemetry(destination=sys.stdout) - -endpoint = os.environ["PROJECT_ENDPOINT"] -model_deployment_name = os.environ["MODEL_DEPLOYMENT_NAME"] - -with DefaultAzureCredential(exclude_interactive_browser_credential=False) as credential: - - with AIProjectClient(endpoint=endpoint, credential=credential) as project_client: - - with project_client.inference.get_chat_completions_client() as client: - - response = client.complete( - model=model_deployment_name, messages=[UserMessage(content="How many feet are in a mile?")] - ) - - print(response.choices[0].message.content) diff --git a/sdk/ai/azure-ai-projects/samples/inference/sample_chat_completions_with_azure_ai_inference_client_and_prompt_string.py b/sdk/ai/azure-ai-projects/samples/inference/sample_chat_completions_with_azure_ai_inference_client_and_prompt_string.py deleted file mode 100644 index 3cb141681082..000000000000 --- a/sdk/ai/azure-ai-projects/samples/inference/sample_chat_completions_with_azure_ai_inference_client_and_prompt_string.py +++ /dev/null @@ -1,76 +0,0 @@ -# pylint: 
disable=line-too-long,useless-suppression -# ------------------------------------ -# Copyright (c) Microsoft Corporation. -# Licensed under the MIT License. -# ------------------------------------ - -""" -DESCRIPTION: - Given an AIProjectClient, this sample demonstrates how to - * Get an authenticated ChatCompletionsClient from the azure.ai.inference package - * Define a Mustache template, and render the template with provided parameters to create a list of chat messages. - * Perform one chat completion operation. - Package azure.ai.inference required. For more information see https://pypi.org/project/azure-ai-inference/. - Package prompty required. For more information see https://pypi.org/project/prompty/. - -USAGE: - sample_chat_completions_with_azure_ai_inference_client_and_prompt_string.py - - Before running the sample: - - pip install azure-ai-projects azure-ai-inference azure-identity prompty - - Set these environment variables with your own values: - 1) PROJECT_ENDPOINT - The Azure AI Project endpoint, as found in the overview page of your - Azure AI Foundry project. - 2) MODEL_DEPLOYMENT_NAME - The AI model deployment name, as found in your AI Foundry project. -""" - -import os -from azure.identity import DefaultAzureCredential -from azure.ai.projects import AIProjectClient, PromptTemplate - -endpoint = os.environ["PROJECT_ENDPOINT"] -model_deployment_name = os.environ["MODEL_DEPLOYMENT_NAME"] - -with DefaultAzureCredential(exclude_interactive_browser_credential=False) as credential: - - with AIProjectClient(endpoint=endpoint, credential=credential) as project_client: - - with project_client.inference.get_chat_completions_client() as client: - - prompt_template_str = """ - system: - You are an AI assistant in a hotel. You help guests with their requests and provide information about the hotel and its services. 
- - # context - {{#rules}} - {{rule}} - {{/rules}} - - {{#chat_history}} - {{role}}: - {{content}} - {{/chat_history}} - - user: - {{input}} - """ - prompt_template = PromptTemplate.from_string(api="chat", prompt_template=prompt_template_str) - - input = "When I arrived, can I still have breakfast?" - rules = [ - {"rule": "The check-in time is 3pm"}, - {"rule": "The check-out time is 11am"}, - {"rule": "Breakfast is served from 7am to 10am"}, - ] - chat_history = [ - {"role": "user", "content": "I'll arrive at 2pm. What's the check-in and check-out time?"}, - {"role": "system", "content": "The check-in time is 3 PM, and the check-out time is 11 AM."}, - ] - messages = prompt_template.create_messages(input=input, rules=rules, chat_history=chat_history) - print(messages) - - response = client.complete(model=model_deployment_name, messages=messages) - - print(response.choices[0].message.content) diff --git a/sdk/ai/azure-ai-projects/samples/inference/sample_text_embeddings_with_azure_ai_inference_client.py b/sdk/ai/azure-ai-projects/samples/inference/sample_text_embeddings_with_azure_ai_inference_client.py deleted file mode 100644 index acb1cd8b6a1b..000000000000 --- a/sdk/ai/azure-ai-projects/samples/inference/sample_text_embeddings_with_azure_ai_inference_client.py +++ /dev/null @@ -1,48 +0,0 @@ -# ------------------------------------ -# Copyright (c) Microsoft Corporation. -# Licensed under the MIT License. -# ------------------------------------ - -""" -DESCRIPTION: - Given an AIProjectClient, this sample demonstrates how to get an authenticated - EmbeddingsClient from the azure.ai.inference package, and perform one text embeddings - operation. For more information on the azure.ai.inference package see - https://pypi.org/project/azure-ai-inference/. 
- -USAGE: - python sample_text_embeddings_with_azure_ai_inference_client.py - - Before running the sample: - - pip install azure-ai-projects azure-ai-inference azure-identity - - Set these environment variables with your own values: - 1) PROJECT_ENDPOINT - The Azure AI Project endpoint, as found in the overview page of your - Azure AI Foundry project. - 2) MODEL_DEPLOYMENT_NAME - The AI model deployment name, as found in your AI Foundry project. -""" - -import os -from azure.identity import DefaultAzureCredential -from azure.ai.projects import AIProjectClient - -endpoint = os.environ["PROJECT_ENDPOINT"] -model_deployment_name = os.environ["MODEL_DEPLOYMENT_NAME"] - -with DefaultAzureCredential(exclude_interactive_browser_credential=False) as credential: - - with AIProjectClient(endpoint=endpoint, credential=credential) as project_client: - - with project_client.inference.get_embeddings_client() as client: - - response = client.embed( - model=model_deployment_name, input=["first phrase", "second phrase", "third phrase"] - ) - - for item in response.data: - length = len(item.embedding) - print( - f"data[{item.index}]: length={length}, [{item.embedding[0]}, {item.embedding[1]}, " - f"..., {item.embedding[length-2]}, {item.embedding[length-1]}]" - ) diff --git a/sdk/ai/azure-ai-projects/setup.py b/sdk/ai/azure-ai-projects/setup.py index bbc27d4b682c..3172316ce47d 100644 --- a/sdk/ai/azure-ai-projects/setup.py +++ b/sdk/ai/azure-ai-projects/setup.py @@ -71,7 +71,7 @@ "azure-core>=1.30.0", "typing-extensions>=4.12.2", "azure-storage-blob>=12.15.0", - "azure-ai-agents>=1.0.0b1", + "azure-ai-agents>=1.0.0", ], python_requires=">=3.9", extras_require={ diff --git a/sdk/ai/azure-ai-projects/tests/conftest.py b/sdk/ai/azure-ai-projects/tests/conftest.py new file mode 100644 index 000000000000..737e4f85c400 --- /dev/null +++ b/sdk/ai/azure-ai-projects/tests/conftest.py @@ -0,0 +1,99 @@ +# pylint: disable=line-too-long,useless-suppression +# ------------------------------------ 
+# Copyright (c) Microsoft Corporation. +# Licensed under the MIT License. +# ------------------------------------ +import os +import pytest +from dotenv import load_dotenv, find_dotenv +from devtools_testutils import remove_batch_sanitizers, add_general_regex_sanitizer, add_body_key_sanitizer + +if not load_dotenv(find_dotenv(filename="azure_ai_projects_tests.env"), override=True): + print( + "Failed to apply environment variables for azure-ai-projects tests. This is expected if running in ADO pipeline." + ) + + +def pytest_collection_modifyitems(items): + if os.environ.get("AZURE_TEST_RUN_LIVE") == "true": + return + for item in items: + if "tests\\evaluation" in item.fspath.strpath or "tests/evaluation" in item.fspath.strpath: + item.add_marker( + pytest.mark.skip( + reason="Skip running Evaluations tests in PR pipeline until we can sort out the failures related to AI Foundry project settings" + ) + ) + + +class SanitizedValues: + SUBSCRIPTION_ID = "00000000-0000-0000-0000-000000000000" + RESOURCE_GROUP_NAME = "sanitized-resource-group-name" + ACCOUNT_NAME = "sanitized-account-name" + PROJECT_NAME = "sanitized-project-name" + COMPONENT_NAME = "sanitized-component-name" + + +@pytest.fixture(scope="session") +def sanitized_values(): + return { + "subscription_id": f"{SanitizedValues.SUBSCRIPTION_ID}", + "resource_group_name": f"{SanitizedValues.RESOURCE_GROUP_NAME}", + "project_name": f"{SanitizedValues.PROJECT_NAME}", + "account_name": f"{SanitizedValues.ACCOUNT_NAME}", + "component_name": f"{SanitizedValues.COMPONENT_NAME}", + } + + +# From: https://github.com/Azure/azure-sdk-for-python/blob/main/doc/dev/tests.md#start-the-test-proxy-server +# autouse=True will trigger this fixture on each pytest run, even if it's not explicitly used by a test method +# test_proxy auto-starts the test proxy +# patch_sleep and patch_async_sleep streamline tests by disabling wait times during LRO polling +@pytest.fixture(scope="session", autouse=True) +def start_proxy(test_proxy): 
+ return + + +@pytest.fixture(scope="session", autouse=True) +def add_sanitizers(test_proxy, sanitized_values): + + def sanitize_url_paths(): + + add_general_regex_sanitizer( + regex=r"/subscriptions/([-\w\._\(\)]+)", + value=sanitized_values["subscription_id"], + group_for_replace="1", + ) + + add_general_regex_sanitizer( + regex=r"/resource[gG]roups/([-\w\._\(\)]+)", + value=sanitized_values["resource_group_name"], + group_for_replace="1", + ) + + add_general_regex_sanitizer( + regex=r"/projects/([-\w\._\(\)]+)", value=sanitized_values["project_name"], group_for_replace="1" + ) + + add_general_regex_sanitizer( + regex=r"/accounts/([-\w\._\(\)]+)", value=sanitized_values["account_name"], group_for_replace="1" + ) + + add_general_regex_sanitizer( + regex=r"/components/([-\w\._\(\)]+)", value=sanitized_values["component_name"], group_for_replace="1" + ) + + sanitize_url_paths() + + # Sanitize API key from service response (this includes Application Insights connection string) + add_body_key_sanitizer(json_path="credentials.key", value="Sanitized-api-key") + + # Sanitize SAS URI from Datasets get credential response + add_body_key_sanitizer(json_path="blobReference.credential.sasUri", value="Sanitized-sas-uri") + add_body_key_sanitizer(json_path="blobReferenceForConsumption.credential.sasUri", value="Sanitized-sas-uri") + + # Remove the following sanitizers since certain fields are needed in tests and are non-sensitive: + # - AZSDK3493: $..name + # - AZSDK3430: $..id + remove_batch_sanitizers(["AZSDK3493"]) + remove_batch_sanitizers(["AZSDK3430"]) diff --git a/sdk/ai/azure-ai-projects/tests/connections/test_connections.py b/sdk/ai/azure-ai-projects/tests/connections/test_connections.py deleted file mode 100644 index f1e4612563a8..000000000000 --- a/sdk/ai/azure-ai-projects/tests/connections/test_connections.py +++ /dev/null @@ -1,10 +0,0 @@ -# ------------------------------------ -# Copyright (c) Microsoft Corporation. -# Licensed under the MIT License. 
-# ------------------------------------ - - -class TestConnections: - - def test_connections_get(self, **kwargs): - pass diff --git a/sdk/ai/azure-ai-projects/tests/samples/test_samples.py b/sdk/ai/azure-ai-projects/tests/samples/test_samples.py index 8b12a8ea9c6f..d1d8b69e6228 100644 --- a/sdk/ai/azure-ai-projects/tests/samples/test_samples.py +++ b/sdk/ai/azure-ai-projects/tests/samples/test_samples.py @@ -11,6 +11,9 @@ class TestSamples: + _samples_folder_path: str + _results: dict[str, tuple[bool, str]] + """ Test class for running all samples in the `/sdk/ai/azure-ai-projects/samples` folder. diff --git a/sdk/ai/azure-ai-projects/tests/test_agents.py b/sdk/ai/azure-ai-projects/tests/test_agents.py new file mode 100644 index 000000000000..8e72c929481d --- /dev/null +++ b/sdk/ai/azure-ai-projects/tests/test_agents.py @@ -0,0 +1,45 @@ +# ------------------------------------ +# Copyright (c) Microsoft Corporation. +# Licensed under the MIT License. +# ------------------------------------ + +from azure.ai.projects import AIProjectClient +from test_base import TestBase, servicePreparer +from devtools_testutils import recorded_by_proxy + +# NOTE: This is just a simple test to verify that the agent can be created and deleted using AIProjectClient. 
+# You can find comprehensive Agent functionality tests here:
+# https://github.com/Azure/azure-sdk-for-python/tree/main/sdk/ai/azure-ai-agents/tests
+
+
+class TestAgents(TestBase):
+
+    # To run this test, use the following command in the \sdk\ai\azure-ai-projects folder:
+    # cls & pytest tests\test_agents.py::TestAgents::test_agents -s
+    @servicePreparer()
+    @recorded_by_proxy
+    def test_agents(self, **kwargs):
+
+        endpoint = kwargs.pop("azure_ai_projects_tests_project_endpoint")
+        print("\n=====> Endpoint:", endpoint)
+
+        model_deployment_name = self.test_agents_params["model_deployment_name"]
+        agent_name = self.test_agents_params["agent_name"]
+
+        with AIProjectClient(
+            endpoint=endpoint,
+            credential=self.get_credential(AIProjectClient, is_async=False),
+        ) as project_client:
+
+            print("[test_agents] Create agent")
+            agent = project_client.agents.create_agent(
+                model=model_deployment_name,
+                name=agent_name,
+                instructions="You are helpful agent",
+            )
+            assert agent.id
+            assert agent.model == model_deployment_name
+            assert agent.name == agent_name
+
+            print("[test_agents] Delete agent")
+            project_client.agents.delete_agent(agent.id)
diff --git a/sdk/ai/azure-ai-projects/tests/test_agents_async.py b/sdk/ai/azure-ai-projects/tests/test_agents_async.py
new file mode 100644
index 000000000000..76c437c9cd49
--- /dev/null
+++ b/sdk/ai/azure-ai-projects/tests/test_agents_async.py
@@ -0,0 +1,45 @@
+# ------------------------------------
+# Copyright (c) Microsoft Corporation.
+# Licensed under the MIT License.
+# ------------------------------------
+
+from azure.ai.projects.aio import AIProjectClient
+from test_base import TestBase, servicePreparer
+from devtools_testutils.aio import recorded_by_proxy_async
+
+# NOTE: This is just a simple test to verify that the agent can be created and deleted using AIProjectClient.
+# You can find comprehensive Agent functionality tests here:
+# https://github.com/Azure/azure-sdk-for-python/tree/main/sdk/ai/azure-ai-agents/tests
+
+
+class TestAgentsAsync(TestBase):
+
+    # To run this test, use the following command in the \sdk\ai\azure-ai-projects folder:
+    # cls & pytest tests\test_agents_async.py::TestAgentsAsync::test_agents -s
+    @servicePreparer()
+    @recorded_by_proxy_async
+    async def test_agents(self, **kwargs):
+
+        endpoint = kwargs.pop("azure_ai_projects_tests_project_endpoint")
+        print("\n=====> Endpoint:", endpoint)
+
+        model_deployment_name = self.test_agents_params["model_deployment_name"]
+        agent_name = self.test_agents_params["agent_name"]
+
+        async with AIProjectClient(
+            endpoint=endpoint,
+            credential=self.get_credential(AIProjectClient, is_async=True),
+        ) as project_client:
+
+            print("[test_agents_async] Create agent")
+            agent = await project_client.agents.create_agent(
+                model=model_deployment_name,
+                name=agent_name,
+                instructions="You are helpful agent",
+            )
+            assert agent.id
+            assert agent.model == model_deployment_name
+            assert agent.name == agent_name
+
+            print("[test_agents_async] Delete agent")
+            await project_client.agents.delete_agent(agent.id)
diff --git a/sdk/ai/azure-ai-projects/tests/test_base.py b/sdk/ai/azure-ai-projects/tests/test_base.py
new file mode 100644
index 000000000000..ccf63ff4e21c
--- /dev/null
+++ b/sdk/ai/azure-ai-projects/tests/test_base.py
@@ -0,0 +1,192 @@
+# pylint: disable=line-too-long,useless-suppression
+# ------------------------------------
+# Copyright (c) Microsoft Corporation.
+# Licensed under the MIT License.
+# ------------------------------------ +import random +import re +import functools +from typing import Optional +from azure.ai.projects.models import ( + Connection, + ConnectionType, + CredentialType, + ApiKeyCredentials, + Deployment, + DeploymentType, + ModelDeployment, + Index, + IndexType, + AzureAISearchIndex, + DatasetVersion, + DatasetType, + AssetCredentialResponse, +) +from devtools_testutils import AzureRecordedTestCase, EnvironmentVariableLoader, is_live_and_not_recording + + +servicePreparer = functools.partial( + EnvironmentVariableLoader, + "azure_ai_projects_tests", + azure_ai_projects_tests_project_endpoint="https://sanitized.services.ai.azure.com/api/projects/sanitized-project-name", +) + + +class TestBase(AzureRecordedTestCase): + + test_connections_params = { + "connection_name": "connection1", + "connection_type": ConnectionType.AZURE_OPEN_AI, + } + + test_deployments_params = { + "model_publisher": "Cohere", + "model_name": "gpt-4o", + "model_deployment_name": "DeepSeek-V3", + } + + test_agents_params = { + "model_deployment_name": "gpt-4o", + "agent_name": "agent-for-python-projects-sdk-testing", + } + + test_inference_params = { + "connection_name": "connection1", + "model_deployment_name": "gpt-4o", + "aoai_api_version": "2024-10-21", + } + + test_indexes_params = { + "index_name": f"test-index-name", + "index_version": "1", + "ai_search_connection_name": "my-ai-search-connection", + "ai_search_index_name": "my-ai-search-index", + } + + test_datasets_params = { + "dataset_name_1": f"test-dataset-name-{random.randint(0, 99999):05d}", + "dataset_name_2": f"test-dataset-name-{random.randint(0, 99999):05d}", + "dataset_name_3": f"test-dataset-name-{random.randint(0, 99999):05d}", + "dataset_name_4": f"test-dataset-name-{random.randint(0, 99999):05d}", + "dataset_version": 1, + "connection_name": "balapvbyostoragecanary", + } + + # Regular expression describing the pattern of an Application Insights connection string. 
+ REGEX_APPINSIGHTS_CONNECTION_STRING = re.compile( + r"^InstrumentationKey=[0-9a-fA-F-]{36};IngestionEndpoint=https://.+.applicationinsights.azure.com/;LiveEndpoint=https://.+.monitor.azure.com/;ApplicationId=[0-9a-fA-F-]{36}$" + ) + + @staticmethod + def assert_equal_or_not_none(actual, expected=None): + assert actual is not None + if expected is not None: + assert actual == expected + + # Checks that a given dictionary has at least one non-empty (non-whitespace) string key-value pair. + @classmethod + def is_valid_dict(cls, d: dict[str, str]) -> bool: + return bool(d) and all( + isinstance(k, str) and isinstance(v, str) and k.strip() and v.strip() for k, v in d.items() + ) + + @classmethod + def validate_connection( + cls, + connection: Connection, + include_credentials: bool, + *, + expected_connection_type: Optional[ConnectionType] = None, + expected_connection_name: Optional[str] = None, + expected_authentication_type: Optional[CredentialType] = None, + expected_is_default: Optional[bool] = None, + ): + assert connection.id is not None + + TestBase.assert_equal_or_not_none(connection.name, expected_connection_name) + TestBase.assert_equal_or_not_none(connection.type, expected_connection_type) + TestBase.assert_equal_or_not_none(connection.credentials.type, expected_authentication_type) + + if expected_is_default is not None: + assert connection.is_default == expected_is_default + + if include_credentials: + if type(connection.credentials) == ApiKeyCredentials: + assert connection.credentials.type == CredentialType.API_KEY + assert connection.credentials.api_key is not None + + @classmethod + def validate_deployment( + cls, + deployment: Deployment, + *, + expected_model_name: Optional[str] = None, + expected_model_deployment_name: Optional[str] = None, + expected_model_publisher: Optional[str] = None, + ): + assert type(deployment) == ModelDeployment + assert deployment.type == DeploymentType.MODEL_DEPLOYMENT + assert deployment.model_version is not None + # 
Comment out the below, since I see that `Cohere-embed-v3-english` has an empty capabilities dict. + # assert TestBase.is_valid_dict(deployment.capabilities) + assert bool(deployment.sku) # Check none-empty + + TestBase.assert_equal_or_not_none(deployment.model_name, expected_model_name) + TestBase.assert_equal_or_not_none(deployment.name, expected_model_deployment_name) + TestBase.assert_equal_or_not_none(deployment.model_publisher, expected_model_publisher) + + @classmethod + def validate_index( + cls, + index: Index, + *, + expected_index_type: Optional[IndexType] = None, + expected_index_name: Optional[str] = None, + expected_index_version: Optional[str] = None, + expected_ai_search_connection_name: Optional[str] = None, + expected_ai_search_index_name: Optional[str] = None, + ): + + TestBase.assert_equal_or_not_none(index.name, expected_index_name) + TestBase.assert_equal_or_not_none(index.version, expected_index_version) + + if expected_index_type == IndexType.AZURE_SEARCH: + assert type(index) == AzureAISearchIndex + assert index.type == IndexType.AZURE_SEARCH + TestBase.assert_equal_or_not_none(index.connection_name, expected_ai_search_connection_name) + TestBase.assert_equal_or_not_none(index.index_name, expected_ai_search_index_name) + + @classmethod + def validate_dataset( + cls, + dataset: DatasetVersion, + *, + expected_dataset_type: Optional[DatasetType] = None, + expected_dataset_name: Optional[str] = None, + expected_dataset_version: Optional[str] = None, + expected_connection_name: Optional[str] = None, + ): + assert dataset.data_uri is not None + + if expected_dataset_type: + assert dataset.type == expected_dataset_type + else: + assert dataset.type == DatasetType.URI_FILE or dataset.type == DatasetType.URI_FOLDER + + TestBase.assert_equal_or_not_none(dataset.name, expected_dataset_name) + TestBase.assert_equal_or_not_none(dataset.version, expected_dataset_version) + if expected_connection_name: + assert dataset.connection_name == 
expected_connection_name + + @classmethod + def validate_asset_credential(cls, asset_credential: AssetCredentialResponse): + + assert asset_credential.blob_reference is not None + assert asset_credential.blob_reference.blob_uri + assert asset_credential.blob_reference.storage_account_arm_id + + assert asset_credential.blob_reference.credential is not None + assert ( + asset_credential.blob_reference.credential.type == "SAS" + ) # Why is this not of type CredentialType.SAS as defined for Connections? + assert asset_credential.blob_reference.credential.sas_uri diff --git a/sdk/ai/azure-ai-projects/tests/test_connections.py b/sdk/ai/azure-ai-projects/tests/test_connections.py new file mode 100644 index 000000000000..55db3a70288a --- /dev/null +++ b/sdk/ai/azure-ai-projects/tests/test_connections.py @@ -0,0 +1,64 @@ +# ------------------------------------ +# Copyright (c) Microsoft Corporation. +# Licensed under the MIT License. +# ------------------------------------ + +from azure.ai.projects import AIProjectClient +from test_base import TestBase, servicePreparer +from devtools_testutils import recorded_by_proxy + + +class TestConnections(TestBase): + + # To run this test, use the following command in the \sdk\ai\azure-ai-projects folder: + # cls & pytest tests\test_connections.py::TestConnections::test_connections -s + @servicePreparer() + @recorded_by_proxy + def test_connections(self, **kwargs): + + endpoint = kwargs.pop("azure_ai_projects_tests_project_endpoint") + print("\n=====> Endpoint:", endpoint) + + connection_name = self.test_connections_params["connection_name"] + connection_type = self.test_connections_params["connection_type"] + + with AIProjectClient( + endpoint=endpoint, + credential=self.get_credential(AIProjectClient, is_async=False), + ) as project_client: + + print("[test_connections] List all connections") + empty = True + for connection in project_client.connections.list(): + empty = False + TestBase.validate_connection(connection, False) + 
assert not empty + + print("[test_connections] List all connections of a particular type") + empty = True + for connection in project_client.connections.list( + connection_type=connection_type, + ): + empty = False + TestBase.validate_connection(connection, False, expected_connection_type=connection_type) + assert not empty + + print("[test_connections] Get the default connection of a particular type, without its credentials") + connection = project_client.connections.get_default(connection_type=connection_type) + TestBase.validate_connection(connection, False, expected_connection_type=connection_type) + + print("[test_connections] Get the default connection of a particular type, with its credentials") + connection = project_client.connections.get_default( + connection_type=connection_type, include_credentials=True + ) + TestBase.validate_connection( + connection, True, expected_connection_type=connection_type, expected_is_default=True + ) + + print(f"[test_connections] Get the connection named `{connection_name}`, without its credentials") + connection = project_client.connections.get(connection_name) + TestBase.validate_connection(connection, False, expected_connection_name=connection_name) + + print(f"[test_connections] Get the connection named `{connection_name}`, with its credentials") + connection = project_client.connections.get(connection_name, include_credentials=True) + TestBase.validate_connection(connection, True, expected_connection_name=connection_name) diff --git a/sdk/ai/azure-ai-projects/tests/test_connections_async.py b/sdk/ai/azure-ai-projects/tests/test_connections_async.py new file mode 100644 index 000000000000..147bad39de9b --- /dev/null +++ b/sdk/ai/azure-ai-projects/tests/test_connections_async.py @@ -0,0 +1,64 @@ +# ------------------------------------ +# Copyright (c) Microsoft Corporation. +# Licensed under the MIT License. 
+# ------------------------------------ + +from azure.ai.projects.aio import AIProjectClient +from test_base import TestBase, servicePreparer +from devtools_testutils.aio import recorded_by_proxy_async + + +class TestConnectionsAsync(TestBase): + + # To run this test, use the following command in the \sdk\ai\azure-ai-projects folder: + # cls & pytest tests\test_connections_async.py::TestConnectionsAsync::test_connections_async -s + @servicePreparer() + @recorded_by_proxy_async + async def test_connections_async(self, **kwargs): + + endpoint = kwargs.pop("azure_ai_projects_tests_project_endpoint") + print("\n=====> Endpoint:", endpoint) + + connection_name = self.test_connections_params["connection_name"] + connection_type = self.test_connections_params["connection_type"] + + async with AIProjectClient( + endpoint=endpoint, + credential=self.get_credential(AIProjectClient, is_async=True), + ) as project_client: + + print("[test_connections_async] List all connections") + empty = True + async for connection in project_client.connections.list(): + empty = False + TestBase.validate_connection(connection, False) + assert not empty + + print("[test_connections_async] List all connections of a particular type") + empty = True + async for connection in project_client.connections.list( + connection_type=connection_type, + ): + empty = False + TestBase.validate_connection(connection, False, expected_connection_type=connection_type) + assert not empty + + print("[test_connections_async] Get the default connection of a particular type, without its credentials") + connection = await project_client.connections.get_default(connection_type=connection_type) + TestBase.validate_connection(connection, False, expected_connection_type=connection_type) + + print("[test_connections_async] Get the default connection of a particular type, with its credentials") + connection = await project_client.connections.get_default( + connection_type=connection_type, include_credentials=True + ) + 
TestBase.validate_connection( + connection, True, expected_connection_type=connection_type, expected_is_default=True + ) + + print(f"[test_connections_async] Get the connection named `{connection_name}`, without its credentials") + connection = await project_client.connections.get(connection_name) + TestBase.validate_connection(connection, False, expected_connection_name=connection_name) + + print(f"[test_connections_async] Get the connection named `{connection_name}`, with its credentials") + connection = await project_client.connections.get(connection_name, include_credentials=True) + TestBase.validate_connection(connection, True, expected_connection_name=connection_name) diff --git a/sdk/ai/azure-ai-projects/tests/test_data/datasets/data_file1.txt b/sdk/ai/azure-ai-projects/tests/test_data/datasets/data_file1.txt new file mode 100644 index 000000000000..e129759a15ff --- /dev/null +++ b/sdk/ai/azure-ai-projects/tests/test_data/datasets/data_file1.txt @@ -0,0 +1 @@ +This is sample file 1 diff --git a/sdk/ai/azure-ai-projects/tests/test_data/datasets/data_file2.txt b/sdk/ai/azure-ai-projects/tests/test_data/datasets/data_file2.txt new file mode 100644 index 000000000000..3dd74cdfc9eb --- /dev/null +++ b/sdk/ai/azure-ai-projects/tests/test_data/datasets/data_file2.txt @@ -0,0 +1 @@ +This is sample file 2 diff --git a/sdk/ai/azure-ai-projects/tests/test_data/datasets/data_subfolder/data_file3.txt b/sdk/ai/azure-ai-projects/tests/test_data/datasets/data_subfolder/data_file3.txt new file mode 100644 index 000000000000..dde35c02f5a4 --- /dev/null +++ b/sdk/ai/azure-ai-projects/tests/test_data/datasets/data_subfolder/data_file3.txt @@ -0,0 +1 @@ +This is sample file 3 diff --git a/sdk/ai/azure-ai-projects/tests/test_data/datasets/data_subfolder/data_file4.txt b/sdk/ai/azure-ai-projects/tests/test_data/datasets/data_subfolder/data_file4.txt new file mode 100644 index 000000000000..0d17a14a0c1f --- /dev/null +++ 
b/sdk/ai/azure-ai-projects/tests/test_data/datasets/data_subfolder/data_file4.txt @@ -0,0 +1 @@ +This is sample file 4 diff --git a/sdk/ai/azure-ai-projects/tests/test_datasets.py b/sdk/ai/azure-ai-projects/tests/test_datasets.py new file mode 100644 index 000000000000..796ae52c8289 --- /dev/null +++ b/sdk/ai/azure-ai-projects/tests/test_datasets.py @@ -0,0 +1,195 @@ +# pylint: disable=line-too-long,useless-suppression +# ------------------------------------ +# Copyright (c) Microsoft Corporation. +# Licensed under the MIT License. +# ------------------------------------ +import os +import re +import pytest +from azure.ai.projects import AIProjectClient +from azure.ai.projects.models import DatasetVersion, DatasetType +from test_base import TestBase, servicePreparer +from devtools_testutils import recorded_by_proxy, is_live_and_not_recording +from azure.core.exceptions import HttpResponseError + + +# Construct the paths to the data folder and data file used in this test +script_dir = os.path.dirname(os.path.abspath(__file__)) +data_folder = os.environ.get("DATA_FOLDER", os.path.join(script_dir, "test_data/datasets")) +data_file1 = os.path.join(data_folder, "data_file1.txt") +data_file2 = os.path.join(data_folder, "data_file2.txt") + + +class TestDatasets(TestBase): + + # To run this test, use the following command in the \sdk\ai\azure-ai-projects folder: + # cls & pytest tests\test_datasets.py::TestDatasets::test_datasets_upload_file -s + @servicePreparer() + @pytest.mark.skipif( + not is_live_and_not_recording(), + reason="Skipped because this test involves network calls from another client (azure.storage.blob) that is not recorded.", + ) + @recorded_by_proxy + def test_datasets_upload_file(self, **kwargs): + + endpoint = kwargs.pop("azure_ai_projects_tests_project_endpoint") + print("\n=====> Endpoint:", endpoint) + + connection_name = self.test_datasets_params["connection_name"] + dataset_name = self.test_datasets_params["dataset_name_1"] + dataset_version = 
self.test_datasets_params["dataset_version"] + + with AIProjectClient( + endpoint=endpoint, + credential=self.get_credential(AIProjectClient, is_async=False), + ) as project_client: + + print( + f"[test_datasets_upload_file] Upload a single file and create a new Dataset `{dataset_name}`, version `{dataset_version}`, to reference the file." + ) + dataset: DatasetVersion = project_client.datasets.upload_file( + name=dataset_name, + version=str(dataset_version), + file_path=data_file1, + connection_name=connection_name, + ) + print(dataset) + TestBase.validate_dataset( + dataset, + expected_dataset_type=DatasetType.URI_FILE, + expected_dataset_name=dataset_name, + expected_dataset_version=str(dataset_version), + ) + + print(f"[test_datasets_upload_file] Get an existing Dataset version `{dataset_version}`:") + dataset = project_client.datasets.get(name=dataset_name, version=dataset_version) + print(dataset) + TestBase.validate_dataset( + dataset, + expected_dataset_type=DatasetType.URI_FILE, + expected_dataset_name=dataset_name, + expected_dataset_version=str(dataset_version), + ) + + print( + f"[test_datasets_upload_file] Upload a single file and create a new version in existing Dataset `{dataset_name}`, to reference the file." 
+ ) + dataset: DatasetVersion = project_client.datasets.upload_file( + name=dataset_name, + version=str(dataset_version + 1), + file_path=data_file2, + connection_name=connection_name, + ) + print(dataset) + TestBase.validate_dataset( + dataset, + expected_dataset_type=DatasetType.URI_FILE, + expected_dataset_name=dataset_name, + expected_dataset_version=str(dataset_version + 1), + ) + + print(f"[test_datasets_upload_file] Get credentials of an existing Dataset version `{dataset_version}`:") + asset_credential = project_client.datasets.get_credentials(name=dataset_name, version=str(dataset_version)) + print(asset_credential) + TestBase.validate_asset_credential(asset_credential) + + """ + print("[test_datasets_upload_file] List latest versions of all Datasets:") + empty = True + for dataset in project_client.datasets.list(): + empty = False + print(dataset) + TestBase.validate_dataset(dataset) + assert not empty + + print(f"[test_datasets_upload_file] Listing all versions of the Dataset named `{dataset_name}`:") + empty = True + for dataset in project_client.datasets.list_versions(name=dataset_name): + empty = False + print(dataset) + TestBase.validate_dataset(dataset, expected_dataset_name=dataset_name) + assert not empty + """ + + print( + f"[test_datasets_upload_file] Delete Dataset `{dataset_name}`, version `{dataset_version}` that was created above." + ) + project_client.datasets.delete(name=dataset_name, version=str(dataset_version)) + project_client.datasets.delete(name=dataset_name, version=str(dataset_version + 1)) + + print( + "[test_datasets_upload_file] Delete the same (now non-existing) Dataset. REST API call should return 204 (No content). This call should NOT throw an exception." + ) + project_client.datasets.delete(name=dataset_name, version=str(dataset_version)) + + print( + f"[test_datasets_upload_file] Try to get a non-existing Dataset `{dataset_name}`, version `{dataset_version}`. This should throw an exception." 
+ ) + try: + exception_thrown = False + dataset = project_client.datasets.get(name=dataset_name, version=str(dataset_version)) + except HttpResponseError as e: + exception_thrown = True + print(f"Expected exception occurred: {e}") + assert "Could not find asset with ID" in e.message + assert exception_thrown + + # To run this test, use the following command in the \sdk\ai\azure-ai-projects folder: + # cls & pytest tests\test_datasets.py::TestDatasets::test_datasets_upload_folder -s + @servicePreparer() + @pytest.mark.skipif( + not is_live_and_not_recording(), + reason="Skipped because this test involves network calls from another client (azure.storage.blob) that is not recorded.", + ) + @recorded_by_proxy + def test_datasets_upload_folder(self, **kwargs): + + endpoint = kwargs.pop("azure_ai_projects_tests_project_endpoint") + print("\n=====> Endpoint:", endpoint) + + connection_name = self.test_datasets_params["connection_name"] + dataset_name = self.test_datasets_params["dataset_name_2"] + dataset_version = self.test_datasets_params["dataset_version"] + + with AIProjectClient( + endpoint=endpoint, + credential=self.get_credential(AIProjectClient, is_async=False), + ) as project_client: + + print( + f"[test_datasets_upload_folder] Upload files in a folder (including sub-folders) and create a new version `{dataset_version}` in the same Dataset, to reference the files." 
+            )
+            dataset = project_client.datasets.upload_folder(
+                name=dataset_name,
+                version=str(dataset_version),
+                folder=data_folder,
+                connection_name=connection_name,
+                file_pattern=re.compile(r"\.(txt|csv|md)$", re.IGNORECASE),
+            )
+            print(dataset)
+            TestBase.validate_dataset(
+                dataset,
+                expected_dataset_type=DatasetType.URI_FOLDER,
+                expected_dataset_name=dataset_name,
+                expected_dataset_version=str(dataset_version),
+            )
+
+            print(f"[test_datasets_upload_folder] Get an existing Dataset version `{dataset_version}`:")
+            dataset = project_client.datasets.get(name=dataset_name, version=str(dataset_version))
+            print(dataset)
+            TestBase.validate_dataset(
+                dataset,
+                expected_dataset_type=DatasetType.URI_FOLDER,
+                expected_dataset_name=dataset_name,
+                expected_dataset_version=str(dataset_version),
+            )
+
+            print(f"[test_datasets_upload_folder] Get credentials of an existing Dataset version `{dataset_version}`:")
+            asset_credential = project_client.datasets.get_credentials(name=dataset_name, version=str(dataset_version))
+            print(asset_credential)
+            TestBase.validate_asset_credential(asset_credential)
+
+            print(
+                f"[test_datasets_upload_folder] Delete Dataset `{dataset_name}`, version `{dataset_version}` that was created above."
+            )
+            project_client.datasets.delete(name=dataset_name, version=str(dataset_version))
diff --git a/sdk/ai/azure-ai-projects/tests/test_datasets_async.py b/sdk/ai/azure-ai-projects/tests/test_datasets_async.py
new file mode 100644
index 000000000000..d1df117dd914
--- /dev/null
+++ b/sdk/ai/azure-ai-projects/tests/test_datasets_async.py
@@ -0,0 +1,200 @@
+# pylint: disable=line-too-long,useless-suppression
+# ------------------------------------
+# Copyright (c) Microsoft Corporation.
+# Licensed under the MIT License.
+# ------------------------------------
+import os
+import re
+import pytest
+from azure.ai.projects.aio import AIProjectClient
+from azure.ai.projects.models import DatasetVersion, DatasetType
+from test_base import TestBase, servicePreparer
+from devtools_testutils.aio import recorded_by_proxy_async
+from devtools_testutils import is_live_and_not_recording
+from azure.core.exceptions import HttpResponseError
+
+
+# Construct the paths to the data folder and data file used in this test
+script_dir = os.path.dirname(os.path.abspath(__file__))
+data_folder = os.environ.get("DATA_FOLDER", os.path.join(script_dir, "test_data/datasets"))
+data_file1 = os.path.join(data_folder, "data_file1.txt")
+data_file2 = os.path.join(data_folder, "data_file2.txt")
+
+
+class TestDatasetsAsync(TestBase):
+
+    # To run this test, use the following command in the \sdk\ai\azure-ai-projects folder:
+    # cls & pytest tests\test_datasets_async.py::TestDatasetsAsync::test_datasets_upload_file -s
+    @servicePreparer()
+    @pytest.mark.skipif(
+        not is_live_and_not_recording(),
+        reason="Skipped because this test involves network calls from another client (azure.storage.blob) that is not recorded.",
+    )
+    @recorded_by_proxy_async
+    async def test_datasets_upload_file(self, **kwargs):
+
+        endpoint = kwargs.pop("azure_ai_projects_tests_project_endpoint")
+        print("\n=====> Endpoint:", endpoint)
+
+        connection_name = self.test_datasets_params["connection_name"]
+        dataset_name = self.test_datasets_params["dataset_name_3"]
+        dataset_version = self.test_datasets_params["dataset_version"]
+
+        async with AIProjectClient(
+            endpoint=endpoint,
+            credential=self.get_credential(AIProjectClient, is_async=True),
+        ) as project_client:
+
+            print(
+                f"[test_datasets_upload_file] Upload a single file and create a new Dataset `{dataset_name}`, version `{dataset_version}`, to reference the file."
+            )
+            dataset: DatasetVersion = await project_client.datasets.upload_file(
+                name=dataset_name,
+                version=str(dataset_version),
+                file_path=data_file1,
+                connection_name=connection_name,
+            )
+            print(dataset)
+            TestBase.validate_dataset(
+                dataset,
+                expected_dataset_type=DatasetType.URI_FILE,
+                expected_dataset_name=dataset_name,
+                expected_dataset_version=str(dataset_version),
+            )
+
+            print(f"[test_datasets_upload_file] Get an existing Dataset version `{dataset_version}`:")
+            dataset = await project_client.datasets.get(name=dataset_name, version=dataset_version)
+            print(dataset)
+            TestBase.validate_dataset(
+                dataset,
+                expected_dataset_type=DatasetType.URI_FILE,
+                expected_dataset_name=dataset_name,
+                expected_dataset_version=str(dataset_version),
+            )
+
+            print(
+                f"[test_datasets_upload_file] Upload a single file and create a new version in existing Dataset `{dataset_name}`, to reference the file."
+            )
+            dataset: DatasetVersion = await project_client.datasets.upload_file(
+                name=dataset_name,
+                version=str(dataset_version + 1),
+                file_path=data_file2,
+                connection_name=connection_name,
+            )
+            print(dataset)
+            TestBase.validate_dataset(
+                dataset,
+                expected_dataset_type=DatasetType.URI_FILE,
+                expected_dataset_name=dataset_name,
+                expected_dataset_version=str(dataset_version + 1),
+            )
+
+            print(f"[test_datasets_upload_file] Get credentials of an existing Dataset version `{dataset_version}`:")
+            asset_credential = await project_client.datasets.get_credentials(
+                name=dataset_name, version=str(dataset_version)
+            )
+            print(asset_credential)
+            TestBase.validate_asset_credential(asset_credential)
+
+            """
+            print("[test_datasets_upload_file] List latest versions of all Datasets:")
+            empty = True
+            for dataset in project_client.datasets.list():
+                empty = False
+                print(dataset)
+                TestBase.validate_dataset(dataset)
+            assert not empty
+
+            print(f"[test_datasets_upload_file] Listing all versions of the Dataset named `{dataset_name}`:")
+            empty = True
+            for dataset in project_client.datasets.list_versions(name=dataset_name):
+                empty = False
+                print(dataset)
+                TestBase.validate_dataset(dataset, expected_dataset_name=dataset_name)
+            assert not empty
+            """
+
+            print(
+                f"[test_datasets_upload_file] Delete Dataset `{dataset_name}`, version `{dataset_version}` that was created above."
+            )
+            await project_client.datasets.delete(name=dataset_name, version=str(dataset_version))
+            await project_client.datasets.delete(name=dataset_name, version=str(dataset_version + 1))
+
+            print(
+                "[test_datasets_upload_file] Delete the same (now non-existing) Dataset. REST API call should return 204 (No content). This call should NOT throw an exception."
+            )
+            await project_client.datasets.delete(name=dataset_name, version=str(dataset_version))
+
+            print(
+                f"[test_datasets_upload_file] Try to get a non-existing Dataset `{dataset_name}`, version `{dataset_version}`. This should throw an exception."
+            )
+            try:
+                exception_thrown = False
+                dataset = await project_client.datasets.get(name=dataset_name, version=str(dataset_version))
+            except HttpResponseError as e:
+                exception_thrown = True
+                print(f"Expected exception occurred: {e}")
+                assert "Could not find asset with ID" in e.message
+            assert exception_thrown
+
+    # To run this test, use the following command in the \sdk\ai\azure-ai-projects folder:
+    # cls & pytest tests\test_datasets_async.py::TestDatasetsAsync::test_datasets_upload_folder_async -s
+    @servicePreparer()
+    @pytest.mark.skipif(
+        not is_live_and_not_recording(),
+        reason="Skipped because this test involves network calls from another client (azure.storage.blob) that is not recorded.",
+    )
+    @recorded_by_proxy_async
+    async def test_datasets_upload_folder_async(self, **kwargs):
+
+        endpoint = kwargs.pop("azure_ai_projects_tests_project_endpoint")
+        print("\n=====> Endpoint:", endpoint)
+
+        connection_name = self.test_datasets_params["connection_name"]
+        dataset_name = self.test_datasets_params["dataset_name_4"]
+        dataset_version = self.test_datasets_params["dataset_version"]
+
+        async with AIProjectClient(
+            endpoint=endpoint,
+            credential=self.get_credential(AIProjectClient, is_async=True),
+        ) as project_client:
+
+            print(
+                f"[test_datasets_upload_folder] Upload files in a folder (including sub-folders) and create a new version `{dataset_version}` in the same Dataset, to reference the files."
+            )
+            dataset = await project_client.datasets.upload_folder(
+                name=dataset_name,
+                version=str(dataset_version),
+                folder=data_folder,
+                connection_name=connection_name,
+                file_pattern=re.compile(r"\.(txt|csv|md)$", re.IGNORECASE),
+            )
+            print(dataset)
+            TestBase.validate_dataset(
+                dataset,
+                expected_dataset_type=DatasetType.URI_FOLDER,
+                expected_dataset_name=dataset_name,
+                expected_dataset_version=str(dataset_version),
+            )
+
+            print(f"[test_datasets_upload_folder] Get an existing Dataset version `{dataset_version}`:")
+            dataset = await project_client.datasets.get(name=dataset_name, version=str(dataset_version))
+            print(dataset)
+            TestBase.validate_dataset(
+                dataset,
+                expected_dataset_type=DatasetType.URI_FOLDER,
+                expected_dataset_name=dataset_name,
+                expected_dataset_version=str(dataset_version),
+            )
+
+            print(f"[test_datasets_upload_folder] Get credentials of an existing Dataset version `{dataset_version}`:")
+            asset_credential = await project_client.datasets.get_credentials(
+                name=dataset_name, version=str(dataset_version)
+            )
+            print(asset_credential)
+            TestBase.validate_asset_credential(asset_credential)
+
+            print(
+                f"[test_datasets_upload_folder] Delete Dataset `{dataset_name}`, version `{dataset_version}` that was created above."
+            )
+            await project_client.datasets.delete(name=dataset_name, version=str(dataset_version))
diff --git a/sdk/ai/azure-ai-projects/tests/test_deployments.py b/sdk/ai/azure-ai-projects/tests/test_deployments.py
new file mode 100644
index 000000000000..805f27e76d2b
--- /dev/null
+++ b/sdk/ai/azure-ai-projects/tests/test_deployments.py
@@ -0,0 +1,54 @@
+# ------------------------------------
+# Copyright (c) Microsoft Corporation.
+# Licensed under the MIT License.
+# ------------------------------------
+
+from azure.ai.projects import AIProjectClient
+from test_base import TestBase, servicePreparer
+from devtools_testutils import recorded_by_proxy
+
+
+class TestDeployments(TestBase):
+
+    # To run this test, use the following command in the \sdk\ai\azure-ai-projects folder:
+    # cls & pytest tests\test_deployments.py::TestDeployments::test_deployments -s
+    @servicePreparer()
+    @recorded_by_proxy
+    def test_deployments(self, **kwargs):
+
+        endpoint = kwargs.pop("azure_ai_projects_tests_project_endpoint")
+        print("\n=====> Endpoint:", endpoint)
+
+        model_publisher = self.test_deployments_params["model_publisher"]
+        model_name = self.test_deployments_params["model_name"]
+        model_deployment_name = self.test_deployments_params["model_deployment_name"]
+
+        with AIProjectClient(
+            endpoint=endpoint,
+            credential=self.get_credential(AIProjectClient, is_async=False),
+        ) as project_client:
+
+            print("[test_deployments] List all deployments")
+            empty = True
+            for deployment in project_client.deployments.list():
+                empty = False
+                TestBase.validate_deployment(deployment)
+            assert not empty
+
+            print(f"[test_deployments] List all deployments by the model publisher `{model_publisher}`")
+            empty = True
+            for deployment in project_client.deployments.list(model_publisher=model_publisher):
+                empty = False
+                TestBase.validate_deployment(deployment, expected_model_publisher=model_publisher)
+            assert not empty
+
+            print(f"[test_deployments] List all deployments of model `{model_name}`")
+            empty = True
+            for deployment in project_client.deployments.list(model_name=model_name):
+                empty = False
+                TestBase.validate_deployment(deployment, expected_model_name=model_name)
+            assert not empty
+
+            print(f"[test_deployments] Get a single deployment named `{model_deployment_name}`")
+            deployment = project_client.deployments.get(model_deployment_name)
+            TestBase.validate_deployment(deployment, expected_model_deployment_name=model_deployment_name)
diff --git a/sdk/ai/azure-ai-projects/tests/test_deployments_async.py b/sdk/ai/azure-ai-projects/tests/test_deployments_async.py
new file mode 100644
index 000000000000..493d71935993
--- /dev/null
+++ b/sdk/ai/azure-ai-projects/tests/test_deployments_async.py
@@ -0,0 +1,54 @@
+# ------------------------------------
+# Copyright (c) Microsoft Corporation.
+# Licensed under the MIT License.
+# ------------------------------------
+
+from azure.ai.projects.aio import AIProjectClient
+from test_base import TestBase, servicePreparer
+from devtools_testutils.aio import recorded_by_proxy_async
+
+
+class TestDeploymentsAsync(TestBase):
+
+    # To run this test, use the following command in the \sdk\ai\azure-ai-projects folder:
+    # cls & pytest tests\test_deployments_async.py::TestDeploymentsAsync::test_deployments_async -s
+    @servicePreparer()
+    @recorded_by_proxy_async
+    async def test_deployments_async(self, **kwargs):
+
+        endpoint = kwargs.pop("azure_ai_projects_tests_project_endpoint")
+        print("\n=====> Endpoint:", endpoint)
+
+        model_publisher = self.test_deployments_params["model_publisher"]
+        model_name = self.test_deployments_params["model_name"]
+        model_deployment_name = self.test_deployments_params["model_deployment_name"]
+
+        async with AIProjectClient(
+            endpoint=endpoint,
+            credential=self.get_credential(AIProjectClient, is_async=True),
+        ) as project_client:
+
+            print("[test_deployments_async] List all deployments")
+            empty = True
+            async for deployment in project_client.deployments.list():
+                empty = False
+                TestBase.validate_deployment(deployment)
+            assert not empty
+
+            print(f"[test_deployments_async] List all deployments by the model publisher `{model_publisher}`")
+            empty = True
+            async for deployment in project_client.deployments.list(model_publisher=model_publisher):
+                empty = False
+                TestBase.validate_deployment(deployment, expected_model_publisher=model_publisher)
+            assert not empty
+
+            print(f"[test_deployments_async] List all deployments of model `{model_name}`")
+            empty = True
+            async for deployment in project_client.deployments.list(model_name=model_name):
+                empty = False
+                TestBase.validate_deployment(deployment, expected_model_name=model_name)
+            assert not empty
+
+            print(f"[test_deployments_async] Get a single deployment named `{model_deployment_name}`")
+            deployment = await project_client.deployments.get(model_deployment_name)
+            TestBase.validate_deployment(deployment, expected_model_deployment_name=model_deployment_name)
diff --git a/sdk/ai/azure-ai-projects/tests/test_indexes.py b/sdk/ai/azure-ai-projects/tests/test_indexes.py
new file mode 100644
index 000000000000..d6dd6969e780
--- /dev/null
+++ b/sdk/ai/azure-ai-projects/tests/test_indexes.py
@@ -0,0 +1,86 @@
+# pylint: disable=line-too-long,useless-suppression
+# ------------------------------------
+# Copyright (c) Microsoft Corporation.
+# Licensed under the MIT License.
+# ------------------------------------
+
+from azure.ai.projects import AIProjectClient
+from azure.ai.projects.models import AzureAISearchIndex, IndexType
+from test_base import TestBase, servicePreparer
+from devtools_testutils import recorded_by_proxy
+
+
+class TestIndexes(TestBase):
+
+    # To run this test, use the following command in the \sdk\ai\azure-ai-projects folder:
+    # cls & pytest tests\test_indexes.py::TestIndexes::test_indexes -s
+    @servicePreparer()
+    @recorded_by_proxy
+    def test_indexes(self, **kwargs):
+
+        endpoint = kwargs.pop("azure_ai_projects_tests_project_endpoint")
+        print("\n=====> Endpoint:", endpoint)
+
+        index_name = self.test_indexes_params["index_name"]
+        index_version = self.test_indexes_params["index_version"]
+        ai_search_connection_name = self.test_indexes_params["ai_search_connection_name"]
+        ai_search_index_name = self.test_indexes_params["ai_search_index_name"]
+
+        with AIProjectClient(
+            endpoint=endpoint,
+            credential=self.get_credential(AIProjectClient, is_async=False),
+        ) as project_client:
+
+            print(
+                f"[test_indexes] Create Index `{index_name}` with version `{index_version}`, referencing an existing AI Search resource:"
+            )
+            index = project_client.indexes.create_or_update(
+                name=index_name,
+                version=index_version,
+                index=AzureAISearchIndex(connection_name=ai_search_connection_name, index_name=ai_search_index_name),
+            )
+            print(index)
+            TestBase.validate_index(
+                index,
+                expected_index_type=IndexType.AZURE_SEARCH,
+                expected_index_name=index_name,
+                expected_index_version=index_version,
+                expected_ai_search_connection_name=ai_search_connection_name,
+                expected_ai_search_index_name=ai_search_index_name,
+            )
+
+            print(f"[test_indexes] Get Index `{index_name}` version `{index_version}`:")
+            index = project_client.indexes.get(name=index_name, version=index_version)
+            print(index)
+            TestBase.validate_index(
+                index,
+                expected_index_type=IndexType.AZURE_SEARCH,
+                expected_index_name=index_name,
+                expected_index_version=index_version,
+                expected_ai_search_connection_name=ai_search_connection_name,
+                expected_ai_search_index_name=ai_search_index_name,
+            )
+
+            print("[test_indexes] List latest versions of all Indexes:")
+            empty = True
+            for index in project_client.indexes.list():
+                empty = False
+                print(index)
+                TestBase.validate_index(index)
+            assert not empty
+
+            print(f"[test_indexes] Listing all versions of the Index named `{index_name}`:")
+            empty = True
+            for index in project_client.indexes.list_versions(name=index_name):
+                empty = False
+                print(index)
+                TestBase.validate_index(index)
+            assert not empty
+
+            print(f"[test_indexes] Delete Index `{index_name}` version `{index_version}`.")
+            project_client.indexes.delete(name=index_name, version=index_version)
+
+            print(
+                f"[test_indexes] Again delete Index `{index_name}` version `{index_version}`. Since it does not exist, the REST API should return 204 (No content). This call should NOT throw an exception."
+            )
+            project_client.indexes.delete(name=index_name, version=index_version)
diff --git a/sdk/ai/azure-ai-projects/tests/test_indexes_async.py b/sdk/ai/azure-ai-projects/tests/test_indexes_async.py
new file mode 100644
index 000000000000..59c3e430a00b
--- /dev/null
+++ b/sdk/ai/azure-ai-projects/tests/test_indexes_async.py
@@ -0,0 +1,86 @@
+# pylint: disable=line-too-long,useless-suppression
+# ------------------------------------
+# Copyright (c) Microsoft Corporation.
+# Licensed under the MIT License.
+# ------------------------------------
+
+from azure.ai.projects.aio import AIProjectClient
+from azure.ai.projects.models import AzureAISearchIndex, IndexType
+from test_base import TestBase, servicePreparer
+from devtools_testutils.aio import recorded_by_proxy_async
+
+
+class TestIndexesAsync(TestBase):
+
+    # To run this test, use the following command in the \sdk\ai\azure-ai-projects folder:
+    # cls & pytest tests\test_indexes_async.py::TestIndexesAsync::test_indexes_async -s
+    @servicePreparer()
+    @recorded_by_proxy_async
+    async def test_indexes_async(self, **kwargs):
+
+        endpoint = kwargs.pop("azure_ai_projects_tests_project_endpoint")
+        print("\n=====> Endpoint:", endpoint)
+
+        index_name = self.test_indexes_params["index_name"]
+        index_version = self.test_indexes_params["index_version"]
+        ai_search_connection_name = self.test_indexes_params["ai_search_connection_name"]
+        ai_search_index_name = self.test_indexes_params["ai_search_index_name"]
+
+        async with AIProjectClient(
+            endpoint=endpoint,
+            credential=self.get_credential(AIProjectClient, is_async=True),
+        ) as project_client:
+
+            print(
+                f"[test_indexes_async] Create Index `{index_name}` with version `{index_version}`, referencing an existing AI Search resource:"
+            )
+            index = await project_client.indexes.create_or_update(
+                name=index_name,
+                version=index_version,
+                index=AzureAISearchIndex(connection_name=ai_search_connection_name, index_name=ai_search_index_name),
+            )
+            print(index)
+            TestBase.validate_index(
+                index,
+                expected_index_type=IndexType.AZURE_SEARCH,
+                expected_index_name=index_name,
+                expected_index_version=index_version,
+                expected_ai_search_connection_name=ai_search_connection_name,
+                expected_ai_search_index_name=ai_search_index_name,
+            )
+
+            print(f"[test_indexes_async] Get Index `{index_name}` version `{index_version}`:")
+            index = await project_client.indexes.get(name=index_name, version=index_version)
+            print(index)
+            TestBase.validate_index(
+                index,
+                expected_index_type=IndexType.AZURE_SEARCH,
+                expected_index_name=index_name,
+                expected_index_version=index_version,
+                expected_ai_search_connection_name=ai_search_connection_name,
+                expected_ai_search_index_name=ai_search_index_name,
+            )
+
+            print("[test_indexes_async] List latest versions of all Indexes:")
+            empty = True
+            async for index in project_client.indexes.list():
+                empty = False
+                print(index)
+                TestBase.validate_index(index)
+            assert not empty
+
+            print(f"[test_indexes_async] Listing all versions of the Index named `{index_name}`:")
+            empty = True
+            async for index in project_client.indexes.list_versions(name=index_name):
+                empty = False
+                print(index)
+                TestBase.validate_index(index)
+            assert not empty
+
+            print(f"[test_indexes_async] Delete Index `{index_name}` version `{index_version}`.")
+            await project_client.indexes.delete(name=index_name, version=index_version)
+
+            print(
+                f"[test_indexes_async] Again delete Index `{index_name}` version `{index_version}`. Since it does not exist, the REST API should return 204 (No content). This call should NOT throw an exception."
+            )
+            await project_client.indexes.delete(name=index_name, version=index_version)
diff --git a/sdk/ai/azure-ai-projects/tests/test_inference.py b/sdk/ai/azure-ai-projects/tests/test_inference.py
new file mode 100644
index 000000000000..c7809977271e
--- /dev/null
+++ b/sdk/ai/azure-ai-projects/tests/test_inference.py
@@ -0,0 +1,101 @@
+# pylint: disable=line-too-long,useless-suppression
+# ------------------------------------
+# Copyright (c) Microsoft Corporation.
+# Licensed under the MIT License.
+# ------------------------------------
+import pprint
+
+import pytest
+from azure.ai.projects import AIProjectClient
+from test_base import TestBase, servicePreparer
+from devtools_testutils import recorded_by_proxy, is_live_and_not_recording
+
+
+class TestInference(TestBase):
+
+    # To run this test, use the following command in the \sdk\ai\azure-ai-projects folder:
+    # cls & pytest tests\test_inference.py::TestInference::test_inference -s
+    @servicePreparer()
+    @pytest.mark.skipif(
+        condition=(not is_live_and_not_recording()),
+        reason="Skipped because we cannot record chat completions call with AOAI client",
+    )
+    @recorded_by_proxy
+    def test_inference(self, **kwargs):
+
+        endpoint = kwargs.pop("azure_ai_projects_tests_project_endpoint")
+        print("\n=====> Endpoint:", endpoint)
+
+        model_deployment_name = self.test_inference_params["model_deployment_name"]
+        api_version = self.test_inference_params["aoai_api_version"]
+
+        with AIProjectClient(
+            endpoint=endpoint,
+            credential=self.get_credential(AIProjectClient, is_async=False),
+        ) as project_client:
+
+            print(
+                "[test_inference] Get an authenticated Azure OpenAI client for the parent AI Services resource, and perform a chat completion operation."
+            )
+            with project_client.inference.get_azure_openai_client(api_version=api_version) as client:
+
+                response = client.chat.completions.create(
+                    model=model_deployment_name,
+                    messages=[
+                        {
+                            "role": "user",
+                            "content": "How many feet are in a mile?",
+                        },
+                    ],
+                )
+
+                print("Raw dump of response object: ")
+                pprint.pprint(response)
+                print("Response message: ", response.choices[0].message.content)
+                contains = ["5280", "5,280"]
+                assert any(item in response.choices[0].message.content for item in contains)
+
+    # To run this test, use the following command in the \sdk\ai\azure-ai-projects folder:
+    # cls & pytest tests\test_inference.py::TestInference::test_inference_on_connection -s
+    @servicePreparer()
+    @pytest.mark.skipif(
+        condition=(not is_live_and_not_recording()),
+        reason="Skipped because we cannot record chat completions call with AOAI client",
+    )
+    @recorded_by_proxy
+    def test_inference_on_connection(self, **kwargs):
+
+        endpoint = kwargs.pop("azure_ai_projects_tests_project_endpoint")
+        print("\n=====> Endpoint:", endpoint)
+
+        connection_name = self.test_inference_params["connection_name"]
+        model_deployment_name = self.test_inference_params["model_deployment_name"]
+        api_version = self.test_inference_params["aoai_api_version"]
+
+        with AIProjectClient(
+            endpoint=endpoint,
+            credential=self.get_credential(AIProjectClient, is_async=False),
+        ) as project_client:
+
+            print(
+                "[test_inference_on_connection] Get an authenticated Azure OpenAI client for a connection AOAI service, and perform a chat completion operation."
+            )
+            with project_client.inference.get_azure_openai_client(
+                api_version=api_version, connection_name=connection_name
+            ) as client:
+
+                response = client.chat.completions.create(
+                    model=model_deployment_name,
+                    messages=[
+                        {
+                            "role": "user",
+                            "content": "How many feet are in a mile?",
+                        },
+                    ],
+                )
+
+                print("Raw dump of response object: ")
+                pprint.pprint(response)
+                print("Response message: ", response.choices[0].message.content)
+                contains = ["5280", "5,280"]
+                assert any(item in response.choices[0].message.content for item in contains)
diff --git a/sdk/ai/azure-ai-projects/tests/test_inference_async.py b/sdk/ai/azure-ai-projects/tests/test_inference_async.py
new file mode 100644
index 000000000000..052ad7f151c7
--- /dev/null
+++ b/sdk/ai/azure-ai-projects/tests/test_inference_async.py
@@ -0,0 +1,101 @@
+# pylint: disable=line-too-long,useless-suppression
+# ------------------------------------
+# Copyright (c) Microsoft Corporation.
+# Licensed under the MIT License.
+# ------------------------------------
+import pprint
+import pytest
+from azure.ai.projects.aio import AIProjectClient
+from test_base import TestBase, servicePreparer
+from devtools_testutils import is_live_and_not_recording
+from devtools_testutils.aio import recorded_by_proxy_async
+
+
+class TestInferenceAsync(TestBase):
+
+    # To run this test, use the following command in the \sdk\ai\azure-ai-projects folder:
+    # cls & pytest tests\test_inference_async.py::TestInferenceAsync::test_inference_async -s
+    @servicePreparer()
+    @pytest.mark.skipif(
+        not is_live_and_not_recording(),
+        reason="Skipped because we cannot record chat completions call with AOAI client",
+    )
+    @recorded_by_proxy_async
+    async def test_inference_async(self, **kwargs):
+
+        endpoint = kwargs.pop("azure_ai_projects_tests_project_endpoint")
+        print("\n=====> Endpoint:", endpoint)
+
+        model_deployment_name = self.test_inference_params["model_deployment_name"]
+        api_version = self.test_inference_params["aoai_api_version"]
+
+        async with AIProjectClient(
+            endpoint=endpoint,
+            credential=self.get_credential(AIProjectClient, is_async=True),
+        ) as project_client:
+
+            print(
+                "[test_inference_async] Get an authenticated Azure OpenAI client for the parent AI Services resource, and perform a chat completion operation."
+            )
+            async with await project_client.inference.get_azure_openai_client(api_version=api_version) as client:
+
+                response = await client.chat.completions.create(
+                    model=model_deployment_name,
+                    messages=[
+                        {
+                            "role": "user",
+                            "content": "How many feet are in a mile?",
+                        },
+                    ],
+                )
+
+                print("Raw dump of response object: ")
+                pprint.pprint(response)
+                print("Response message: ", response.choices[0].message.content)
+                contains = ["5280", "5,280"]
+                assert any(item in response.choices[0].message.content for item in contains)
+
+    # To run this test, use the following command in the \sdk\ai\azure-ai-projects folder:
+    # cls & pytest tests\test_inference_async.py::TestInferenceAsync::test_inference_on_connection_async -s
+    @servicePreparer()
+    @pytest.mark.skipif(
+        not is_live_and_not_recording(),
+        reason="Skipped because we cannot record chat completions call with AOAI client",
+    )
+    @recorded_by_proxy_async
+    async def test_inference_on_connection_async(self, **kwargs):
+
+        endpoint = kwargs.pop("azure_ai_projects_tests_project_endpoint")
+        print("\n=====> Endpoint:", endpoint)
+
+        connection_name = self.test_inference_params["connection_name"]
+        model_deployment_name = self.test_inference_params["model_deployment_name"]
+        api_version = self.test_inference_params["aoai_api_version"]
+
+        async with AIProjectClient(
+            endpoint=endpoint,
+            credential=self.get_credential(AIProjectClient, is_async=True),
+        ) as project_client:
+
+            print(
+                "[test_inference_on_connection_async] Get an authenticated Azure OpenAI client for a connection AOAI service, and perform a chat completion operation."
+            )
+            async with await project_client.inference.get_azure_openai_client(
+                api_version=api_version, connection_name=connection_name
+            ) as client:
+
+                response = await client.chat.completions.create(
+                    model=model_deployment_name,
+                    messages=[
+                        {
+                            "role": "user",
+                            "content": "How many feet are in a mile?",
+                        },
+                    ],
+                )
+
+                print("Raw dump of response object: ")
+                pprint.pprint(response)
+                print("Response message: ", response.choices[0].message.content)
+                contains = ["5280", "5,280"]
+                assert any(item in response.choices[0].message.content for item in contains)
diff --git a/sdk/ai/azure-ai-projects/tests/test_telemetry.py b/sdk/ai/azure-ai-projects/tests/test_telemetry.py
new file mode 100644
index 000000000000..5716366fc87e
--- /dev/null
+++ b/sdk/ai/azure-ai-projects/tests/test_telemetry.py
@@ -0,0 +1,35 @@
+# ------------------------------------
+# Copyright (c) Microsoft Corporation.
+# Licensed under the MIT License.
+# ------------------------------------
+
+from azure.ai.projects import AIProjectClient
+from test_base import TestBase, servicePreparer
+from devtools_testutils import recorded_by_proxy, is_live
+
+
+class TestTelemetry(TestBase):
+
+    # To run this test, use the following command in the \sdk\ai\azure-ai-projects folder:
+    # cls & pytest tests\test_telemetry.py::TestTelemetry::test_telemetry -s
+    @servicePreparer()
+    @recorded_by_proxy
+    def test_telemetry(self, **kwargs):
+
+        endpoint = kwargs.pop("azure_ai_projects_tests_project_endpoint")
+        print("\n=====> Endpoint:", endpoint)
+
+        with AIProjectClient(
+            endpoint=endpoint,
+            credential=self.get_credential(AIProjectClient, is_async=False),
+        ) as project_client:
+
+            print("[test_telemetry] Get the Application Insights connection string:")
+            connection_string = project_client.telemetry.get_connection_string()
+            assert connection_string
+            if is_live():
+                assert bool(self.REGEX_APPINSIGHTS_CONNECTION_STRING.match(connection_string))
+            else:
+                assert connection_string == "Sanitized-api-key"
+            assert connection_string == project_client.telemetry.get_connection_string()  # Test cached value
+            print("Application Insights connection string = " + connection_string)
diff --git a/sdk/ai/azure-ai-projects/tests/test_telemetry_async.py b/sdk/ai/azure-ai-projects/tests/test_telemetry_async.py
new file mode 100644
index 000000000000..86a96162b97a
--- /dev/null
+++ b/sdk/ai/azure-ai-projects/tests/test_telemetry_async.py
@@ -0,0 +1,36 @@
+# ------------------------------------
+# Copyright (c) Microsoft Corporation.
+# Licensed under the MIT License.
+# ------------------------------------
+
+from azure.ai.projects.aio import AIProjectClient
+from test_base import TestBase, servicePreparer
+from devtools_testutils.aio import recorded_by_proxy_async
+from devtools_testutils import is_live
+
+
+class TestTelemetryAsync(TestBase):
+
+    # To run this test, use the following command in the \sdk\ai\azure-ai-projects folder:
+    # cls & pytest tests\test_telemetry_async.py::TestTelemetryAsync::test_telemetry_async -s
+    @servicePreparer()
+    @recorded_by_proxy_async
+    async def test_telemetry_async(self, **kwargs):
+
+        endpoint = kwargs.pop("azure_ai_projects_tests_project_endpoint")
+        print("\n=====> Endpoint:", endpoint)
+
+        async with AIProjectClient(
+            endpoint=endpoint,
+            credential=self.get_credential(AIProjectClient, is_async=True),
+        ) as project_client:
+
+            print("[test_telemetry_async] Get the Application Insights connection string:")
+            connection_string = await project_client.telemetry.get_connection_string()
+            assert connection_string
+            if is_live():
+                assert bool(self.REGEX_APPINSIGHTS_CONNECTION_STRING.match(connection_string))
+            else:
+                assert connection_string == "Sanitized-api-key"
+            assert connection_string == await project_client.telemetry.get_connection_string()  # Test cached value
+            print("Application Insights connection string = " + connection_string)
diff --git a/sdk/ai/azure-ai-projects/tsp-location.yaml b/sdk/ai/azure-ai-projects/tsp-location.yaml
index 22421471ffc1..8d4f6f4fcb05 100644
--- a/sdk/ai/azure-ai-projects/tsp-location.yaml
+++ b/sdk/ai/azure-ai-projects/tsp-location.yaml
@@ -1,4 +1,4 @@
 directory: specification/ai/Azure.AI.Projects
-commit: 07a63adf249cb199d5abd179448c92cd6e3446c8
+commit: 219bafeb4ef7d4bf9d47b9bcebfc9c6696c5f379
 repo: Azure/azure-rest-api-specs
 additionalDirectories:
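
Several of the new tests above rely on the same "empty flag" idiom to assert that a pageable listing (`ItemPaged`/`AsyncItemPaged`) yielded at least one item while still validating each item as it streams in. A minimal standalone sketch of that idiom, using a hypothetical generator in place of the real `project_client.deployments.list()` call:

```python
from typing import Dict, Iterator


def list_deployments() -> Iterator[Dict[str, str]]:
    """Hypothetical stand-in for the pageable returned by deployments.list()."""
    yield {"name": "gpt-4o", "model_publisher": "OpenAI"}
    yield {"name": "text-embedding-3-small", "model_publisher": "OpenAI"}


# The "empty flag" idiom: the flag flips on the first yielded item,
# so the final assert fails only if the listing produced nothing.
empty = True
for deployment in list_deployments():
    empty = False
    # Per-item validation; the real tests call TestBase.validate_deployment here.
    assert "name" in deployment and "model_publisher" in deployment
assert not empty
```

Compared with `assert len(list(pageable)) > 0`, this form checks non-emptiness and validates items in a single pass, without first materializing the whole pageable into a list.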