diff --git a/README.md b/README.md
index 3c4b6f4..d294380 100644
--- a/README.md
+++ b/README.md
@@ -28,7 +28,7 @@ This solution uses the [Azure Functions OpenAI triggers and binding extension](h
- Create a chat session and interact with the OpenAI deployed model - Uses the Assistant bindings to interact with the OpenAI model and stores chat history in Azure storage tables automatically
- In the chat session, ask the LLM to store reminders and then later retrieve them. This capability is delivered by the AssistantSkills trigger in the OpenAI extension for Azure Functions
- Create Azure Functions in different programming languages (e.g. C#, Node, Python, Java, PowerShell) and easily swap them using a config file
-- Static web page is configured with AAD auth by default
+- Static web page is configured with Entra ID auth by default
@@ -46,59 +46,119 @@ This solution uses the [Azure Functions OpenAI triggers and binding extension](h
#### To Run Locally
-- [Azure Developer CLI](https://aka.ms/azure-dev/install)
-- [.NET 8](https://dotnet.microsoft.com/en-us/download/dotnet/8.0) - Backend Functions app is built using .NET 8
-- [Node.js](https://nodejs.org/en/download/) - Frontend is built in TypeScript
+- [.NET 8](https://dotnet.microsoft.com/en-us/download/dotnet/8.0) - `backend` Functions app is built using .NET 8
+- [Node.js](https://nodejs.org/en/download/) - `frontend` is built in TypeScript
+- [Azure Functions Core Tools](https://learn.microsoft.com/en-us/azure/azure-functions/functions-run-local?tabs=v4%2Clinux%2Ccsharp%2Cportal%2Cbash#install-the-azure-functions-core-tools) - Run and debug `backend` Functions locally
+- [Static Web Apps CLI](https://github.com/Azure/static-web-apps-cli#azure-static-web-apps-cli) - Run and debug `frontend` SWA locally
- [Git](https://git-scm.com/downloads)
+- [Azure Developer CLI](https://aka.ms/azure-dev/install) - Provision and deploy Azure Resources
- [Powershell 7+ (pwsh)](https://github.com/powershell/powershell) - For Windows users only.
- **Important**: Ensure you can run `pwsh.exe` from a PowerShell command. If this fails, you likely need to upgrade PowerShell.
-- [Static Web Apps Cli](https://github.com/Azure/static-web-apps-cli#azure-static-web-apps-cli)
-- [Azure Cli](https://learn.microsoft.com/en-us/cli/azure/install-azure-cli)
-- [Azure Functions Core Tools](https://learn.microsoft.com/en-us/azure/azure-functions/functions-run-local?tabs=v4%2Clinux%2Ccsharp%2Cportal%2Cbash#install-the-azure-functions-core-tools)
-> NOTE: Your Azure Account must have `Microsoft.Authorization/roleAssignments/write` permissions, such as [User Access Administrator](https://learn.microsoft.com/azure/role-based-access-control/built-in-roles#user-access-administrator) or [Owner](https://learn.microsoft.com/azure/role-based-access-control/built-in-roles#owner).
-### Installation
+> NOTE: Your Azure Account must have `Microsoft.Authorization/roleAssignments/write` permissions, such as [User Access Administrator](https://learn.microsoft.com/azure/role-based-access-control/built-in-roles#user-access-administrator) or [Owner](https://learn.microsoft.com/azure/role-based-access-control/built-in-roles#owner).
-#### Project Initialization
+### Initializing and deploying the app
-1. Create a new folder and switch to it in the terminal
-1. Run `azd login`
-2. Run `az account set --subscription ""`
-3. Run `azd init`
- - For the target location, the regions that currently support the models used in this sample are **East US** or **South Central US**. For an up-to-date list of regions and models, check [here](https://learn.microsoft.com/en-us/azure/cognitive-services/openai/concepts/models). Make sure that all the intended services for this deployment have availability in your targeted regions.
+This application requires resources such as Azure OpenAI and Azure AI Search, which must be provisioned in Azure even if the app is run locally. The following steps make it easy to provision, deploy, and configure all resources.
#### Starting from scratch
-Execute the following command, if you don't have any pre-existing Azure services and want to start from a fresh deployment.
-
-1. Run `azd up` - This will provision Azure resources and deploy this sample to those resources
-2. After the application has been successfully deployed you will see a URL printed to the console. Click that URL to interact with the application in your browser.
-
-> NOTE: It may take a minute for the application to be fully deployed.
+Execute the following steps in a new terminal if you don't have any pre-existing Azure services and want to start from a fresh deployment.
+
+1. Ensure your deployment scripts are executable (scripts are currently needed to help AZD deploy your app)
+
+Mac/Linux:
+```bash
+chmod +x ./scripts/deploy.sh
+```
+Windows:
+```powershell
+Set-ExecutionPolicy RemoteSigned
+```
+2. Provision required Azure resources (e.g. Azure OpenAI and Azure Search) into a new environment
+```bash
+azd up
+```
+> NOTE: For the target location, the regions that currently support the models used in this sample are **East US** or **South Central US**. For an up-to-date list of regions and models, check [here](https://learn.microsoft.com/en-us/azure/cognitive-services/openai/concepts/models). Make sure that all the intended services for this deployment have availability in your targeted regions. Note also that it may take a minute for the application to be fully deployed.
+3. Navigate to the Azure Static Web App deployed in step 2. The URL is printed out when azd completes (as "Endpoint"), or you can find it in the Azure portal.
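+
+If you missed the URL in the console output, azd keeps the values it recorded for the environment; listing them is a quick way to find resource names and endpoints (whether the site URL appears among them depends on the Bicep outputs of this template):
+```bash
+# show everything azd stored for the current environment, including resource names and endpoints
+azd env get-values
+```
+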
#### Use existing resources
-1. Run `azd env set AZURE_OPENAI_SERVICE {Name of existing OpenAI service}`
-2. Run `azd env set AZURE_OPENAI_RESOURCE_GROUP {Name of existing resource group that OpenAI service is provisioned to}`
-3. Run `azd env set AZURE_OPENAI_CHATGPT_DEPLOYMENT {Name of existing ChatGPT deployment}`. Only needed if your ChatGPT deployment is not the default 'chat'.
-4. Run `azd env set AZURE_OPENAI_GPT_DEPLOYMENT {Name of existing GPT deployment}`. Only needed if your ChatGPT deployment is not the default 'davinci'.
-5. Run `azd up`
+The following steps let you override resource names and other values so you can leverage existing resources (e.g. provided by an admin or in a sandbox environment).
+
+1. Map configuration using the Azure resources provided to you (a syntax example with sample values follows these steps):
+```bash
+azd env set AZURE_OPENAI_SERVICE {existing OpenAI service name}
+azd env set AZURE_OPENAI_RESOURCE_GROUP {resource group of the OpenAI service}
+azd env set AZURE_OPENAI_CHATGPT_DEPLOYMENT {existing chat model deployment name}
+azd env set AZURE_OPENAI_EMB_DEPLOYMENT {existing embeddings model deployment name}
+azd env set AZURE_SEARCH_ENDPOINT {existing Azure AI Search endpoint URL}
+azd env set AZURE_SEARCH_INDEX {search index name}
+azd env set fileShare {file share mount path}
+azd env set ServiceBusConnection__fullyQualifiedNamespace {Service Bus namespace FQDN}
+azd env set ServiceBusQueueName {Service Bus queue name}
+azd env set OpenAiStorageConnection {storage connection string for the OpenAI extension}
+azd env set AzureWebJobsStorage__accountName {storage account name}
+azd env set DEPLOYMENT_STORAGE_CONNECTION_STRING {deployment storage connection string}
+azd env set APPLICATIONINSIGHTS_CONNECTION_STRING {Application Insights connection string}
+```
+2. Deploy all resources (provisioning any that were not specified above)
+```bash
+azd up
+```
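+
+As an example, the commands below show the `azd env set <key> <value>` syntax end to end; the resource names here are placeholders, not values from this repo:
+```bash
+# placeholder names: replace with the resources provided to you
+azd env set AZURE_OPENAI_SERVICE my-shared-openai
+azd env set AZURE_OPENAI_RESOURCE_GROUP rg-shared-ai
+azd env set AZURE_SEARCH_INDEX openai-index
+```
+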
> NOTE: You can also use existing Search and Storage Accounts. See `./infra/main.parameters.json` for list of environment variables to pass to `azd env set` to configure those existing resources.
#### Deploying or re-deploying a local clone of the repo
-- Simply run `azd up`
-
-### Quickstart
-
-- In Azure: navigate to the Azure WebApp deployed by azd. The URL is printed out when azd completes (as "Endpoint"), or you can find it in the Azure portal.
-- Running locally: navigate to 127.0.0.1:5000
-
-Once in the web app:
-
-- Try different topics in chat or Q&A context. For chat, try follow up questions, clarifications, ask to simplify or elaborate on answer, etc.
+- Simply run `azd up` again
+
+### Running locally (currently untested/unsupported)
+
+Your frontend and backend apps can run on the local machine using local storage emulators plus the remote AI resources provisioned above.
+
+1. Initialize the Azure resources using one of the approaches above.
+2. Create a new `app/backend/local.settings.json` file to store the Azure resource configuration, using the values from the `.azure/[environment name]` folder
+```json
+{
+ "IsEncrypted": false,
+ "Values": {
+ "AZURE_OPENAI_ENDPOINT": "",
+ "AZURE_OPENAI_CHATGPT_DEPLOYMENT": "chat",
+ "AZURE_OPENAI_EMB_DEPLOYMENT": "embedding",
+ "AZURE_SEARCH_ENDPOINT": "",
+ "AZURE_SEARCH_INDEX": "openai-index",
+ "fileShare": "/mounts/openaifiles",
+ "ServiceBusConnection__fullyQualifiedNamespace": "",
+ "ServiceBusQueueName": "",
+ "OpenAiStorageConnection": "",
+ "AzureWebJobsStorage__accountName": "",
+ "DEPLOYMENT_STORAGE_CONNECTION_STRING": "",
+ "APPLICATIONINSIGHTS_CONNECTION_STRING": "",
+ "SYSTEM_PROMPT": "You are a helpful assistant. You are responding to requests from a user about internal emails and documents. You can and should refer to the internal documents to help respond to requests. If a user makes a request thats not covered by the documents provided in the query, you must say that you do not have access to the information and not try and get information from other places besides the documents provided. The following is a list of documents that you can refer to when answering questions. The documents are in the format [filename]: [text] and are separated by newlines. If you answer a question by referencing any of the documents, please cite the document in your answer. For example, if you answer a question by referencing info.txt, you should add \"Reference: info.txt\" to the end of your answer on a separate line."
+ }
+}
+```
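+
+The values azd provisioned are stored per environment and can be copied from there into `local.settings.json`. A minimal sketch, assuming a bash shell and that `[environment name]` is the name you chose during `azd up`:
+```bash
+# print the values azd wrote for this environment (endpoints, resource names, connection strings)
+cat .azure/[environment name]/.env
+```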
+3. Disable VNet private endpoints in the resource group so your function can connect to the remote resources (or connect to the VNet over VPN)
+4. Start Azurite using the VS Code extension, or run this command in a new terminal window using [Docker](https://www.docker.com) (optional)
+```bash
+docker run -p 10000:10000 -p 10001:10001 -p 10002:10002 \
+ mcr.microsoft.com/azure-storage/azurite
+```
+5. Start the Functions app by pressing `F5` in Visual Studio or VS Code, or by running this command from the `app/backend` folder:
+```bash
+func start
+```
+6. Navigate to http://127.0.0.1:5000 (make sure the frontend is also running locally; see the note after this list)
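+
+> NOTE: The steps above start only the `backend` Functions app. To reach the app at the URL in step 6, the `frontend` must also be served locally. A minimal sketch, assuming the frontend lives under `app/frontend` and that its package.json defines a standard start script (both are assumptions; check the folder and scripts in this repo):
+```bash
+cd app/frontend
+npm install
+npm start   # assumed script name; see package.json (or use the Static Web Apps CLI)
+```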
+
+### Using the frontend web app
+
+- Upload .txt files on the Upload screen. Sample content is provided in the `./sample_content` folder.
+- On the Ask screen, ask questions about the uploaded data, e.g. `are eye exams covered?`
+- Explore the search index (e.g. `openai-index`) in the Azure AI Search resource to inspect the vector embeddings created by the Upload step
+- On the Chat screen, try follow-up questions, clarifications, asking to simplify or elaborate on an answer, etc.
+- On the Chat screen, try the assistant skills by saying `create a todo to get a haircut` and then `fetch me list of todos`.
- Explore citations and sources
- Click on "settings" to try different options, tweak prompts, etc.
@@ -108,11 +168,19 @@ Once in the web app:
- [Revolutionize your Enterprise Data with ChatGPT: Next-gen Apps w/ Azure OpenAI and AI Search](https://aka.ms/entgptsearchblog)
- [Azure AI Search](https://learn.microsoft.com/azure/search/search-what-is-azure-search)
- [Azure OpenAI Service](https://learn.microsoft.com/azure/cognitive-services/openai/overview)
+- [Azure Role-based-access-control](https://learn.microsoft.com/en-us/azure/role-based-access-control/built-in-roles)
-## How to purge aad auth
+## How to purge Entra ID auth
To remove your data from Azure Static Web Apps, go to
+## How to delete all Azure resources
+
+The following command deletes and purges all resources (this cannot be undone!):
+```bash
+azd down --purge
+```
+
## Upload files failures
Currently only text files are supported.
@@ -120,4 +188,6 @@ Currently only text files are supported.
## Azure Functions troubleshooting
Go to Application Insights and go to the Live metrics view to see real-time telemetry information.
-Optionally, go to Application Insights and select Logs and view the traces table
+Optionally, go to Application Insights, select Logs, and view the traces table, or use Transaction Search.
+
+If no functions load, double-check that `azd up` completed without errors (e.g. a script error due to missing execute permission). Also, if `azd package` or `azd up` fails with a `Can't determine Project to build. Expected 1 .csproj or .fsproj but found 2` error, delete the `/app/backend/bin` and `/app/backend/obj` folders and deploy again with `azd package` or `azd up`.
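+
+For example, from the repository root (bash):
+```bash
+# remove stale build output that confuses deployment packaging, then redeploy
+rm -rf app/backend/bin app/backend/obj
+azd up
+```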
diff --git a/app/backend/Upload.cs b/app/backend/Upload.cs
index eadd1cb..590e965 100644
--- a/app/backend/Upload.cs
+++ b/app/backend/Upload.cs
@@ -77,9 +77,9 @@ public class EmbeddingsStoreOutputResponse
[EmbeddingsStoreOutput(
"{FileName}",
InputType.FilePath,
- "AISearchEndpoint",
+ "AZURE_SEARCH_ENDPOINT",
"openai-index",
- Model = "%EMBEDDING_MODEL_DEPLOYMENT_NAME%"
+ Model = "%AZURE_OPENAI_EMB_DEPLOYMENT%"
)]
public required SearchableDocument SearchableDocument { get; init; }
}
diff --git a/app/backend/ask.cs b/app/backend/ask.cs
index b5ba5d7..571aa47 100644
--- a/app/backend/ask.cs
+++ b/app/backend/ask.cs
@@ -20,11 +20,11 @@ public Ask(ILogger logger)
public IActionResult AskData(
[HttpTrigger(AuthorizationLevel.Anonymous, Route = "ask")] HttpRequestData req,
[SemanticSearchInput(
- "AISearchEndpoint",
- "openai-index",
+ "AZURE_SEARCH_ENDPOINT",
+ "%AZURE_SEARCH_INDEX%",
Query = "{question}",
- ChatModel = "%CHAT_MODEL_DEPLOYMENT_NAME%",
- EmbeddingsModel = "%EMBEDDING_MODEL_DEPLOYMENT_NAME%",
+ ChatModel = "%AZURE_OPENAI_CHATGPT_DEPLOYMENT%",
+ EmbeddingsModel = "%AZURE_OPENAI_EMB_DEPLOYMENT%",
SystemPrompt = "%SYSTEM_PROMPT%"
)]
SemanticSearchContext result
diff --git a/app/backend/chat.cs b/app/backend/chat.cs
index 047f64b..5e0a5f5 100644
--- a/app/backend/chat.cs
+++ b/app/backend/chat.cs
@@ -52,7 +52,7 @@ public static PostResponseOutput ChatQuery(
[AssistantPostInput(
"{assistantId}",
"{prompt}",
- Model = "%CHAT_MODEL_DEPLOYMENT_NAME%"
+ Model = "%AZURE_OPENAI_CHATGPT_DEPLOYMENT%"
)]
AssistantState state
)
diff --git a/azure.yaml b/azure.yaml
index f160eba..f4042f4 100644
--- a/azure.yaml
+++ b/azure.yaml
@@ -2,7 +2,7 @@
name: azure-search-openai-demo
metadata:
- template: azure-search-openai-demo@0.0.2-beta
+ template: azure-search-openai-demo@0.0.3-beta
hooks:
postprovision:
windows:
diff --git a/infra/app/processor.bicep b/infra/app/processor.bicep
index 731e68d..acd6558 100644
--- a/infra/app/processor.bicep
+++ b/infra/app/processor.bicep
@@ -13,8 +13,9 @@ param maximumInstanceCount int = 100
param azureOpenaiService string
param appInsightsConnectionString string
param azureOpenaiChatgptDeployment string
-param azureOpenaigptDeployment string
+param azureOpenaiEmbeddingDeployment string
param azureSearchService string
+param azureSearchIndex string
param serviceBusQueueName string
param serviceBusNamespaceFQDN string
param shareName string
@@ -31,11 +32,14 @@ module processor '../core/host/functions-flexconsumption.bicep' = {
tags: union(tags, { 'azd-service-name': serviceName })
appSettings: union(appSettings,
{
+ AZURE_OPENAI_SERVICE: azureOpenaiService
AZURE_OPENAI_ENDPOINT: 'https://${azureOpenaiService}.openai.azure.com/'
- CHAT_MODEL_DEPLOYMENT_NAME: azureOpenaiChatgptDeployment
- EMBEDDING_MODEL_DEPLOYMENT_NAME: azureOpenaigptDeployment
+ AZURE_OPENAI_CHATGPT_DEPLOYMENT: azureOpenaiChatgptDeployment
+ AZURE_OPENAI_EMB_DEPLOYMENT: azureOpenaiEmbeddingDeployment
SYSTEM_PROMPT: 'You are a helpful assistant. You are responding to requests from a user about internal emails and documents. You can and should refer to the internal documents to help respond to requests. If a user makes a request thats not covered by the documents provided in the query, you must say that you do not have access to the information and not try and get information from other places besides the documents provided. The following is a list of documents that you can refer to when answering questions. The documents are in the format [filename]: [text] and are separated by newlines. If you answer a question by referencing any of the documents, please cite the document in your answer. For example, if you answer a question by referencing info.txt, you should add "Reference: info.txt" to the end of your answer on a separate line.'
- AISearchEndpoint: 'https://${azureSearchService}.search.windows.net'
+ AZURE_SEARCH_SERVICE: azureSearchService
+ AZURE_SEARCH_ENDPOINT: 'https://${azureSearchService}.search.windows.net'
+ AZURE_SEARCH_INDEX: azureSearchIndex
fileShare : '/mounts/${shareName}'
//OpenAI extension not yet supports MSI for the table storage connection
OpenAiStorageConnection: 'DefaultEndpointsProtocol=https;AccountName=${stg.name};EndpointSuffix=${environment().suffixes.storage};AccountKey=${stg.listKeys().keys[0].value}'
diff --git a/infra/main.bicep b/infra/main.bicep
index 86de423..fcd9be3 100644
--- a/infra/main.bicep
+++ b/infra/main.bicep
@@ -19,9 +19,9 @@ param azFunctionHostingPlanType string = 'flexconsumption'
param staticWebsiteName string = ''
param searchServiceName string = ''
-
-param searchServiceSkuName string = 'standard'
-param searchIndexName string = 'gptkbindex'
+@allowed([ 'free', 'basic', 'standard', 'standard2', 'standard3', 'storage_optimized_l1', 'storage_optimized_l2' ])
+param searchServiceSkuName string
+param searchServiceIndexName string
param storageAccountName string = ''
@@ -30,11 +30,33 @@ param serviceBusNamespaceName string = ''
param openAiServiceName string = ''
-param openAiSkuName string = 'S0'
-param gptDeploymentName string = 'text-embedding-3-small'
-param gptModelName string = 'text-embedding-3-small'
-param chatGptDeploymentName string = 'chat'
-param chatGptModelName string = 'gpt-35-turbo'
+param openAiSkuName string
+@allowed([ 'azure', 'openai', 'azure_custom' ])
+param openAiHost string // Set in main.parameters.json
+
+param chatGptModelName string = ''
+param chatGptDeploymentName string = ''
+param chatGptDeploymentVersion string = ''
+param chatGptDeploymentCapacity int = 0
+var chatGpt = {
+ modelName: !empty(chatGptModelName) ? chatGptModelName : startsWith(openAiHost, 'azure') ? 'gpt-35-turbo' : 'gpt-3.5-turbo'
+ deploymentName: !empty(chatGptDeploymentName) ? chatGptDeploymentName : 'chat'
+ deploymentVersion: !empty(chatGptDeploymentVersion) ? chatGptDeploymentVersion : '0613'
+ deploymentCapacity: chatGptDeploymentCapacity != 0 ? chatGptDeploymentCapacity : 40
+}
+
+param embeddingModelName string = ''
+param embeddingDeploymentName string = ''
+param embeddingDeploymentVersion string = ''
+param embeddingDeploymentCapacity int = 0
+param embeddingDimensions int = 0
+var embedding = {
+ modelName: !empty(embeddingModelName) ? embeddingModelName : 'text-embedding-3-small'
+ deploymentName: !empty(embeddingDeploymentName) ? embeddingDeploymentName : 'embedding'
+ deploymentVersion: !empty(embeddingDeploymentVersion) ? embeddingDeploymentVersion : '1'
+ deploymentCapacity: embeddingDeploymentCapacity != 0 ? embeddingDeploymentCapacity : 300
+ dimensions: embeddingDimensions != 0 ? embeddingDimensions : 1536
+}
// @description('Id of the user or app to assign application roles')
// param principalId string = ''
@@ -81,24 +103,24 @@ module openAi 'core/ai/cognitiveservices.bicep' = {
}
deployments: [
{
- name: gptDeploymentName
- capacity: 300
+ name: embedding.deploymentName
+ capacity: embedding.deploymentCapacity
model: {
format: 'OpenAI'
- name: gptModelName
- version: '1'
+ name: embedding.modelName
+ version: embedding.deploymentVersion
}
scaleSettings: {
scaleType: 'Standard'
}
}
{
- name: chatGptDeploymentName
- capacity: 40
+ name: chatGpt.deploymentName
+ capacity: chatGpt.deploymentCapacity
model: {
format: 'OpenAI'
- name: chatGptModelName
- version: '0613'
+ name: chatGpt.modelName
+ version: chatGpt.deploymentVersion
}
scaleSettings: {
scaleType: 'Standard'
@@ -172,7 +194,7 @@ module function 'core/host/azfunctions.bicep' = if (azFunctionHostingPlanType ==
location: location
appServicePlanId: appServicePlan.outputs.id
azureOpenaiChatgptDeployment: chatGptDeploymentName
- azureOpenaigptDeployment: gptDeploymentName
+ azureOpenaigptDeployment: embeddingDeploymentName
azureOpenaiService: openAi.outputs.name
azureSearchService: searchService.outputs.name
appInsightsConnectionString : appInsights.outputs.connectionString
@@ -195,9 +217,10 @@ module functionflexconsumption 'app/processor.bicep' = if (azFunctionHostingPlan
storageAccountName: storage.outputs.name
appInsightsConnectionString : appInsights.outputs.connectionString
azureOpenaiChatgptDeployment: chatGptDeploymentName
- azureOpenaigptDeployment: gptDeploymentName
+ azureOpenaiEmbeddingDeployment: embeddingDeploymentName
azureOpenaiService: openAi.outputs.name
azureSearchService: searchService.outputs.name
+ azureSearchIndex: searchServiceIndexName
serviceBusQueueName: serviceBus.outputs.serviceBusQueueName
serviceBusNamespaceFQDN: serviceBus.outputs.serviceBusNamespaceFQDN
appSettings: {
@@ -230,39 +253,34 @@ module staticwebsite 'core/host/staticwebsite.bicep' = {
}
}
+// Learn more about Azure role-based access control (RBAC) and built-in-roles at https://docs.microsoft.com/en-us/azure/role-based-access-control/overview
+var CognitiveServicesRoleDefinitionIds = ['5e0bd9bd-7b93-4f28-af87-19fc36ad61bd'] // Cognitive Services OpenAI User
module openAiRoleUser 'app/openai-access.bicep' = {
scope: resourceGroup
name: 'openai-roles'
params: {
principalId: processorAppPrincipalId
openAiAccountResourceName: openAi.outputs.name
- roleDefinitionIds: ['5e0bd9bd-7b93-4f28-af87-19fc36ad61bd']
+ roleDefinitionIds: CognitiveServicesRoleDefinitionIds
}
}
+var StorageRoleDefinitionIds = ['b7e6dc6d-f1e8-4753-8033-0f276bb0955b' // Storage Blob Data Owner
+ '974c5e8b-45b9-4653-ba55-5f855dd0fb88' // Storage Queue Data Contributor
+ '0a9a7e1f-b9d0-4cc4-a60d-0319b160aaa3' // Storage Table Data Contributor
+ '0c867c2a-1d8c-454a-a3db-ab2ea1bdc8bb'] // Storage File Data SMB Share Contributor
module storageRoleUser 'app/storage-access.bicep' = {
scope: resourceGroup
name: 'storage-roles'
params: {
principalId: processorAppPrincipalId
- //This list can likely be reduced to just the roles needed
- roleDefinitionIds: ['b7e6dc6d-f1e8-4753-8033-0f276bb0955b'
- '2a2b9908-6ea1-4ae2-8e65-a410df84e7d1'
- 'ba92f5b4-2d11-453d-a403-e96b0029c9fe'
- '974c5e8b-45b9-4653-ba55-5f855dd0fb88'
- '8a0f0c08-91a1-4084-bc3d-661d67233fed'
- 'c6a89b2d-59bc-44d0-9896-0f6e12d7b80a'
- '19e7f393-937e-4f77-808e-94535e297925'
- '0a9a7e1f-b9d0-4cc4-a60d-0319b160aaa3'
- '76199698-9eea-4c19-bc75-cec21354c6b6'
- '0c867c2a-1d8c-454a-a3db-ab2ea1bdc8bb'
- 'aba4ae5f-2193-4029-9191-0cb91df5e314']
+ roleDefinitionIds: StorageRoleDefinitionIds
storageAccountName: storage.outputs.name
}
}
-var ServiceBusRoleDefinitionIds = ['090c5cfd-751d-490a-894a-3ce6f1109419', '4f6d3b9b-027b-4f4c-9142-0e5a2a2247e0'] //Azure Service Bus Data Owner and Data Receiver roles
-// Allow access from processor to Service Bus using a managed identity and Azure Service Bus Data Owner and Data Receiver roles
+var ServiceBusRoleDefinitionIds = ['090c5cfd-751d-490a-894a-3ce6f1109419'] // Azure Service Bus Data Owner
+// Allow access from processor to Service Bus using a managed identity and Azure Service Bus Data Owner
module ServiceBusDataOwnerRoleAssignment 'app/servicebus-Access.bicep' = {
name: 'ServiceBusDataOwnerRoleAssignment'
scope: resourceGroup
@@ -273,12 +291,15 @@ module ServiceBusDataOwnerRoleAssignment 'app/servicebus-Access.bicep' = {
}
}
+var SearchRoleDefinitionIds = ['8ebe5a00-799e-43f5-93ac-243d3dce84a7' // Azure Search Index Data Contributor
+ '7ca78c08-252a-4471-8644-bb5ff32d4ba0' // Azure Search Service Contributor (required if index is not yet created)
+ ]
module searchRoleUser 'app/search-access.bicep' = {
scope: resourceGroup
name: 'search-roles'
params: {
principalId: processorAppPrincipalId
- roleDefinitionIds: ['7ca78c08-252a-4471-8644-bb5ff32d4ba0', '8ebe5a00-799e-43f5-93ac-243d3dce84a7', '1407120a-92aa-4202-b7e9-c0e197c71c8f']
+ roleDefinitionIds: SearchRoleDefinitionIds
searchAccountName: searchService.outputs.name
}
}
@@ -310,11 +331,11 @@ output AZURE_TENANT_ID string = tenant().tenantId
output AZURE_RESOURCE_GROUP string = resourceGroup.name
output AZURE_OPENAI_SERVICE string = openAi.outputs.name
-output AZURE_OPENAI_GPT_DEPLOYMENT string = gptDeploymentName
+output AZURE_OPENAI_EMB_DEPLOYMENT string = embeddingDeploymentName
output AZURE_OPENAI_CHATGPT_DEPLOYMENT string = chatGptDeploymentName
output AZURE_OPENAI_LOCATION string = openAi.outputs.location
-output AZURE_SEARCH_INDEX string = searchIndexName
+output AZURE_SEARCH_INDEX string = searchServiceIndexName
output AZURE_SEARCH_SERVICE string = searchService.outputs.name
output AZURE_STORAGE_ACCOUNT string = storage.outputs.name
diff --git a/infra/main.parameters.json b/infra/main.parameters.json
index 2b8beec..3762eb9 100644
--- a/infra/main.parameters.json
+++ b/infra/main.parameters.json
@@ -11,14 +11,41 @@
"principalId": {
"value": "${AZURE_PRINCIPAL_ID}"
},
+ "openAiSkuName": {
+ "value": "S0"
+ },
"openAiServiceName": {
"value": "${AZURE_OPENAI_SERVICE}"
},
+ "openAiHost": {
+ "value": "${OPENAI_HOST=azure}"
+ },
"openAiResourceGroupName": {
"value": "${AZURE_OPENAI_RESOURCE_GROUP}"
},
- "openAiSkuName": {
- "value": "S0"
+ "chatGptDeploymentName": {
+ "value": "${AZURE_OPENAI_CHATGPT_DEPLOYMENT=chat}"
+ },
+ "chatGptDeploymentCapacity":{
+ "value": "${AZURE_OPENAI_CHATGPT_DEPLOYMENT_CAPACITY}"
+ },
+ "chatGptDeploymentVersion":{
+ "value": "${AZURE_OPENAI_CHATGPT_DEPLOYMENT_VERSION}"
+ },
+ "chatGptModelName":{
+ "value": "${AZURE_OPENAI_CHATGPT_MODEL=gpt-35-turbo}"
+ },
+ "embeddingDeploymentName": {
+ "value": "${AZURE_OPENAI_EMB_DEPLOYMENT=embedding}"
+ },
+ "embeddingModelName":{
+ "value": "${AZURE_OPENAI_EMB_MODEL_NAME=text-embedding-3-small}"
+ },
+ "embeddingDeploymentVersion":{
+ "value": "${AZURE_OPENAI_EMB_DEPLOYMENT_VERSION}"
+ },
+ "embeddingDeploymentCapacity":{
+ "value": "${AZURE_OPENAI_EMB_DEPLOYMENT_CAPACITY}"
},
"searchServiceName": {
"value": "${AZURE_SEARCH_SERVICE}"
@@ -26,8 +53,11 @@
"searchServiceResourceGroupName": {
"value": "${AZURE_SEARCH_SERVICE_RESOURCE_GROUP}"
},
+ "searchServiceIndexName": {
+ "value": "${AZURE_SEARCH_INDEX=openai-index}"
+ },
"searchServiceSkuName": {
- "value": "standard"
+ "value": "${AZURE_SEARCH_SERVICE_SKU=standard}"
},
"storageAccountName": {
"value": "${AZURE_STORAGE_ACCOUNT}"
@@ -37,6 +67,9 @@
},
"azFunctionHostingPlanType": {
"value": "flexconsumption"
+ },
+ "systemPrompt": {
+      "value": "${SYSTEM_PROMPT=You are a helpful assistant. You are responding to requests from a user about internal emails and documents. You can and should refer to the internal documents to help respond to requests. If a user makes a request thats not covered by the documents provided in the query, you must say that you do not have access to the information and not try and get information from other places besides the documents provided. The following is a list of documents that you can refer to when answering questions. The documents are in the format [filename]: [text] and are separated by newlines. If you answer a question by referencing any of the documents, please cite the document in your answer. For example, if you answer a question by referencing info.txt, you should add \"Reference: info.txt\" to the end of your answer on a separate line.}"
}
}
}
diff --git a/scripts/deploy.sh b/scripts/deploy.sh
old mode 100644
new mode 100755