workloads/opea/chatqna/README.md
The workload is based on the OPEA ChatQnA Application.
**Note**: Refer to the [documentation](https://docs.openshift.com/container-platform/4.16/storage/index.html) for setting up other types of persistent storage.
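As an illustration, a persistent volume claim for the model store might look like the sketch below; the claim name, requested size, and storage class are assumptions you should adapt to your cluster:

```yaml
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: chatqna-model-store      # hypothetical name
spec:
  accessModes:
    - ReadWriteOnce
  resources:
    requests:
      storage: 50Gi              # size is an assumption; depends on the model
  storageClassName: gp3-csi      # replace with a storage class available on your cluster
```

Apply it with `oc apply -f pvc.yaml` in the target namespace.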
* RHOAI is installed. Follow the steps [here](/e2e/inference/README.md#install-rhoai).
* The Intel Gaudi AI accelerator is enabled with RHOAI. Follow the steps [here](/e2e/inference/README.md#enable-intel-gaudi-ai-accelerator-with-rhoai).
* A MinIO-based S3 service is ready for RHOAI. Follow the steps [here](https://ai-on-openshift.io/tools-and-applications/minio/minio/#create-a-matching-data-connection-for-minio).
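In RHOAI, an S3 data connection is backed by a Kubernetes Secret. The fragment below is a minimal sketch of what the MinIO data connection might look like; the secret name, credentials, endpoint, and bucket are placeholders, not values from this repository:

```yaml
apiVersion: v1
kind: Secret
metadata:
  name: aws-connection-minio                 # hypothetical name
  labels:
    opendatahub.io/dashboard: "true"
  annotations:
    opendatahub.io/connection-type: s3
type: Opaque
stringData:
  AWS_ACCESS_KEY_ID: minio                   # replace with your MinIO credentials
  AWS_SECRET_ACCESS_KEY: minio123
  AWS_S3_ENDPOINT: http://minio-service.minio.svc:9000   # assumed in-cluster endpoint
  AWS_S3_BUCKET: models                      # bucket holding the model artifacts
  AWS_DEFAULT_REGION: us-east-1
```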
## Deploy Model Serving for OPEA ChatQnA Microservices with RHOAI
### Launch the Model Serving with Intel Gaudi AI Accelerator
* Click on the Settings and choose ```ServingRuntime```. Copy or import the [tgi_gaudi_servingruntime.yaml](tgi_gaudi_servingruntime.yaml). The [tgi-gaudi](https://github.com/huggingface/tgi-gaudi) serving runtime is used. Follow the image below.
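The linked YAML file is the source of truth. As a rough sketch only, a tgi-gaudi `ServingRuntime` resembles the following; the runtime name, image tag, port, and resource request here are assumptions, not copied from the repository file:

```yaml
apiVersion: serving.kserve.io/v1alpha1
kind: ServingRuntime
metadata:
  name: tgi-gaudi-runtime                    # hypothetical; use the name from the linked YAML
spec:
  supportedModelFormats:
    - name: llm
      autoSelect: true
  containers:
    - name: kserve-container
      image: ghcr.io/huggingface/tgi-gaudi:2.0.0   # image tag is an assumption
      ports:
        - containerPort: 8080
          protocol: TCP
      resources:
        limits:
          habana.ai/gaudi: 1                 # request one Gaudi accelerator
```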