README.md (9 additions, 4 deletions)
@@ -32,11 +32,12 @@ The model is based on the [Keras built-in model for Inception-ResNet-v2](https:/
 * `docker`: The [Docker](https://www.docker.com/) command-line interface. Follow the [installation instructions](https://docs.docker.com/install/) for your system.
 * The minimum recommended resources for this model are 2GB Memory and 2 CPUs.

-# Steps
+# Deployment options

-1. [Deploy from Docker Hub](#deploy-from-docker-hub)
-2. [Deploy on Kubernetes](#deploy-on-kubernetes)
-3. [Run Locally](#run-locally)
+* [Deploy from Docker Hub](#deploy-from-docker-hub)
+* [Deploy on Red Hat OpenShift](#deploy-on-red-hat-openshift)
@@ -50,5 +51,9 @@ This will pull a pre-built image from Docker Hub (or use an existing image if already cached locally) and run it.
 If you'd rather check out and build the model locally you can follow the [run locally](#run-locally) steps below.

+## Deploy on Red Hat OpenShift
+
+You can deploy the model-serving microservice on Red Hat OpenShift by following the instructions for the OpenShift web console or the OpenShift Container Platform CLI [in this tutorial](https://developer.ibm.com/tutorials/deploy-a-model-asset-exchange-microservice-on-red-hat-openshift/), specifying `codait/max-inception-resnet-v2` as the image name.
+
 ## Deploy on Kubernetes

 You can also deploy the model on Kubernetes using the latest docker image on Docker Hub.
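
The context line in the second hunk describes the existing Docker Hub path: the image is pulled (or reused from the local cache) and started. A minimal sketch of that step, assuming the service listens on port 5000 as MAX model-serving containers conventionally do (the exact command lives in the unchanged part of the README):

```bash
# Sketch only: pull the pre-built image from Docker Hub (or reuse a locally
# cached copy) and start the model-serving microservice.
# Port 5000 is an assumption about the container's service port.
docker run -it -p 5000:5000 codait/max-inception-resnet-v2
```

Once the container is running, the REST API (and its Swagger documentation) would typically be reachable at `http://localhost:5000`.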
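
The new Deploy on Red Hat OpenShift section defers the details to the linked tutorial, which covers both the web console and the CLI. As a rough sketch of the CLI route only, assuming you are already logged in to an OpenShift project with `oc` and that the generated resource names follow the image name (both assumptions, not steps taken from the tutorial):

```bash
# Sketch only: create an application from the public Docker Hub image.
oc new-app codait/max-inception-resnet-v2
# Expose the generated service through a route; the service name
# "max-inception-resnet-v2" is assumed to be derived from the image name.
oc expose service/max-inception-resnet-v2
```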
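
For the Kubernetes option, this excerpt only points at the latest Docker Hub image rather than a specific manifest. A minimal imperative sketch that runs that image, with the resource names and the service port (5000) as assumptions:

```bash
# Sketch only: run the public image as a Deployment and expose it inside
# the cluster; the container port of 5000 is an assumption.
kubectl create deployment max-inception-resnet-v2 --image=codait/max-inception-resnet-v2
kubectl expose deployment max-inception-resnet-v2 --port=5000 --target-port=5000
# Forward the service locally for a quick smoke test.
kubectl port-forward service/max-inception-resnet-v2 5000:5000
```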