@@ -44,12 +44,64 @@ pip install git+https://github.com/cortexlabs/nucleus.git@master

``` bash
$ nucleus generate --help
- Usage: nucleus generate [OPTIONS] CONFIG
+ Usage: nucleus [OPTIONS] COMMAND [ARGS]...

-   A Cortex utility to generate Dockerfile Nucleus model servers
+   Use the Nucleus CLI to generate model servers for Python-generic and
+   TensorFlow models. Compatible with Cortex clusters.

Options:
  --help  Show this message and exit.
+
+ Commands:
+   generate  A utility to generate Dockerfile(s) for Nucleus model servers.
+   version   Get Nucleus CLI version.
+ ```
+
+ ## Example
+
+ Generate a model server Dockerfile:
+
+ ``` bash
+ $ nucleus generate examples/rest-python-iris-classifier/model-server-config.yaml
+ -------------- nucleus model server config --------------
+ type: python
+ py_version: 3.6.9
+ path: handler.py
+ multi_model_reloading:
+   path: s3://cortex-examples/sklearn/iris-classifier/
+ use_local_cortex_libs: false
+ serve_port: 8080
+ processes: 1
+ threads_per_process: 1
+ max_concurrency: 0
+ dependencies:
+   pip: requirements.txt
+   conda: conda-packages.txt
+   shell: dependencies.sh
+ gpu_version: null
+ gpu: false
+
+ ---------------------------------------------------------
+ generating nucleus.Dockerfile dockerfile ...
+ ```
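+
+ The config's `path: handler.py` points at the request-handling Python module for this example (the real file is `examples/rest-python-iris-classifier/handler.py`). Purely as an illustrative sketch, where the `Handler` class, `handle_post` method, and `model_client` argument are assumptions rather than API confirmed by this README, a Python-type handler could be shaped like this:
+
+ ``` python
+ # Hypothetical handler sketch for the iris example; `Handler`,
+ # `handle_post`, and `model_client` are assumed names, not the
+ # documented Nucleus interface (see the example's real handler.py).
+ class Handler:
+     def __init__(self, config, model_client):
+         # Assumed: model_client exposes the scikit-learn model pulled via
+         # multi_model_reloading from s3://cortex-examples/sklearn/iris-classifier/
+         self.client = model_client
+         self.labels = ["setosa", "versicolor", "virginica"]
+
+     def handle_post(self, payload):
+         model = self.client.get_model()  # assumed accessor
+         measurements = [
+             payload["sepal_length"],
+             payload["sepal_width"],
+             payload["petal_length"],
+             payload["petal_width"],
+         ]
+         prediction = int(model.predict([measurements])[0])
+         return {"prediction": self.labels[prediction]}
+ ```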
+
+ Build the Docker image from the generated Dockerfile:
+
+ ``` bash
+ $ docker build -f nucleus.Dockerfile -t nucleus .
+ ```
+
+ Run the Nucleus model server:
+
+ ``` bash
+ $ docker run -it --rm -p 8080:8080 nucleus
+ ```
+
+ Finally, make a request to it:
+
+ ``` bash
+ $ curl localhost:8080/ -X POST -H "Content-Type: application/json" -d @examples/rest-python-iris-classifier/sample.json
+ {"prediction": "setosa", "model": {"version": "latest"}}
```
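+
+ The same request can also be sent from Python. Below is a minimal client sketch using the `requests` library; it reads the payload from the example's `sample.json` rather than hard-coding it, since that file's contents are not shown above.
+
+ ``` python
+ # Minimal Python client for the running Nucleus server (sketch).
+ import json
+
+ import requests
+
+ # Reuse the sample payload that ships with the example.
+ with open("examples/rest-python-iris-classifier/sample.json") as f:
+     payload = json.load(f)
+
+ # POST to the port published by `docker run -p 8080:8080`.
+ response = requests.post("http://localhost:8080/", json=payload)
+ print(response.json())  # e.g. {"prediction": "setosa", "model": {"version": "latest"}}
+ ```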
# Configuration