* Update install instructions
Signed-off-by: Michael Yuan <michael@secondstate.io>
* Clarify the TF Lite plugin status
Signed-off-by: Michael Yuan <michael@secondstate.io>
---------
Signed-off-by: Michael Yuan <michael@secondstate.io>
docs/start/install.md: 90 additions & 70 deletions
@@ -50,16 +50,16 @@ Suppose you are interested in the latest builds from the `HEAD` of the `master`

#### Install WasmEdge with plug-ins

WasmEdge plug-ins are pre-built native modules that provide additional functionality to the WasmEdge Runtime. To install plug-ins with the runtime, you can pass the `--plugins` parameter to the installer. For example, the command below installs the `wasmedge_rustls` plug-in to enable TLS and HTTPS networking.
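A sketch of that command, assuming the standard `install.sh` script from the WasmEdge GitHub repository (the exact invocation is documented in the [installer command](#generic-linux-and-macos) section):

```bash
# Install WasmEdge plus the wasmedge_rustls plug-in for the current user
curl -sSf https://raw.githubusercontent.com/WasmEdge/WasmEdge/master/utils/install.sh | \
  bash -s -- --plugins wasmedge_rustls

# Make the wasmedge binary and libraries available in the current shell
source $HOME/.wasmedge/env
```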
To install multiple plug-ins, you can pass a list of plug-ins with the `--plugins` option. For example, the following command installs the `wasmedge_rustls` and `wasi_nn-ggml` plug-ins. The latter enables WasmEdge to run AI inference on large language models such as the llama2 family of LLMs.
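Assuming the same installer script as above, the plug-in names are simply listed after `--plugins`:

```bash
# Install WasmEdge with the TLS plug-in and the WASI-NN ggml backend
curl -sSf https://raw.githubusercontent.com/WasmEdge/WasmEdge/master/utils/install.sh | \
  bash -s -- --plugins wasmedge_rustls wasi_nn-ggml
```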
The installer downloads the plug-in files from the WasmEdge release on GitHub, unzips them, and then copies them to the `~/.wasmedge/plugin/` folder (for a user install) or the `/usr/local/lib/wasmedge/` folder (for a system install).
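To confirm which plug-ins ended up on disk, you can list those folders; the shared-library file names vary by plug-in and platform:

```bash
ls ~/.wasmedge/plugin/          # user install
ls /usr/local/lib/wasmedge/     # system install
```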
@@ -125,36 +125,32 @@ WasmEdge uses plug-ins to extend its functionality. If you want to use more of W

### TLS plug-in

The WasmEdge TLS plug-in utilizes the native OpenSSL library to support HTTPS and TLS requests from WasmEdge sockets. To install WasmEdge with the TLS plug-in, run the following command.
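This is likely the same installer invocation as in the earlier example, with the TLS plug-in (`wasmedge_rustls`) passed to `--plugins`:

```bash
curl -sSf https://raw.githubusercontent.com/WasmEdge/WasmEdge/master/utils/install.sh | \
  bash -s -- --plugins wasmedge_rustls
```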
Then, go to the [HTTPS request in Rust chapter](../develop/rust/http_service/client.md) to see how to run HTTPS services with Rust.
### WASI-NN plug-ins

WasmEdge supports various backends for `WASI-NN`, which provides a standardized API for WasmEdge applications to access AI models for inference. Each backend supports a specific type of AI model.
- [ggml backend](#wasi-nn-plug-in-with-ggml-backend): supported on `Ubuntu 20.04+` and macOS.
- [PyTorch backend](#wasi-nn-plug-in-with-pytorch-backend): supported on `Ubuntu 20.04+` and `manylinux2014_x86_64`.
- [OpenVINO™ backend](#wasi-nn-plug-in-with-openvino-backend): supported on `Ubuntu 20.04+`.
- [TensorFlow-Lite backend](#wasi-nn-plug-in-with-tensorflow-lite-backend): supported on `Ubuntu 20.04+`, `manylinux2014_x86_64`, and `manylinux2014_aarch64`.
Note that the backends are exclusive: developers can only choose and install one backend for the `WASI-NN` plug-in.
#### WASI-NN plug-in with ggml backend

The WASI-NN plug-in with the ggml backend allows WasmEdge to run llama2 inference. To install WasmEdge with the WASI-NN ggml backend, pass the `wasi_nn-ggml` option to the `--plugins` flag when running the installer command.

Please note that the installer from WasmEdge 0.13.5 detects CUDA automatically. If CUDA is detected, the installer will always attempt to install a CUDA-enabled version of the plug-in.
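A sketch of that installer invocation, under the same assumption about the `install.sh` location:

```bash
# On a machine with CUDA, the installer will prefer a CUDA-enabled build of the plug-in
curl -sSf https://raw.githubusercontent.com/WasmEdge/WasmEdge/master/utils/install.sh | \
  bash -s -- --plugins wasi_nn-ggml
```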
@@ -168,9 +164,13 @@ Then, go to the [Llama2 inference in Rust chapter](../develop/rust/wasinn/llm_in

#### WASI-NN plug-in with PyTorch backend

The WASI-NN plug-in with the PyTorch backend allows WasmEdge applications to perform PyTorch model inference. To install WasmEdge with the WASI-NN PyTorch backend, pass the `wasi_nn-pytorch` option to the `--plugins` flag when running the installer command.
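Again as a sketch with the same installer script:

```bash
curl -sSf https://raw.githubusercontent.com/WasmEdge/WasmEdge/master/utils/install.sh | \
  bash -s -- --plugins wasi_nn-pytorch
```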
The WASI-NN plug-in with the PyTorch backend depends on the `libtorch` C++ library to perform AI/ML computations. You need to install the [PyTorch 1.8.2 LTS](https://pytorch.org/get-started/locally/) dependencies for it to work properly.
```bash
export PYTORCH_VERSION="1.8.2"
```
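Once the libtorch dependencies are in place, the runtime must be able to find the `libtorch` shared libraries at load time. A minimal sketch, assuming libtorch 1.8.2 has been unpacked into a hypothetical `./libtorch` directory:

```bash
# Hypothetical layout: libtorch 1.8.2 unpacked into ./libtorch
export LD_LIBRARY_PATH=$(pwd)/libtorch/lib:${LD_LIBRARY_PATH}
```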
@@ -194,9 +194,13 @@ Then, go to the [WASI-NN PyTorch backend in Rust chapter](../develop/rust/wasinn

#### WASI-NN plug-in with OpenVINO backend

The WASI-NN plug-in with the OpenVINO backend allows WasmEdge applications to perform OpenVINO model inference. To install WasmEdge with the WASI-NN OpenVINO backend, pass the `wasi_nn-openvino` option to the `--plugins` flag when running the installer command.
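A sketch of that invocation, with the same assumptions as above:

```bash
curl -sSf https://raw.githubusercontent.com/WasmEdge/WasmEdge/master/utils/install.sh | \
  bash -s -- --plugins wasi_nn-openvino
```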
The WASI-NN plug-in with the OpenVINO backend depends on the OpenVINO C library to perform AI/ML computations, and it requires the [OpenVINO](https://docs.openvino.ai/2023.0/openvino_docs_install_guides_installing_openvino_apt.html) (2023) dependencies. The following instructions are for Ubuntu 20.04 and above.
Then, go to the [WASI-NN OpenVINO backend in Rust](../develop/rust/wasinn/openvino) chapter to see how to run AI inference with OpenVINO.
#### WASI-NN plug-in with TensorFlow-Lite backend

The WASI-NN plug-in with the TensorFlow-Lite backend allows WasmEdge applications to perform TensorFlow-Lite model inference. To install WasmEdge with the WASI-NN TensorFlow-Lite backend, pass the `wasi_nn-tensorflowlite` option to the `--plugins` flag when running the installer command.
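As before, a sketch assuming the standard installer script:

```bash
curl -sSf https://raw.githubusercontent.com/WasmEdge/WasmEdge/master/utils/install.sh | \
  bash -s -- --plugins wasi_nn-tensorflowlite
```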
The WASI-NN plug-in with the TensorFlow-Lite backend depends on the `libtensorflowlite_c` shared library to perform AI/ML computations, and it will be installed by the installer automatically.
<!-- prettier-ignore -->
:::note
If you install this plug-in WITHOUT the installer, you can [refer to here to install the dependency](#tensorflow-lite-dependencies).
:::
Then, go to the [WASI-NN TensorFlow-Lite backend in Rust chapter](../develop/rust/wasinn/tensorflow_lite) to see how to run AI inference with TensorFlow-Lite.
### WASI-Crypto Plug-in

[WASI-crypto](https://github.com/WebAssembly/wasi-crypto) is a set of cryptography API proposals for WASI. To use the WASI-Crypto proposal, please use the `--plugins wasi_crypto` parameter when [running the installer command](#generic-linux-and-macos).
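A sketch with the same installer assumptions:

```bash
curl -sSf https://raw.githubusercontent.com/WasmEdge/WasmEdge/master/utils/install.sh | \
  bash -s -- --plugins wasi_crypto
```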
Then, go to the [WASI-Crypto in Rust chapter](../develop/rust/wasicrypto.md) to see how to run WASI-crypto functions.
### WasmEdge Image Plug-in

The WasmEdge-Image plug-in can help developers load and decode JPEG and PNG images and convert them into tensors. To install this plug-in, please use the `--plugins wasmedge_image` parameter when [running the installer command](#generic-linux-and-macos).
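Likewise, a sketch of the install command:

```bash
curl -sSf https://raw.githubusercontent.com/WasmEdge/WasmEdge/master/utils/install.sh | \
  bash -s -- --plugins wasmedge_image
```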
Then, go to the [TensorFlow interface (image part) in Rust chapter](../develop/rust/wasinn/tf_plugin.md#image-loading-and-conversion) to see how to run WasmEdge-Image functions.
### WasmEdge TensorFlow Plug-in

The WasmEdge-TensorFlow plug-in can help developers perform TensorFlow model inference with an API similar to the Python TensorFlow API. To install this plug-in, please use the `--plugins wasmedge_tensorflow` parameter when [running the installer command](#generic-linux-and-macos).
The WasmEdge-Tensorflow plug-in depends on the `libtensorflow_cc` shared library.
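A sketch of the install, under the same installer-script assumption; the `wasmedge_image` plug-in from the previous section can be installed in the same command:

```bash
curl -sSf https://raw.githubusercontent.com/WasmEdge/WasmEdge/master/utils/install.sh | \
  bash -s -- --plugins wasmedge_image wasmedge_tensorflow
```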
@@ -249,16 +269,16 @@ Then, go to [TensorFlow interface in Rust chapter](../develop/rust/wasinn/tf_plu

### WasmEdge TensorFlow-Lite Plug-in
<!-- prettier-ignore -->
:::note
The TensorFlow Lite plug-in is being deprecated. Please use the [WASI-NN TensorFlow-Lite plug-in](#wasi-nn-plug-in-with-tensorflow-lite-backend) instead.
:::

The WasmEdge-TensorFlowLite plug-in can help developers perform TensorFlow-Lite model inference. To install this plug-in, please use the `--plugins wasmedge_tensorflowlite` parameter when [running the installer command](#generic-linux-and-macos).

The WasmEdge-TensorFlowLite plug-in depends on the `libtensorflowlite_c` shared library to perform AI/ML computations, and it will be installed by the installer automatically.
The shared library will be extracted in the current directory as `./libtensorflowlite_c.so` (or `.dylib` for MacOS) and `./libtensorflowlite_flex.so` (after the `WasmEdge 0.13.0` version). You can move the libraries to the installation path:

```bash
# If you installed wasmedge locally as above
mv libtensorflowlite_c.so ~/.wasmedge/lib
mv libtensorflowlite_flex.so ~/.wasmedge/lib

# Or, if you installed wasmedge for all users in /usr/local/
mv libtensorflowlite_c.so /usr/local/lib
mv libtensorflowlite_flex.so /usr/local/lib

# Or on MacOS platforms
mv libtensorflowlite_c.dylib ~/.wasmedge/lib
mv libtensorflowlite_flex.dylib ~/.wasmedge/lib
```
### TensorFlow Dependencies

If you install the `WasmEdge-Tensorflow` plug-in WITHOUT the installer, you can download the shared libraries with the following commands:
## Troubleshooting

Some users, especially in China, have reported encountering a "Connection refused" error when trying to download `install.sh` from `githubusercontent.com`.