Releases: withcatai/node-llama-cpp
v3.0.0-beta.9 (2024-02-05)
Bug Fixes
- don't block a node process from exiting (#157) (74fb35c)
- respect `logLevel` and `logger` params when using `"lastBuild"` (#157) (74fb35c)
- print logs on the same event loop cycle (#157) (74fb35c)
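The "don't block a node process from exiting" fix describes a standard Node.js pattern: background handles (timers, sockets) keep the event loop alive unless they are unref'ed. This is a minimal self-contained sketch of that general pattern, not the library's actual code:

```javascript
// A live interval would normally keep the Node process running forever.
// Calling unref() tells Node not to count this handle toward the event
// loop's active handles, so the process can exit naturally when the
// rest of the program is done.
const timer = setInterval(() => {}, 1000);
timer.unref();

console.log("process can exit despite the live interval");
```

Without the `unref()` call, this script would hang indefinitely; with it, the process exits as soon as the script finishes.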
Shipped with `llama.cpp` release `b2074`.
To use the latest `llama.cpp` release available, run `npx --no node-llama-cpp download --release latest`. (learn more)
v3.0.0-beta.8 (2024-02-05)
Bug Fixes
Shipped with `llama.cpp` release `b2060`.
v3.0.0-beta.7 (2024-02-05)
Bug Fixes
Shipped with `llama.cpp` release `b2060`.
v2.8.6
v3.0.0-beta.6 (2024-02-04)
Bug Fixes
Features
- manual binding loading (#153) (0e4b8d2)
- log settings (#153) (0e4b8d2)
- ship CUDA prebuilt binaries (#153) (0e4b8d2)
Shipped with `llama.cpp` release `b2060`.
v3.0.0-beta.5 (2024-01-24)
Bug Fixes
Features
Shipped with `llama.cpp` release `b1961`.
v3.0.0-beta.4 (2024-01-21)
Features
Shipped with `llama.cpp` release `b1892`.
v3.0.0-beta.3 (2024-01-21)
Features
Shipped with `llama.cpp` release `b1892`.
v2.8.5
v3.0.0-beta.2 (2024-01-20)
Bug Fixes
- adapt to breaking changes of `llama.cpp` (#117) (595a6bc)
- threads parameter (#139) (5fcdf9b)
- disable Metal for `x64` arch by default (#139) (5fcdf9b)
Features
- function calling (#139) (5fcdf9b)
- chat syntax aware context shifting (#139) (5fcdf9b)
- stateless `LlamaChat` (#139) (5fcdf9b)
- improve chat wrapper (#139) (5fcdf9b)
- `LlamaText` util (#139) (5fcdf9b)
- show `llama.cpp` release in GitHub releases (#142) (36c779d)
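"Chat syntax aware context shifting" refers to trimming an overlong chat transcript to fit the model's context window while respecting the chat structure (for example, never dropping the system prompt). This is a minimal self-contained sketch of that general idea, with token counts faked by string length; it is not the library's actual algorithm:

```javascript
// Sketch of context shifting: drop the oldest non-system messages
// until the transcript fits the context budget. Real implementations
// count tokens; string length stands in for that here.
function shiftContext(messages, budget) {
    const size = (msgs) => msgs.reduce((n, m) => n + m.text.length, 0);
    const result = messages.slice();
    while (size(result) > budget) {
        const i = result.findIndex((m) => m.role !== "system");
        if (i === -1)
            break; // only system messages left; nothing safe to drop
        result.splice(i, 1);
    }
    return result;
}

const trimmed = shiftContext([
    {role: "system", text: "sys"},
    {role: "user", text: "aaaa"},
    {role: "user", text: "bb"}
], 6);
console.log(trimmed.map((m) => m.role)); // the oldest user message is dropped
```

Being "syntax aware" additionally means cutting on message boundaries rather than mid-message, so the remaining transcript still forms valid chat-template input.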
Shipped with `llama.cpp` release `b1892`.