
Releases: withcatai/node-llama-cpp

v2.8.4 (2024-01-20) · 57d83a2

v2.8.3 (2023-12-18) · ba975e9

Bug Fixes

v2.8.2 (2023-12-09) · 595a6bc

Bug Fixes

  • adapt to breaking changes of llama.cpp (#117) (595a6bc)

v2.8.1 (2023-12-06) · ceb538d

v3.0.0-beta.1 (2023-11-26) · 4757af8 · Pre-release

Features

BREAKING CHANGES

  • completely new API (docs will be updated before a stable version is released)

v2.8.0 (2023-11-06) · 190ef96

Features

v2.7.5 (2023-11-05) · 8f29277

v2.7.4 (2023-10-25) · ff1644d

Bug Fixes

  • do not download redundant node headers (#80) (ff1644d)
  • improve cmake custom options handling (#80) (ff1644d)
  • do not set CMAKE_GENERATOR_TOOLSET for CUDA (#80) (ff1644d)
  • do not fetch information from GitHub when using a local git bundle (#80) (ff1644d)
  • GBNF JSON schema string const formatting (#80) (ff1644d)
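The last fix above concerns how string const values are formatted when a JSON schema is turned into a GBNF grammar. For context, here is a minimal sketch of how such a grammar is typically used with the 2.x API; the model path and schema are illustrative, and it assumes the 2.x-era LlamaJsonSchemaGrammar export and the grammar/maxTokens prompt options:

```typescript
import path from "path";
import {fileURLToPath} from "url";
import {
    LlamaModel, LlamaContext, LlamaChatSession, LlamaJsonSchemaGrammar
} from "node-llama-cpp";

const __dirname = path.dirname(fileURLToPath(import.meta.url));

// Load a local GGUF model (path is illustrative)
const model = new LlamaModel({
    modelPath: path.join(__dirname, "models", "model.Q4_K_M.gguf")
});
const context = new LlamaContext({model});
const session = new LlamaChatSession({context});

// A schema using a string `const`, the case covered by the fix above
const grammar = new LlamaJsonSchemaGrammar({
    type: "object",
    properties: {
        kind: {const: "greeting"},
        text: {type: "string"}
    }
} as const);

// Constrain generation to the schema, then parse the JSON answer
const answer = await session.prompt("Greet the user.", {grammar, maxTokens: 256});
console.log(grammar.parse(answer));
```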

Features

  • adapt to the latest llama.cpp interface (#80) (ff1644d)
  • print helpful information to help resolve issues when they happen (#80) (ff1644d)
  • make portable cmake on Windows more stable (#80) (ff1644d)
  • update CMakeLists.txt to match llama.cpp better (#80) (ff1644d)

v2.7.3 (2023-10-13) · 1cba701

v2.7.2 (2023-10-12) · dc88531

Features

  • minor: save and load history to chat command (#71) (dc88531)