
0.1.86

Released by @MarcusDunn on 02 Jan 23:56 · 9b6bccb

What's Changed

  • Include llama.cpp/ggml/src/ggml-metal when publishing llama-cpp-sys-2 by @babichjacob in #597
  • Bumped version to 0.1.86 by @github-actions in #595
  • chore(deps): bump clap from 4.5.19 to 4.5.22 by @dependabot in #593
  • Add sampling API back to LlamaTokenDataArray; Add DRY and XTC Samplers by @nkoppel in #594
  • chore(deps): bump anyhow from 1.0.93 to 1.0.94 by @dependabot in #599
  • chore(deps): bump encoding_rs from 0.8.34 to 0.8.35 by @dependabot in #600
  • chore(deps): bump cc from 1.2.2 to 1.2.3 by @dependabot in #601
  • chore(deps): bump clap from 4.5.22 to 4.5.23 by @dependabot in #602
  • Fix arguments of sampling methods; Make chat templates and detokenization more reliable by @nkoppel in #611
  • chore(deps): bump anyhow from 1.0.94 to 1.0.95 by @dependabot in #606
  • chore(deps): bump glob from 0.3.1 to 0.3.2 by @dependabot in #609
  • chore(deps): bump docker/setup-buildx-action from 3.7.1 to 3.8.0 by @dependabot in #608
  • Build for aarch64-linux-android by @AsbjornOlling in #605
  • chore(deps): bump cc from 1.2.3 to 1.2.6 by @dependabot in #610

New Contributors

Full Changelog: 0.1.85...0.1.86