Releases: AD2605/llama.cpp

b5259 (02 May 10:21, commit 2af6880)

llama-chat : reset glmedge chat template (#13253)

* reset glmedge chat template

* fix glmedge chat template

b5184 (24 Apr 16:27)

ggml : fix trailing whitespaces (#0)

b5180 (24 Apr 14:13, commit 13b4548)

cmake : do not include ./src as public for libllama (#13062)

* cmake : do not include ./src as public for libllama

ggml-ci

* cmake : rework tests

ggml-ci

* llguidance : remove unicode include

ggml-ci

* cmake : make c++17 private

ggml-ci
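
The b5180 entry is about CMake target scoping: consumers of libllama should see only the public headers, not the internal ./src tree, and the C++17 requirement should apply when building the library rather than being propagated to dependents. A minimal sketch of that general pattern, with an illustrative source list and directory layout rather than the project's actual CMakeLists.txt:

```cmake
# Sketch only: the target name "llama" matches the project, but the source
# list, directory layout, and generator expressions are illustrative.
add_library(llama src/llama.cpp)

# Export only the public headers; keep the internal ./src tree PRIVATE so it
# is not propagated to targets that link against libllama.
target_include_directories(llama
    PUBLIC  $<BUILD_INTERFACE:${CMAKE_CURRENT_SOURCE_DIR}/include>
            $<INSTALL_INTERFACE:include>
    PRIVATE ${CMAKE_CURRENT_SOURCE_DIR}/src)

# Require C++17 for building the library itself without forcing the standard
# onto consumers ("make c++17 private").
target_compile_features(llama PRIVATE cxx_std_17)
```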

b5064 (07 Apr 12:45, commit bd3f59f)

cmake : enable curl by default (#12761)

* cmake : enable curl by default

* no curl if no examples

* fix build

* fix build-linux-cross

* add windows-setup-curl

* fix

* shell

* fix path

* fix windows-latest-cmake*

* run: include_directories

* LLAMA_RUN_EXTRA_LIBS

* sycl: no llama_curl

* no test-arg-parser on windows

* clarification

* try riscv64 / arm64

* windows: include libcurl inside release binary

* add msg

* fix mac / ios / android build

* will this fix xcode?

* try clearing the cache

* add bunch of licenses

* revert clear cache

* fix xcode

* fix xcode (2)

* fix typo
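
The b5064 entry turns libcurl support on by default so that downloading models over HTTP(S) works out of the box. A minimal sketch of the resulting configuration knob, assuming the option keeps the LLAMA_CURL name used by the project; the description string and surrounding logic are illustrative, not the project's actual CMakeLists.txt:

```cmake
# Sketch only: LLAMA_CURL is the project's flag name, but the description
# string and surrounding logic here are illustrative.
option(LLAMA_CURL "llama: use libcurl to download models from a URL" ON)

if (LLAMA_CURL)
    find_package(CURL REQUIRED)
endif()
```

With the default flipped to ON, systems without libcurl can still opt out at configure time, e.g. `cmake -B build -DLLAMA_CURL=OFF`.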