Releases · AD2605/llama.cpp
b5259
llama-chat : reset glmedge chat template (#13253)
* reset glmedge chat template
* fix glmedge chat template
b5184
ggml : fix trailing whitespaces (#0)
b5180
cmake : do not include ./src as public for libllama (#13062)
* cmake : do not include ./src as public for libllama (ggml-ci)
* cmake : rework tests (ggml-ci)
* llguidance : remove unicode include (ggml-ci)
* cmake : make c++17 private (ggml-ci)
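In CMake terms, this release moves ./src from the library's public include path to a private one, so consumers of libllama no longer see internal headers, and the C++17 requirement is likewise kept private to the target. A minimal sketch of that pattern (target name and paths are illustrative, not the repository's actual build files):

```cmake
# Sketch only: internal headers under ./src stay PRIVATE so they are not
# propagated to targets that link the library; the public API headers
# remain PUBLIC, and the C++17 requirement is not exported either.
add_library(llama src/llama.cpp)

target_include_directories(llama
    PUBLIC  include   # public API headers
    PRIVATE src)      # internal headers, not exported to consumers

target_compile_features(llama PRIVATE cxx_std_17)
```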
b5064
cmake : enable curl by default (#12761)
* cmake : enable curl by default
* no curl if no examples
* fix build
* fix build-linux-cross
* add windows-setup-curl
* fix
* shell
* fix path
* fix windows-latest-cmake*
* run: include_directories
* LLAMA_RUN_EXTRA_LIBS
* sycl: no llama_curl
* no test-arg-parser on windows
* clarification
* try riscv64 / arm64
* windows: include libcurl inside release binary
* add msg
* fix mac / ios / android build
* will this fix xcode?
* try clearing the cache
* add bunch of licenses
* revert clear cache
* fix xcode
* fix xcode (2)
* fix typo
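The headline change here flips the libcurl integration to on-by-default. A hedged sketch of what a default-on curl option can look like in CMake (the option name follows the entry above, but the target and compile definition used here are assumptions, not the repository's actual build logic):

```cmake
# Sketch only: build with libcurl unless the user explicitly opts out.
option(LLAMA_CURL "llama: use libcurl" ON)

if (LLAMA_CURL)
    find_package(CURL REQUIRED)
    # Hypothetical target/define names for illustration.
    target_compile_definitions(common PUBLIC LLAMA_USE_CURL)
    target_link_libraries(common PRIVATE CURL::libcurl)
endif()
```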