llama.cpp and LoRA - forcing 16-bit or a merge is kind of defeating the purpose of a separate LoRA file? #4317

cmp-nct started this conversation in General
Replies: 2 comments
