Best Local LLM for German Language? #17
Replies: 3 comments 5 replies
-
I found many models to perform quite badly on German, especially on tagging. But there seems to be some movement towards models optimized for German: https://ollama.com/cas/discolm-mfto-german
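For anyone who wants to try German tagging with such a model, here is a minimal sketch of querying it through Ollama's HTTP API (`/api/generate` on the default local port). The model tag comes from the link above; the German prompt wording and the tag names are illustrative assumptions, not anything prescribed by the model.

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # default Ollama endpoint
MODEL = "cas/discolm-mfto-german"  # model from the link above


def build_tagging_prompt(text: str, tags: list[str]) -> str:
    """Build a German prompt asking the model to pick exactly one tag."""
    return (
        "Ordne dem folgenden Dokument genau einen der Tags zu.\n"
        f"Tags: {', '.join(tags)}\n"
        f"Dokument: {text}\n"
        "Antworte nur mit dem Tag."
    )


def classify(text: str, tags: list[str]) -> str:
    """Send the prompt to a locally running Ollama server (assumed running)."""
    payload = json.dumps({
        "model": MODEL,
        "prompt": build_tagging_prompt(text, tags),
        "stream": False,  # get one complete JSON response instead of a stream
    }).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"].strip()
```

Usage would be something like `classify("Ihre Stromrechnung für März liegt bei.", ["Rechnung", "Vertrag", "Werbung"])` with the server and model already pulled via `ollama pull`.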
-
Wow, this discussion is exactly what I was wondering about. I have a 3070 with 8 GB VRAM available and was wondering if I need more VRAM to be more precise 🫠 These ones are kind of okay:
I tried all of these (German, translation below):
_Translated with ChatGPT:_
-
I have now run some more tests, and qwen2.5-14b in K5 quant is relatively good. It uses more than only the GPU and is relatively slow, but okay with 32 GB RAM (I have 64 GB installed). I tested all of these:
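A rough rule of thumb for why a 14B model at a Q5-style quant spills past an 8 GB card and ends up partly in system RAM: the weights alone take roughly params × bits-per-weight / 8 bytes, before KV cache and runtime overhead. A small sketch of that arithmetic (the ~5.5 bits/weight figure for Q5_K-style quants is an approximation, not a measurement):

```python
def approx_weight_gb(params_billions: float, bits_per_weight: float) -> float:
    """Approximate memory for the model weights alone, in GB.

    Ignores KV cache, activations, and runtime overhead, so real usage
    is somewhat higher than this estimate.
    """
    return params_billions * 1e9 * bits_per_weight / 8 / 1e9


# A 14B model at ~5.5 bits/weight (typical for Q5_K-style quants):
q5_14b = approx_weight_gb(14, 5.5)
print(f"{q5_14b:.1f} GB")  # prints "9.6 GB" -- more than an 8 GB card,
# so the runtime offloads some layers to CPU/RAM, which explains the slowdown
```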
-
Which LLM do you think is best for performing classifications locally with Ollama? (It should run within a maximum of 24 GB VRAM.)