Replies: 2 comments
- Also on a 12 GB 3060, using Phi4 with "OK" results. I had high hopes for gemma3:12b, but got a lot of responses back in German even though the documents were clearly English, and I'm not sure why. If I take the same model, copy the default paperless-AI prompt exactly, and paste it into open-webui, I get English results from the same documents that paperless-AI tagged with German names/tags. Ollama had some issues with Gemma, so maybe I'll come back and try again in another month.
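A minimal sketch of how that comparison could be reproduced directly against a local Ollama instance, bypassing both paperless-AI and open-webui. This assumes a stock install listening on the default port; `build_request` and the system-prompt wording are illustrative, not part of paperless-AI, but `model`, `system`, `prompt`, and `stream` are standard fields of Ollama's `/api/generate` endpoint.

```python
import json

# Default Ollama REST endpoint (assumption: a stock local install).
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> dict:
    """Build a payload for Ollama's /api/generate endpoint that pins
    the reply language via the system message."""
    return {
        "model": model,
        # Explicit language instruction; without one, gemma3:12b reportedly
        # drifted into German even on clearly English documents.
        "system": "You are a document tagging assistant. Always respond in English.",
        "prompt": prompt,
        "stream": False,  # one JSON response instead of a token stream
    }

if __name__ == "__main__":
    payload = build_request("gemma3:12b", "Suggest three tags for this invoice text: ...")
    print(json.dumps(payload, indent=2))
    # Send with e.g.: requests.post(OLLAMA_URL, json=payload, timeout=120)
```

If the same payload yields English here but German via paperless-AI, the difference likely lies in how paperless-AI assembles its prompt rather than in the model itself.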
- From my experience, Phi4:14b has been the best so far.
- Hi,
I'd like to ask which model you could recommend for processing documents on a locally hosted Ollama instance. I have assigned a 3060 with 12 GB of VRAM to my Ollama instance.
I migrated all of my documents from my old DMS into paperless and let Ollama with Phi4:latest do some recognition overnight. The results were OK, but I think they could be better.