[Bug]: LLM creates JSON with individual keys/field names and can't be used #387
-
This is not an issue with paperless-ai; the reason and the solution are in the logs you posted. That's completely up to the LLM you use. Closed as not a bug.
-
First off, thank you for the great support and active commenting here! That is not always a given! Maybe you can convert this into a discussion; I feel this could also be an issue for other Ollama users, and others might benefit from similar experiences (even if this is not code-related). Will the mentioned PR get merged, or do I have to find another solution (if it even resolves my issue)? Would you mind answering my question: in which way is the Prompt Description from the settings page (vs. the hard-coded prompt in config.js) used?
-
🔍 Bug Summary
Using the standard prompt, some documents entice the LLM to build a JSON object with keys other than title, correspondent, and so on.
📖 Description
Even adding explicit instructions to the prompt to use only the provided keys doesn't change this for these documents.
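For illustration, the off-spec output looks roughly like this (the key names here are invented to show the pattern; the expected keys are title, correspondent and so on):

```json
{
  "document_title": "Invoice 2024-03",
  "sender": "ACME GmbH",
  "category": "invoice"
}
```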
Model:
qwen2.5:14b
My last tried prompt:
Structured output for Ollama could probably fix this:
#381
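For reference, a minimal sketch of what structured output against Ollama's /api/chat endpoint could look like. It assumes a recent Ollama version that accepts a JSON schema in the `format` field; the schema and field names below are only illustrative and are not taken from paperless-ai's actual code:

```js
// Minimal sketch (Node 18+, ES module): constrain Ollama's reply to a fixed
// set of keys via a JSON schema passed in the "format" field.
// The field list below is illustrative, not paperless-ai's real schema.
const schema = {
  type: 'object',
  properties: {
    title: { type: 'string' },
    correspondent: { type: 'string' },
    tags: { type: 'array', items: { type: 'string' } },
  },
  required: ['title', 'correspondent', 'tags'],
  additionalProperties: false, // disallow invented keys
};

const response = await fetch('http://localhost:11434/api/chat', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    model: 'qwen2.5:14b',
    messages: [{ role: 'user', content: 'Analyze this document ...' }],
    format: schema, // structured output: reply must match the schema
    stream: false,
  }),
});

const data = await response.json();
console.log(JSON.parse(data.message.content));
```

With `required` and `additionalProperties: false`, the model should no longer be able to swap in its own key names, regardless of how the prompt is phrased.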
🔄 Steps to Reproduce
✅ Expected Behavior
❌ Actual Behavior
📜 Docker Logs
📜 Paperless-ngx Logs
🖼️ Screenshots of your settings page
No response
🖥️ Desktop Environment
Windows
💻 OS Version
No response
🌐 Browser
None
🔢 Browser Version
No response
🌐 Mobile Browser
No response
📝 Additional Information
📌 Extra Notes
After looking into this pull request (#381), I wondered why the prompt is also hard-coded in https://github.com/clusterzx/paperless-ai/blob/main/config/config.js
In which way is the Prompt Description (maybe also misleadingly named) from the settings page used?