
Issue with LlamaParse ... #627


Open
salahuddinfa opened this issue Feb 22, 2025 · 1 comment
Labels
bug Something isn't working

Comments


salahuddinfa commented Feb 22, 2025

Describe the bug
When using LVM mode with gemini-flash-2.0, I am getting NO_CONTENT_HERE on some pages.

Files
Unable to attach the files here.

Job ID
9e284cfb-4f2d-4f54-83b6-c43f55e7fc2b

Client:

  • Frontend (cloud.llamaindex.ai)

Additional context
I used LVM mode with gemini-flash-2.0; other than that, I didn't change any settings or use any prompts. I have only used it in LlamaCloud and still need to test via the API.
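Until the parsing issue is resolved, affected pages can at least be detected downstream. The sketch below assumes the parse result is available as a list of per-page text strings, with NO_CONTENT_HERE (the placeholder reported above) appearing as the entire text of a failed page; both the page structure and the helper name are assumptions for illustration, not a confirmed LlamaParse output format.

```python
# Hypothetical post-processing sketch: separate pages whose entire text is
# the "NO_CONTENT_HERE" placeholder from usable pages, so the empty ones
# can be logged or re-submitted for parsing.
NO_CONTENT_MARKER = "NO_CONTENT_HERE"

def split_empty_pages(pages):
    """Return (usable_pages, empty_page_indices) from a list of page texts."""
    usable, empty_indices = [], []
    for i, text in enumerate(pages):
        if text.strip() == NO_CONTENT_MARKER:
            empty_indices.append(i)  # candidate for a retry pass
        else:
            usable.append(text)
    return usable, empty_indices

pages = ["Page one text.", "NO_CONTENT_HERE", "Page three text."]
usable, empty = split_empty_pages(pages)
print(usable)  # ['Page one text.', 'Page three text.']
print(empty)   # [1]
```

Tracking the indices (rather than silently dropping pages) keeps the original page numbering available when reporting which pages failed, e.g. in a follow-up to this issue.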

@salahuddinfa salahuddinfa added the bug Something isn't working label Feb 22, 2025

arunpkm commented Apr 5, 2025

@salahuddinfa Any update on this issue? I am also encountering it often.
