Replies: 2 comments
-
That's a very good idea 💡
-
I got a couple of answers in the new session https://learn.deeplearning.ai/langchain-chat-with-your-data. Thanks a lot for educating people with a very good session.
-
We are trying to fetch the context and answers from the LLM as below.
Sometimes this API retrieves a large context that exceeds the token limit.
I am using the in-memory DocArray store for the context search.
I know we can limit the tokens manually: run the query, iterate over the resulting documents, check the token length, and only then combine them.
Is there any way to enforce this limit through the chain APIs? Also, how can I switch the chain_type strategy on the fly, since different documents need different strategies?
LangChain is a mind-blowing framework that simplifies working with LLMs. Thanks a lot for such a framework.
I would really appreciate any pointers on my questions.
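The manual approach described above can be sketched in plain Python: keep documents in retrieval order until a token budget is spent, and pick a combine strategy from the total size. This is a minimal illustration, not LangChain's own implementation; the `Document` class and whitespace tokenizer below are stand-ins (a real setup would count tokens with the model's tokenizer, e.g. tiktoken, and on the chain side you may also find options such as capping the retriever's `k` via `search_kwargs`, or the `max_tokens_limit` parameter that some retrieval chains expose).

```python
# Sketch of the manual token-limiting loop described in the question.
# Document is a minimal stand-in for LangChain's Document; count_tokens
# is a placeholder tokenizer (swap in tiktoken for accurate counts).

from dataclasses import dataclass


@dataclass
class Document:
    page_content: str


def count_tokens(text: str) -> int:
    # Placeholder: whitespace split instead of a real tokenizer.
    return len(text.split())


def limit_context(docs: list[Document], max_tokens: int) -> list[Document]:
    """Keep documents in retrieval order until the token budget is spent."""
    kept, used = [], 0
    for doc in docs:
        n = count_tokens(doc.page_content)
        if used + n > max_tokens:
            break
        kept.append(doc)
        used += n
    return kept


def pick_chain_type(docs: list[Document], stuff_limit: int) -> str:
    """Choose a combine strategy from the total context size:
    'stuff' when everything fits in one prompt, 'map_reduce' otherwise."""
    total = sum(count_tokens(d.page_content) for d in docs)
    return "stuff" if total <= stuff_limit else "map_reduce"


docs = [Document("alpha " * 50), Document("beta " * 50), Document("gamma " * 50)]
print(len(limit_context(docs, max_tokens=120)))  # 2 documents fit the budget
print(pick_chain_type(docs, stuff_limit=100))    # map_reduce
```

The same `pick_chain_type` result could then be passed to something like `RetrievalQA.from_chain_type(..., chain_type=...)` per query, which is one way to switch strategies on the fly without rebuilding the rest of the pipeline.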