Replies: 1 comment 2 replies
-
Hey there, @bing0037! I found a similar solved discussion that might be helpful: Issue with LangChain Misclassifying gpt-3.5-turbo-instruct as Chat Model. The solution suggested there is to upgrade to a newer version of LangChain, as the model handling logic has been updated in later versions [1]. Regarding your specific question about using other models with ContextualCompressionRetriever: chat models such as gpt-4 do work as the compressor LLM, as long as they are wrapped with ChatOpenAI rather than the completion-style OpenAI class.
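A minimal sketch of one such configuration, assuming a recent LangChain with the langchain-openai package installed (the FAISS store and sample text below are placeholders, not part of the original reply):

from langchain.retrievers import ContextualCompressionRetriever
from langchain.retrievers.document_compressors import LLMChainExtractor
from langchain_community.vectorstores import FAISS
from langchain_openai import ChatOpenAI, OpenAIEmbeddings

# Any existing retriever works as the base; a tiny FAISS store is used here for illustration.
base_retriever = FAISS.from_texts(
    ["LangChain supports many model providers."], OpenAIEmbeddings()
).as_retriever()

# Wrap a chat model (e.g. gpt-4) with ChatOpenAI instead of relying on the
# completion-style default (gpt-3.5-turbo-instruct).
llm = ChatOpenAI(model="gpt-4", temperature=0)
compressor = LLMChainExtractor.from_llm(llm)

compression_retriever = ContextualCompressionRetriever(
    base_compressor=compressor,
    base_retriever=base_retriever,
)
docs = compression_retriever.get_relevant_documents("Which providers does LangChain support?")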
These configurations allow you to use different models effectively with ContextualCompressionRetriever.
-
Checked other resources
Commit to Help
Example Code
Description
The default model for the OpenAI LLM (llm = OpenAI(temperature=0)) is 'gpt-3.5-turbo-instruct', but I can't access 'gpt-3.5-turbo-instruct' for some reason, so I used 'gpt-4' instead. This leads to an error. Are there any other models that can be used with ContextualCompressionRetriever?
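Roughly, the setup looks like this (import paths vary across LangChain versions; the FAISS store and sample text are placeholders, not my actual code):

from langchain.retrievers import ContextualCompressionRetriever
from langchain.retrievers.document_compressors import LLMChainExtractor
from langchain_community.vectorstores import FAISS
from langchain_openai import OpenAI, OpenAIEmbeddings

base_retriever = FAISS.from_texts(["placeholder document"], OpenAIEmbeddings()).as_retriever()

# Swapping the model name to gpt-4 on the completion-style OpenAI wrapper is
# what triggers the error, since gpt-4 is a chat model.
llm = OpenAI(temperature=0, model_name="gpt-4")
compressor = LLMChainExtractor.from_llm(llm)
retriever = ContextualCompressionRetriever(
    base_compressor=compressor, base_retriever=base_retriever
)
retriever.get_relevant_documents("test query")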
Here is the error:
System Info
System Information
Package Information
Optional packages not installed
Other Dependencies