Replies: 1 comment
You can call whichever model you want in the OpenAI constructor; just add the model name when you create the instance.
What is the default model for an OpenAI instance?
I am playing around with LangChain and vector embeddings. Right now, I am using RetrievalQAChain to answer questions from embedded documents.
const qaChain = RetrievalQAChain.fromLLM(llm, vectorStore.asRetriever())
Here is how I initialize an OpenAI instance.
When looking at the Usage page in the OpenAI dashboard, I saw calls attributed to text-davinci, which I can't find on the OpenAI pricing page for cost-calculation purposes.