Can DocumentSummaryIndex be used with local langchain llm? #9780
Been trying to see if I can set up a DocumentSummaryIndex, but I keep getting an error message saying I need to set the OpenAI API key even though I am passing a LangChainLLM to my service context. Does the DocumentSummaryIndex only work with OpenAI right now? Code to show you what I have set up; it's possible I have a bug I'm just not seeing lol.
Replies: 1 comment 1 reply
🤖 Hello @GeorgeDittmar! I'm Dosu, a friendly bot here to assist you with your LlamaIndex queries or issues while we wait for a human maintainer to join us. I'm designed to help you solve bugs, answer questions, and guide you on how to become a contributor. Looking forward to our interaction!

Based on the information you've provided and the similar issues I found in the LlamaIndex repository, the error message about setting the OpenAI API key most likely means the OpenAI LLM is still being used somewhere in your pipeline, possibly as a default setting.

To use a local LangChainLLM, instantiate a ServiceContext with the local LLM:

```python
from llama_index.llms import LangChainLLM
from llama_index import ServiceContext

llm = LangChainLLM()  # replace with your local LLM initialization
service_context = ServiceContext.from_defaults(llm=llm)
```

If you are still receiving an error about the OpenAI API key, another part of your pipeline may be creating its own default ServiceContext. You should ensure that all parts of your pipeline are correctly configured to use the local LLM. Based on the issues I found, setting a global service context at the beginning of your code might help:

```python
from llama_index import set_global_service_context

set_global_service_context(service_context)
```

This should prevent any function from creating a ServiceContext using defaults and thus avoid the OpenAI-key-missing error.
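The pattern behind this behavior can be sketched in plain Python. Note this is not LlamaIndex source code; every name below is invented for illustration. The idea is that a component not handed an explicit service context falls back to a global one, and only as a last resort builds a default that requires an OpenAI key:

```python
# Illustrative sketch (NOT LlamaIndex code): how a missing explicit
# service context falls back to a global default, and finally to a
# built-in default that needs an OpenAI key. All names are hypothetical.

GLOBAL_CONTEXT = None  # stand-in for the value set by set_global_service_context

class MissingKeyError(RuntimeError):
    pass

def default_context():
    # Stand-in for the library constructing a default OpenAI-backed LLM.
    raise MissingKeyError("OPENAI_API_KEY not set")

class Index:
    def __init__(self, service_context=None):
        # Resolution order: explicit argument, then global, then default.
        if service_context is not None:
            self.context = service_context
        elif GLOBAL_CONTEXT is not None:
            self.context = GLOBAL_CONTEXT
        else:
            self.context = default_context()

# An explicitly passed context bypasses the default entirely:
print(Index(service_context="local-llm-context").context)  # local-llm-context

# A component created without one hits the default path and fails:
try:
    Index()
except MissingKeyError as exc:
    print(exc)  # OPENAI_API_KEY not set
```

This is why either passing the service context explicitly to every component or setting the global one once at startup silences the key error.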
I hope this helps! If you have any other questions or need further clarification, feel free to ask.
This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research. Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant. If you want to continue the conversation, start your reply with @dosu-bot.