Unable to use LlamaIndex after OpenAI update #8896
Replies: 3 comments 1 reply
-
🤖 I see that you're having trouble with LlamaIndex's SubQuestionQueryEngine after an update from OpenAI. The error message indicates that openai.ChatCompletion is no longer supported in openai>=1.0.0. As per the dependencies of LlamaIndex v0.8.68, it requires OpenAI version >=1.1.0, which could be why downgrading the OpenAI version causes other areas of LlamaIndex to stop working.
For the second error, there appears to be a typo in the URL 'https:/raw.githubusercontent.com/run-llama/llama-hub/main/llama_hub/library.json': it should be 'https://raw.githubusercontent.com/run-llama/llama-hub/main/llama_hub/library.json', with two slashes after 'https:'.
To resolve these issues, I would suggest upgrading OpenAI to >=1.1.0 alongside LlamaIndex v0.8.68 rather than downgrading, and making sure the library.json URL is requested with the full 'https://' prefix.
Please try these steps and let me know if the issues persist.
This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research. Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant. If you want to continue the conversation, start your reply with @dosu-bot.
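To confirm which combination is actually installed, a quick version check like the following can help (a minimal sketch; it only assumes both packages import cleanly):

    # Print installed versions; llama_index >= 0.8.68 expects openai >= 1.1.0.
    import openai
    import llama_index

    print("openai:", openai.__version__)
    print("llama_index:", llama_index.__version__)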
-
I have the same issue. @NitishDixit, have you found a solution?
-
I think LlamaIndex needs to be migrated as described in openai/openai-python#742.
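For reference, the migration discussed there replaces the removed openai.ChatCompletion call with the new client interface, roughly like this (a minimal sketch for openai >= 1.0.0; the model name and message are placeholders):

    # Old style (openai < 1.0.0) -- this is the call that now raises the error above:
    #   import openai
    #   response = openai.ChatCompletion.create(model="gpt-3.5-turbo", messages=[...])

    # New style (openai >= 1.0.0):
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": "Question about document"}],
    )
    print(response.choices[0].message.content)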
-
Hello,
I have a strange situation here. I was using LlamaIndex's SubQuestionQueryEngine to build a RAG pipeline, and it was working fine until an update from OpenAI. Now when I try to do the following:
    # Build the sub-question engine from the query engine tools and run a query.
    from llama_index.query_engine import SubQuestionQueryEngine

    s_engine = SubQuestionQueryEngine.from_defaults(
        query_engine_tools=query_engine_tools, use_async=False
    )
    response = s_engine.query("Question about document")
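(For context, query_engine_tools is a list of QueryEngineTool objects. A minimal sketch of how such a list is typically built on llama_index 0.8.x follows; the file path, tool name, and description are illustrative, not from the original post.)

    # Illustrative construction of query_engine_tools on llama_index 0.8.x.
    from llama_index import SimpleDirectoryReader, VectorStoreIndex
    from llama_index.tools import QueryEngineTool, ToolMetadata

    documents = SimpleDirectoryReader(input_files=["document.pdf"]).load_data()
    index = VectorStoreIndex.from_documents(documents)

    query_engine_tools = [
        QueryEngineTool(
            query_engine=index.as_query_engine(),
            metadata=ToolMetadata(
                name="document",
                description="Answers questions about the document",
            ),
        )
    ]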
I get the following error: "You tried to access openai.ChatCompletion, but this is no longer supported in openai>=1.0.0".
I tried downgrading the OpenAI version, but then other areas of LlamaIndex stop working. Can someone please help me figure out how this can be fixed?
Additionally, I tried using llama_index==0.8.62, but then it starts throwing an error on the following code itself:
    # Fetch the PDFReader loader from LlamaHub via download_loader.
    from pathlib import Path
    from llama_index.readers.download import download_loader

    PDFReader = download_loader("PDFReader")
Here is the error: InvalidURL: Invalid URL 'https:/raw.githubusercontent.com/run-llama/llama-hub/main/llama_hub/library.json': No host supplied
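(One possible workaround on that version is to skip download_loader and import the reader directly from the llama-hub package. This is only a sketch: it assumes `pip install llama-hub` and the llama_hub.file.pdf module layout of that era, and the file name is a placeholder.)

    # Possible workaround: import PDFReader directly from the installed llama-hub package
    # instead of fetching it through download_loader (assumes `pip install llama-hub`).
    from pathlib import Path
    from llama_hub.file.pdf.base import PDFReader

    loader = PDFReader()
    documents = loader.load_data(file=Path("document.pdf"))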
Thanks