Couldn't initialize an LLM managed by OpenLLM locally from current process #25243
Replies: 1 comment
Hey @bloomingseed! I'm here to help you with any bugs, questions, or contributions. Let's tackle this issue together!

To resolve the local initialization error, it helps to look at the relevant part of the LangChain wrapper where the OpenLLM runner is created:

```python
runner = openllm.Runner(
    model_name=model_name,
    model_id=model_id,
    init_local=embedded,    # embedded=True loads the model in the current process
    ensure_available=True,  # fetches the model weights if they are missing
    **llm_kwargs,
)
```

If you encounter the error during this call, check that the `model_name` and `model_id` you pass are ones your installed OpenLLM version supports, since `ensure_available=True` will try to fetch the model. If the failure comes from an outdated package, upgrading may help: `pip install --upgrade openllm`. If the issue persists, you may need to consult the OpenLLM documentation or open an issue upstream.
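If embedded (in-process) initialization keeps failing, a common workaround is to run the model in a standalone OpenLLM server and point the LangChain wrapper at it instead. Below is a minimal sketch, assuming an OpenLLM version whose CLI supports `openllm start` and whose LangChain wrapper accepts a `server_url` argument; the model name and port are illustrative (3000 is OpenLLM's usual default):

```python
# Workaround sketch: talk to a standalone OpenLLM server instead of
# loading the model inside the current process.
#
# First, in a separate terminal (model name is illustrative):
#   openllm start dolly-v2
from langchain_community.llms import OpenLLM

# server_url is an assumption based on OpenLLM's default local port;
# adjust it to wherever your server actually listens.
llm = OpenLLM(server_url="http://localhost:3000")
print(llm.invoke("Hello, world!"))
```

This sidesteps in-process model loading entirely, which can also make memory issues easier to diagnose since the server logs are separate from your application.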
Original post
Example Code
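Per the description below, this was the OpenLLM example from the LangChain docs page, which looked approximately like the following sketch; the model, parameters, and prompt come from the docs example and are not necessarily the poster's exact snippet:

```python
# Approximate reconstruction of the LangChain docs example for OpenLLM,
# running the model embedded in the current process.
from langchain_community.llms import OpenLLM

llm = OpenLLM(
    model_name="dolly-v2",
    model_id="databricks/dolly-v2-3b",
    temperature=0.94,
    repetition_penalty=1.2,
)
print(llm.invoke("What is the difference between a duck and a goose?"))
```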
Description
I couldn't initialize an LLM managed by OpenLLM locally from the current process. Running the example code from the OpenLLM docs page resulted in an error during model initialization.
System Info
`openllm -v` output