Custom Embedding Model and LLM #9169
Replies: 2 comments
-
🤖 Hi @aniketmoha9, great to see you again! I hope you're doing well. Let's take a look at your question.

To use your local LLM and local embedding model in any LlamaHub pack, you need to instantiate them and pass them to a `ServiceContext`:

```python
from llama_index import ServiceContext

# For a local LLM
service_context_llm = ServiceContext.from_defaults(llm="local")

# For a local embedding model (the parameter is named embed_model)
service_context_embedding = ServiceContext.from_defaults(embed_model="local")
```

In the above code, `"local"` tells LlamaIndex to fall back to its default local models. However, the context provided does not specify how to use particular models like Mistral, Zephyr, or BGE, so the exact way to use those models in LlamaIndex is not clear from the given context. For more specific instructions, refer to the documentation or usage guides for those models. If they are compatible with LlamaIndex, you should be able to use them in a similar way to the code above.

If you need further assistance, please provide more details about the models you're trying to use and any error messages you're encountering.

This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research. Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant. If you want to continue the conversation, start your reply with @dosu-bot.
-
The bot is nearly correct -- yea, you just set things in the service context, or pass in LLMs directly when needed. The docs cover all supported LLMs, some open-source LLM notebooks, and all supported embeddings.
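To make that concrete, here is a minimal sketch of wiring the models from the question into one service context. The specifics are assumptions, not from the thread: it assumes llama-index 0.9.x (where `ServiceContext` is the configuration object), a locally running Ollama server with the `mistral` model pulled, and the `BAAI/bge-small-en-v1.5` embedding model from Hugging Face.

```python
from llama_index import ServiceContext
from llama_index.llms import Ollama
from llama_index.embeddings import HuggingFaceEmbedding

# Local LLM: Mistral served by a locally running Ollama instance
# (assumed setup; use model="zephyr" for Zephyr instead)
llm = Ollama(model="mistral")

# Local embedding model: a BGE model downloaded from Hugging Face
embed_model = HuggingFaceEmbedding(model_name="BAAI/bge-small-en-v1.5")

# Bundle both into a single service context
service_context = ServiceContext.from_defaults(llm=llm, embed_model=embed_model)
```

How you hand these to a LlamaHub pack varies: packs are fetched with `download_llama_pack` from `llama_index.llama_pack`, and each pack's constructor decides whether it accepts `llm`, `embed_model`, or a `service_context` keyword, so check the README of the specific pack you want to use.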
-
How can I specify my local LLM and local embedding model in any LlamaHub pack? I want to use Mistral or Zephyr as my LLM, and a BGE embedding model.