
@Bourne-alt Thank you for looking into Spezi LLM!

If you want to execute a model locally, I would advise you to take a look at the SpeziLLMLocal target. It allows you to execute models locally on your device; we demonstrate it as part of the UI Testing application in this repo using some open-weight models.
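
To give you a rough idea, the setup looks roughly like this. This is a minimal sketch following the pattern in the README; the exact type and initializer names (e.g., `LLMLocalSchema(model:)` and the `.llama3_8B_4bit` identifier) depend on the package version you are using, so treat them as placeholders:

```swift
import Spezi
import SpeziLLM
import SpeziLLMLocal
import SwiftUI

// Register the local execution platform with the LLM runner
// in the Spezi configuration of your app delegate.
class LLMLocalAppDelegate: SpeziAppDelegate {
    override var configuration: Configuration {
        Configuration {
            LLMRunner {
                LLMLocalPlatform()
            }
        }
    }
}

// Request a session from the runner and stream the generated tokens.
struct LLMLocalDemoView: View {
    @Environment(LLMRunner.self) var runner
    @State var responseText = ""

    var body: some View {
        Text(responseText)
            .task {
                do {
                    // The schema selects the local model to execute; the
                    // concrete initializer arguments vary across versions.
                    let llmSession: LLMLocalSession = runner(
                        with: LLMLocalSchema(model: .llama3_8B_4bit)
                    )

                    for try await token in try await llmSession.generate() {
                        responseText.append(token)
                    }
                } catch {
                    // Handle generation errors as appropriate for your app.
                }
            }
    }
}
```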

Since the model weights are quite large (which they are for all the models we are using), I would also suggest looking into SpeziLLMLocalDownload; it gives you the ability to download the model to the device at app start instead of bundling it with your app.
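
A rough sketch of how that can look, using the `LLMLocalDownloadView` from the SpeziLLMLocalDownload target. The initializer parameters (including the `model:` identifier and the completion closure) differ between package versions, so again treat these as placeholders rather than the definitive API:

```swift
import SpeziLLMLocalDownload
import SwiftUI

// Presents a download screen that fetches the model weights when the app
// is first launched, so the multi-gigabyte weights don't ship in the bundle.
struct LLMLocalDownloadScreen: View {
    var body: some View {
        LLMLocalDownloadView(
            model: .llama3_8B_4bit,  // example identifier; pick the model you need
            downloadDescription: "The model will be downloaded to your device."
        ) {
            // Continue to your chat / main UI once the download completes.
        }
    }
}
```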

I encourage you to look through the README and documentation to explore these two targets in SpeziLLM; they might be helpful. In addition, we more than welcome PRs to this package…

Answer selected by PSchmiedmayer
This discussion was converted from issue #55 on June 29, 2024 23:27.