How to turn a gguf format into a llamafile format. #642
Unanswered · circle-games asked this question in Q&A
Replies: 2 comments · 2 replies
- Please see https://github.com/Mozilla-Ocho/llamafile?tab=readme-ov-file#creating-llamafiles
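The linked README section describes packing a .gguf into a llamafile. As a rough sketch of those steps: copy the `llamafile` release binary under a `.llamafile` name, write a `.args` file with the default command-line arguments, and embed both the weights and the `.args` with the `zipalign` tool from the repo. The model name below is a placeholder; substitute your own .gguf.

```shell
# Sketch of the packaging steps from the llamafile README.
# "mymodel.gguf" is a placeholder; substitute your own weights file.
MODEL=mymodel.gguf

# 1. Write a .args file holding the default command-line arguments.
#    The literal "..." line marks where llamafile splices in any extra
#    arguments given at run time.
cat > .args <<EOF
-m
$MODEL
...
EOF

# 2. Start from the release binary (downloaded from the project's
#    releases page) and give it the .llamafile extension:
#      cp llamafile mymodel.llamafile
# 3. Pack the weights and the .args into the executable with zipalign,
#    which ships with the llamafile repo:
#      ./zipalign -j0 mymodel.llamafile $MODEL .args
```

After step 3, `mymodel.llamafile` is a self-contained executable; run it directly instead of passing a `-m` flag.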
2 replies
- Thanks for the connection and support.
Quoted reply from Yazan Agha-Schrader, Friday, November 29, 2024:
The idea of llamafile is to run a model as a standalone executable, without depending on Python or anything else. A llamafile is therefore not a model but a bundle of files zipped together.
If you want to use your model with llama-cpp-python, you have to point it at the .gguf file, not the .llamafile.
It is also possible to extract the model 'back' from the llamafile, by the way: #539 (comment)
- I have a model in GGUF format (link: https://huggingface.co/falan42/llama_lora_8b_medical_parallax_2_gguf/tree/main ) and I want to convert it to llamafile format. It would be better if this could be done on Colab. I would be thankful for any help.