TypeChat with llama.cpp #83
abichinger started this conversation in Show and tell

I wanted to try TypeChat without an API key. Luckily, llama-cpp-python already offers a web server that can be used as a drop-in replacement for the OpenAI API, so it was relatively easy to implement.

Result: TypeChat with Llama
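For anyone who wants to try the same thing, here is a minimal sketch of the setup (assumptions on my side: TypeChat's createLanguageModel reads OPENAI_API_KEY, OPENAI_MODEL and OPENAI_ENDPOINT from the environment, and the llama-cpp-python server runs with its defaults on port 8000; the schema, model path and request below are just placeholders):

```typescript
// Start the llama-cpp-python server first (placeholder model path, assumed default port 8000):
//   python3 -m llama_cpp.server --model ./models/llama-2-13b-chat.bin
//
// Then point TypeChat at it instead of api.openai.com:
//   export OPENAI_API_KEY=sk-local   # the local server ignores the key
//   export OPENAI_MODEL=llama-2-13b-chat
//   export OPENAI_ENDPOINT=http://localhost:8000/v1/chat/completions

import { createJsonTranslator, createLanguageModel } from "typechat";

// Toy schema used for illustration; TypeChat receives it as plain source text.
interface Order {
    items: { name: string; quantity: number }[];
}

const schema = `
export interface Order {
    items: { name: string; quantity: number }[];
}
`;

async function main() {
    // Reads the OPENAI_* variables above and sends chat completion requests to the local server.
    const model = createLanguageModel(process.env);
    const translator = createJsonTranslator<Order>(model, schema, "Order");

    const response = await translator.translate("I'd like two margherita pizzas");
    if (response.success) {
        console.log(JSON.stringify(response.data, undefined, 2));
    } else {
        console.error(response.message);
    }
}

main().catch(console.error);
```

Because only OPENAI_ENDPOINT changes, the TypeChat side stays identical to the OpenAI setup, which is what makes the local server a drop-in replacement.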
Replies: 3 comments
- Did you try the Restaurant example with Llama 2, e.g. the 13B model? Any success on your side, @abichinger?
- Hi, I just tried it. It seems like the 13B model can't handle the restaurant example. Here is the output I got with
  Have you tried the 70B model?
- I cannot run 70B locally and still need to test it against a cloud instance.