Very Impressive! #46
MentalGear started this conversation in General
Replies: 0 comments
Just tried out the 1.5B model via Ollama and I'm super impressed by its speed and fluency!
I previously tried a 7B model that only generated about one token every 3 seconds on my test system (an Intel Mac), so this is a huge improvement.
By the way: maybe consider adding a Bluesky account. I wanted to send some praise your way, but I'm not supporting X/Twitter anymore due to obvious fascism.