Replies: 1 comment
-
Same here! Within 10 minutes I was up and running, and I'm slow! I have a 2080 Super, and it produces nearly as fast as the major cloud AIs. Stunning.
-
I have a gaming laptop with a Radeon GPU, and I had never managed to get it working for AI inference before.
Here it works great; I get blazing-fast output from Llama3.
Well done.