Recommend application solutions for large models #737
Caojia520Suiyng asked this question in General · Unanswered
Which setup gives faster inference for Qwen models: a Jetson Orin Super 8GB, or a Raspberry Pi with an external AMD RX6800XT, deploying the model with Ollama? Please give some suggestions. Thank you very much.
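One way to make the comparison concrete is to run the same prompt through Ollama on both setups and compare the tokens-per-second figures it reports. Below is a minimal sketch, assuming an Ollama server is already running on each device at the default port and a Qwen model has been pulled; the tag `qwen2.5:7b` and the prompt are just examples.

```python
# Minimal benchmark sketch (assumption: an Ollama server is running locally on
# each board at the default port 11434, and a Qwen model such as "qwen2.5:7b"
# has already been pulled with `ollama pull`).
import requests

OLLAMA_URL = "http://localhost:11434/api/generate"  # default Ollama endpoint
MODEL = "qwen2.5:7b"                                 # assumed model tag
PROMPT = "Explain the difference between a CPU and a GPU in two sentences."

resp = requests.post(
    OLLAMA_URL,
    json={"model": MODEL, "prompt": PROMPT, "stream": False},
    timeout=600,
).json()

# Ollama reports durations in nanoseconds; eval_count is the number of
# generated tokens, so tokens/second = eval_count / (eval_duration in seconds).
gen_tokens = resp["eval_count"]
gen_seconds = resp["eval_duration"] / 1e9
print(f"prompt eval: {resp['prompt_eval_count']} tokens "
      f"in {resp['prompt_eval_duration'] / 1e9:.2f} s")
print(f"generation:  {gen_tokens} tokens in {gen_seconds:.2f} s "
      f"({gen_tokens / gen_seconds:.1f} tok/s)")
```

Running the identical script against each machine would show the decode throughput of both options on the same model and quantization, which is what matters most for the question above.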