Foundry Friday AMA · Aug 08, 2025 · On-Device & Local AI #108
nitya asked this question in Ask Me Anything (AMA)
This is part of the #ModelMondays series where we put the spotlight on a new model-related topic each week.
🌟🌟 See #54 for the full Foundry Fridays AMA schedule 🌟🌟
Event Details
Want to run AI models on your own hardware for privacy, speed, or edge computing? This session spotlights Foundry Local, a solution built on ONNX Runtime for CPUs, NPUs, and GPUs. Maanav Dalal will share how to go from prototype to production with on-device inference, manage models locally, and leverage the flexibility and cost savings of local AI.
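For context, here is a minimal sketch of what on-device inference with ONNX Runtime (the runtime Foundry Local is built on) can look like. The model file, input shape, and provider preference list are illustrative assumptions, not Foundry Local's actual API; the session will cover the real workflow.

```python
# Minimal on-device inference sketch with ONNX Runtime.
# Assumptions: a local "model.onnx" file and a 1x3x224x224 float input.
import numpy as np
import onnxruntime as ort

# Prefer GPU/NPU execution providers when present, fall back to CPU.
preferred = ["CUDAExecutionProvider", "QNNExecutionProvider", "CPUExecutionProvider"]
available = ort.get_available_providers()
providers = [p for p in preferred if p in available] or ["CPUExecutionProvider"]

session = ort.InferenceSession("model.onnx", providers=providers)  # hypothetical local model

# Run one inference pass entirely on local hardware; no data leaves the device.
input_name = session.get_inputs()[0].name
dummy_input = np.random.rand(1, 3, 224, 224).astype(np.float32)  # shape is an assumption
outputs = session.run(None, {input_name: dummy_input})
print(outputs[0].shape)
```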
Related Resources