Add MLX support to localscore #779
Thanks for the interest in LocalScore! I believe Llamafile itself will not support MLX, as it's outside the scope of the project. However, LocalScore is independent of the backend used to run the benchmarks. So if someone writes an MLX backend that conforms to the LocalScore API specification, they should be able to submit results to the website. Right now that spec is not well documented outside of the code itself. In addition, to have it publicly available on the website I will need to tweak a few settings, and the website code and DB may need an upgrade to handle an additional backend. Still, it should be possible and relatively easy to do.

One of the dreams of LocalScore was to have third-party clients submit results, be that LM Studio, Ollama, or any other client. It would be great to have results submitted directly to LocalScore itself, and I would be happy to help support anyone in adding a new client to the website, as well as testing it. It would probably be best to move this discussion over to the LocalScore project specifically, since Llamafile is just one of the clients LocalScore can support (it just happens to have first-class support).
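To make the spec-conformance idea concrete, here is a minimal sketch of what a third-party submission client might look like. Everything here is invented for illustration — the endpoint URL, field names, and payload shape are assumptions, since the real spec currently lives only in the LocalScore code:

```python
# Hypothetical sketch of a third-party client submitting benchmark
# results to LocalScore. The endpoint URL, field names, and payload
# shape below are illustrative assumptions, NOT the real API.
import json
import urllib.request


def build_submission(model, backend, results):
    """Assemble a result payload (field names are placeholders)."""
    return {
        "model": model,      # model identifier the benchmark ran against
        "backend": backend,  # e.g. "mlx" once such a backend exists
        "results": results,  # per-test throughput measurements
    }


def submit(payload, url="https://localscore.example/api/results"):
    """POST the payload as JSON (URL is a placeholder, not the real API)."""
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    return urllib.request.urlopen(req)


payload = build_submission(
    "qwen3-30b-a3b-dwq",
    "mlx",
    [{"test": "pp512", "tokens_per_second": 123.4}],
)
print(json.dumps(payload, indent=2))
```

The point is just the shape of the workflow: a client runs the benchmarks however it likes (MLX, Ollama, LM Studio), then serializes results into whatever schema the spec ends up defining and POSTs them to the site.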
Moved to cjpais/LocalScore#20
Hi there! Thanks for writing llamafile and localscore! Both are awesome 🤌
Did a cursory search of the repo and didn't see any explicit mention of supporting mlx models. While @simonw has done some amazing work already, I like the existing gguf workflow that localscore implements for doing a bunch of comprehensive benchmarks with pretty output.
It could totally be user error on my part, but I don't think it's possible to use localscore with mlx models (e.g., qwen3-30b-a3b-dwq) so I'd have to use its gguf equivalent — if it exists.