NABLA - A tiny JAX-like library written in Mojo #28762
TilliFe asked this question in Show and tell
Replies: 1 comment
-
This is great, thanks for the comments here: nabla-ml/nabla#12. Feel free to use the discussion section of Nabla for any further discussions; let's keep Nabla cleanly separated from JAX! ;) https://github.com/orgs/nabla-ml/discussions
-
I'd like to share a project I've been working on these past few months that was directly inspired by JAX's approach to automatic differentiation. As someone who's always appreciated JAX's functional design philosophy for ML problems, I wanted to bring similar capabilities to Mojo, the new Python-like language designed to deliver performance comparable to systems languages like Rust.
Today I'm announcing a research preview of NABLA - a framework for differentiable programming in Mojo. Nabla aims to bring to Mojo what JAX and PyTorch brought to Python: a high-level API for common program transformations, including vmap, jit, vjp, jvp & grad.
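For readers less familiar with these transformations, here is what they look like in JAX itself — the API that Nabla takes as its model (this sketch uses JAX, not Nabla; Nabla's own surface may differ in this early preview):

```python
import jax
import jax.numpy as jnp

def f(x):
    # simple scalar-valued function: sum of squares
    return jnp.sum(x ** 2)

x = jnp.array([1.0, 2.0, 3.0])

# grad: reverse-mode gradient of a scalar function
g = jax.grad(f)(x)                               # 2 * x

# jit: trace the function once and compile it with XLA
y = jax.jit(f)(x)                                # 14.0

# vmap: vectorize f over a leading batch axis
batch = jnp.stack([x, 2 * x])
ys = jax.vmap(f)(batch)                          # [14.0, 56.0]

# jvp / vjp: forward- and reverse-mode AD primitives
_, tangent = jax.jvp(f, (x,), (jnp.ones(3),))    # 12.0
_, pullback = jax.vjp(f, x)
(cotangent,) = pullback(1.0)                     # same as grad(f)(x)
```

Because each of these is a function-to-function transformation, they compose freely (e.g. `jax.jit(jax.vmap(jax.grad(f)))`), which is the design property the post is referring to.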
Nabla was designed as a direct wrapper around Mojo and MAX (an XLA-like ML compiler ecosystem led by Chris Lattner) to leverage their performance characteristics. Rather than rebuilding the entire stack, Nabla focuses on providing transformations while relying on the underlying ecosystem for execution. The core AD engine has proven effective in initial testing, though there are still areas that need development (operator coverage, GPU support, etc.).
Feedback from the JAX community would be especially valuable as this project develops further, given how much NABLA's design philosophy owes to JAX's influence.
Code, examples and roadmap: github.com/nabla-ml/nabla
Documentation and homepage: nablaml.com
Follow updates on X: @nablaml