
Metagradients #17247

Answered by jakevdp
samskiter asked this question in General
Aug 23, 2023 · 1 comment · 4 replies

In general JAX transformations are composable, so you can use autodiff to compute the gradient of another function that uses autodiff.
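
As a minimal sketch of that composability (the names `inner_loss` and `outer_objective` are illustrative, not from this thread), here is a metagradient: an outer `jax.grad` differentiating an objective that itself calls `jax.grad` for an inner update step:

```python
import jax
import jax.numpy as jnp

def inner_loss(w, x):
    return jnp.sum((w * x - 1.0) ** 2)

def outer_objective(lr, w, x):
    # One inner gradient-descent step, then re-evaluate the loss.
    w_new = w - lr * jax.grad(inner_loss)(w, x)
    return inner_loss(w_new, x)

# Metagradient: differentiate the outer objective with respect to the
# learning rate. The two levels of autodiff compose transparently.
meta_grad = jax.grad(outer_objective)(0.1, jnp.ones(3), jnp.arange(3.0))
print(meta_grad)
```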

That said, there may be some limitations: for example, JAX does not implement reverse-mode autodiff of unbounded loops, so if your inner solver uses a while_loop for convergence, you'll be limited to forward-mode autodiff.
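
To illustrate the caveat with a toy sketch (the "solver" and threshold here are made up for illustration), forward-mode `jax.jacfwd` differentiates through `lax.while_loop`, while reverse-mode `jax.grad` raises an error because the trip count is dynamic:

```python
import jax
from jax import lax

def solve(x):
    # Toy iterative "solver": halve x until it drops below a threshold.
    return lax.while_loop(lambda v: v > 1.0, lambda v: 0.5 * v, x)

print(jax.jacfwd(solve)(8.0))  # forward-mode works: 0.125 here (three halvings)
# jax.grad(solve)(8.0)  # reverse-mode raises: while_loop has a dynamic trip count
```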

Answer selected by samskiter