Hello! Is there a way to "freeze" the parameters of the GP so that the optimizer does not update them? In particular, you could imagine we fit the GP to data and then want to use that model as part of a larger model in jax, but when we optimize the larger model we want to keep the parameters of the GP "fixed". Is there a performant way to do this in GPJax?

For example, in BayesNewton the constructor for a kernel takes inputs called fix_variance, fix_lengthscale, etc. that tell the model whether those parameters are to be included in gradient calculations.

Thanks!
Replies: 1 comment

Hi! Yes! You can do the below:
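A minimal sketch of one way to do this with plain JAX and optax rather than a GPJax-specific API: keep the already-fitted GP parameters in their own subtree of the larger model's parameter pytree and give that subtree a zero-update rule, so any gradients flowing to it are discarded. The pytree layout, loss_fn, and learning rate below are illustrative assumptions, not part of GPJax.

```python
import jax
import jax.numpy as jnp
import optax

# Illustrative parameter pytree: the fitted GP hyperparameters we want frozen,
# plus the remaining parameters of the larger model that should keep training.
params = {
    "gp": {"lengthscale": jnp.array(1.0), "variance": jnp.array(0.5)},
    "rest": {"w": jnp.zeros(3)},
}

def loss_fn(params, x):
    # Stand-in objective; in practice this would evaluate the larger model
    # that uses the frozen GP internally.
    return jnp.sum((params["rest"]["w"] * x - params["gp"]["variance"]) ** 2)

# Label every leaf, then route the "frozen" leaves to set_to_zero() so their
# updates are always zero while the "train" leaves use Adam as normal.
param_labels = {
    "gp": jax.tree_util.tree_map(lambda _: "frozen", params["gp"]),
    "rest": jax.tree_util.tree_map(lambda _: "train", params["rest"]),
}
optimizer = optax.multi_transform(
    {"train": optax.adam(1e-2), "frozen": optax.set_to_zero()},
    param_labels,
)
opt_state = optimizer.init(params)

@jax.jit
def step(params, opt_state, x):
    # Gradients are still computed for the GP subtree, but set_to_zero()
    # discards them, so the GP parameters never change.
    grads = jax.grad(loss_fn)(params, x)
    updates, opt_state = optimizer.update(grads, opt_state, params)
    return optax.apply_updates(params, updates), opt_state

params, opt_state = step(params, opt_state, jnp.ones(3))
```

Applying jax.lax.stop_gradient to the GP subtree inside the loss is another common option, and GPJax itself may expose a way to mark parameters as non-trainable depending on the version, so it is worth checking the docs for the release you are on.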