Need help with understanding gradient computation implementation in backward call in _inv_quad_logdet.py #2104
-
I want to implement a custom `LazyTensor` class with […]. However, the current implementation of […]. Thanks in advance!
Replies: 1 comment, 3 replies
-
This is correct. The `_quad_form_derivative` method (and the whole `LazyTensor` abstraction) is designed to be as abstract as possible. This is because we want to be able to backpropagate through functions other than the log marginal likelihood of the GP (for example, we want to backpropagate through predictions made with lazy t…
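To illustrate the rule that makes this abstraction possible: for a scalar quadratic form f(K) = leftᵀ K right, the derivative with respect to K is the outer product left · rightᵀ. A quad-form-derivative method only needs this local rule; the chain rule then lets any downstream function (log marginal likelihood, predictions, etc.) backpropagate through it. The sketch below is not GPyTorch's actual implementation, just a minimal PyTorch check of the identity:

```python
import torch

# Sketch of the quad-form derivative identity underlying methods like
# `_quad_form_derivative` (hypothetical standalone check, not GPyTorch code).
# For f(K) = left^T K right, the gradient w.r.t. K is left @ right^T.

torch.manual_seed(0)
n = 4
K = torch.randn(n, n, requires_grad=True)
left = torch.randn(n, 1)
right = torch.randn(n, 1)

# Scalar quadratic form f(K) = left^T K right
f = (left.t() @ K @ right).squeeze()
f.backward()

# Analytic derivative: outer product of the left and right vectors
analytic = left @ right.t()

print(torch.allclose(K.grad, analytic))  # True
```

Because this rule is purely local to K, a custom `LazyTensor` only has to say how a quadratic form differentiates through its own representation; everything built on top of it gets gradients for free via autograd's chain rule.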