Automatic differentiation of composition of variables #2012

Answered by LordPontiag
ululi1970 asked this question in Q&A

Consider this snippet of code:

import deepxde as dde

def pde(x, y):
    u_x = dde.grad.jacobian(y, x, i=0, j=0)  # du/dx for the first output component
    a = y[:, 1:2]                            # second output component of the network
    C = a * u_x                              # composed quantity C = a * u_x
    C_x = dde.grad.jacobian(C, x, j=0)       # d/dx of the composed tensor C
    # do something with C

Will C_x evaluate to a_x * u_x + a * u_xx, or do I need to compute the Jacobian of a and the Jacobian and Hessian of u, and apply the chain rule explicitly? Thank you for your answers.

C_x will evaluate to a_x * u_x + a * u_xx (or the equivalent in higher dimensions), because DeepXDE uses automatic differentiation (via the backend, e.g. TensorFlow) to apply the chain rule and product rule implicitly when computing the Jacobian of the composed tensor C. You do not need to manually apply the chain rule or …
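To make that concrete, here is a minimal sketch that computes C_x both ways and returns their difference; the function name pde_check and the returned residual are illustrative (not from the original discussion), and it assumes a two-output network with u = y[:, 0:1] and a = y[:, 1:2]:

import deepxde as dde

def pde_check(x, y):
    # y has two output components: u = y[:, 0:1] and a = y[:, 1:2]
    u_x = dde.grad.jacobian(y, x, i=0, j=0)
    a = y[:, 1:2]
    C = a * u_x

    # Differentiate the composed tensor directly; autodiff applies the
    # product rule through C = a * u_x automatically.
    C_x = dde.grad.jacobian(C, x, j=0)

    # Assemble the same derivative by hand for comparison.
    a_x = dde.grad.jacobian(y, x, i=1, j=0)
    u_xx = dde.grad.hessian(y, x, component=0, i=0, j=0)
    C_x_manual = a_x * u_x + a * u_xx

    # This residual should be (numerically) zero if the two forms agree.
    return C_x - C_x_manual

Either form can be used in a PDE residual; differentiating the composed tensor directly is usually shorter and avoids writing out the product-rule terms by hand.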

Replies: 1 comment

Answer selected by ululi1970