Description
I would like to include the model's gradient in the loss to enforce monotonicity for some features during training. However, when I train the model, I get `Can't differentiate foreigncall expression`.
Here is my minimal example:
```julia
using Flux

actual(x) = 4x + 2
x_train, x_test = hcat(0:5...), hcat(6:10...)
y_train, y_test = actual.(x_train), actual.(x_test)
predict = Dense(1, 1)
opt = Descent()
data = [(x_train, y_train)]
parameters = params(predict)

function custom_loss(x, y)
    fitloss = Flux.Losses.mse(predict(x), y)
    derivativeloss = gradient(x -> sum(predict(x)), x)[1][1]
    return fitloss + derivativeloss
end

custom_loss(x_train, y_train) # works
train!(custom_loss, parameters, data, opt) # Can't differentiate foreigncall expression
```
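A possible workaround, sketched under the assumption that the error comes from nesting a Zygote `gradient` call inside the loss being differentiated: approximate the input derivative with a finite difference instead, which Zygote can differentiate through without nested AD. The step size `ε` is an assumption introduced here, not part of the original example.

```julia
using Flux

actual(x) = 4x + 2
x_train = hcat(0:5...)
y_train = actual.(x_train)
predict = Dense(1, 1)
opt = Descent()
data = [(x_train, y_train)]
parameters = Flux.params(predict)

const ε = 1e-4  # assumed finite-difference step size

function custom_loss(x, y)
    fitloss = Flux.Losses.mse(predict(x), y)
    # Forward difference approximates d(predict)/dx without calling
    # `gradient` inside the loss, so no nested AD is needed.
    derivativeloss = sum((predict(x .+ ε) .- predict(x)) ./ ε)
    return fitloss + derivativeloss
end

Flux.train!(custom_loss, parameters, data, opt)
```

This sidesteps nested differentiation entirely; it is an approximation of the original penalty, not a fix for the underlying Zygote limitation.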