
Model's gradient into loss function #1753

@ghost

Description

I would like to include the model's gradient with respect to its input in the loss function, to force monotonicity for some features during training. The loss evaluates fine on its own, but when I train the model I get `Can't differentiate foreigncall expression`. Here is my minimal example:

using Flux
actual(x) = 4x+2

x_train, x_test = hcat(0:5...), hcat(6:10...)
y_train, y_test = actual.(x_train), actual.(x_test)

predict = Dense(1, 1)

opt = Descent()

data = [(x_train, y_train)]

parameters = params(predict)

function custom_loss(x, y)
    fitloss = Flux.Losses.mse(predict(x), y)
    derivativeloss = gradient(x -> sum(predict(x)), x)[1][1]
    return fitloss + derivativeloss
end

custom_loss(x_train, y_train)  # works

train!(custom_loss, parameters, data, opt)  # ERROR: Can't differentiate foreigncall expression
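The error comes from nesting a reverse-mode `gradient` call inside the loss that `train!` itself differentiates: Zygote hits a call it cannot trace (`foreigncall`) while differentiating its own gradient machinery. One way to sidestep the nesting entirely (a sketch, not a confirmed fix for this issue) is to approximate the input derivative with a finite difference, which is plain arithmetic that Zygote can differentiate with respect to the parameters:

```julia
using Flux

actual(x) = 4x + 2
x_train = hcat(0:5...)
y_train = actual.(x_train)

predict = Dense(1, 1)
opt = Descent()
data = [(x_train, y_train)]
parameters = params(predict)

# Finite-difference approximation of the derivative of sum(predict(x))
# with respect to x. There is no nested gradient call here, so Zygote
# only has to differentiate ordinary array arithmetic.
const ε = 1e-4
input_derivative(x) = (sum(predict(x .+ ε)) - sum(predict(x))) / ε

function custom_loss(x, y)
    fitloss = Flux.Losses.mse(predict(x), y)
    derivativeloss = input_derivative(x)
    return fitloss + derivativeloss
end

Flux.train!(custom_loss, parameters, data, opt)
```

For an actual monotonicity penalty you would presumably penalize only negative derivatives, e.g. `relu(-input_derivative(x))`, rather than adding the raw derivative; the sketch keeps the original loss shape from the example above.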
