Adjust docs & Flux.@functor for Functors.jl v0.5, plus misc. depwarns #2509

Merged · 5 commits · Dec 3, 2024 · changes shown from 3 commits
45 changes: 38 additions & 7 deletions src/deprecations.jl
@@ -83,16 +83,48 @@ function params!(p::Zygote.Params, x, seen = IdSet())
end
end

"""
params(model)

Returns a `Zygote.Params` object containing all parameter arrays from the model.
This is deprecated!

This function was the cornerstone of how Flux used Zygote's implicit mode gradients,
but since Flux 0.13 we use explicit mode `gradient(m -> loss(m, x, y), model)` instead.

To collect all the parameter arrays for other purposes, use `Flux.trainables(model)`.
"""
function params(m...)
Base.depwarn("""
  Flux.params(m...) is deprecated. Use `Flux.trainables(model)` for parameter collection
  and the explicit `gradient(m -> loss(m, x, y), model)` for gradient computation.
  """, :params)
@warn """`Flux.params(m...)` is deprecated. Use `Flux.trainables(model)` for parameter collection,
and the explicit `gradient(m -> loss(m, x, y), model)` for gradient computation.""" maxlog=1
Member Author:
Base.depwarn is silent except in tests. IMO that's ideal if you are marking something deprecated before a breaking change, when the replacement is available.

However, my impression is that we really want people to change this, not to silently live with old code during 0.15. So I'd like something to be printed in interactive use.

Maybe the same goes for more of the depwarns in this file?

ps = Params()
params!(ps, m)
return ps
end
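For readers migrating, a minimal sketch of the old implicit style next to its explicit replacement (assumes Flux.jl v0.15, with `trainables` re-exported from Optimisers.jl; the model, data, and loss below are made up for illustration):

```julia
# Migration sketch: implicit Zygote.Params vs. explicit gradients.
# Assumes Flux.jl v0.15; the model, data, and loss are illustrative.
using Flux

model = Dense(2 => 1)
x, y = rand(Float32, 2, 8), rand(Float32, 1, 8)
loss(m, x, y) = Flux.mse(m(x), y)

# Old (deprecated) implicit style:
#   ps = Flux.params(model)
#   gs = gradient(() -> loss(model, x, y), ps)

# New explicit style: differentiate with respect to the model itself.
grads = Flux.gradient(m -> loss(m, x, y), model)

# To collect all trainable parameter arrays for other purposes:
arrays = Flux.trainables(model)   # here: the weight matrix and the bias vector
```

The explicit form returns a gradient structured like the model itself, rather than a dictionary keyed by array identity, which is what makes the implicit caching workaround below unnecessary in new code.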


"""
@functor MyLayer

Flux used to require the use of `Functors.@functor` to mark any new layer-like struct.
This allowed it to explore inside the struct, and update any trainable parameters within.
Flux@0.15 removes this requirement. This is because Functors@0.5 changed its behaviour
to be opt-out instead of opt-in: arbitrary structs are now explored without special marking.
Hence calling `@functor` is no longer required.

Calling `Flux.@layer MyLayer` is, however, still recommended. This adds various convenience methods
for your layer type, such as pretty printing, and use with Adapt.jl.
"""
macro functor(ex)
@warn """The use of `Flux.@functor` is deprecated.
Most likely, you should write `Flux.@layer MyLayer` which will add various convenience methods for your type,
such as pretty-printing, and use with Adapt.jl.
However, this is not required. Flux.jl v0.15 uses Functors.jl v0.5, which makes exploration of most nested `struct`s
opt-out instead of opt-in... so Flux will automatically see inside any custom struct definitions.
""" maxlog=1
_layer_macro(ex)
end
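A short sketch of the recommended replacement, assuming Flux.jl v0.15 (`MyLayer` is a hypothetical struct, not part of Flux):

```julia
# Sketch of the recommended replacement for `@functor` (assumes Flux.jl v0.15).
# `MyLayer` is a hypothetical custom layer, not part of Flux.
using Flux

struct MyLayer
    weight::Matrix{Float32}
end
MyLayer(n::Integer) = MyLayer(randn(Float32, n, n))
(l::MyLayer)(x) = l.weight * x

# Previously required, now deprecated:
#   Flux.@functor MyLayer
# Recommended instead, adding pretty-printing, Adapt.jl support, etc.:
Flux.@layer MyLayer
```

Even without the `@layer` call, Functors@0.5's opt-out recursion means Flux would still find `weight` inside `MyLayer`.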

# Allows caching of the parameters when params is called within gradient() to fix #2040.
# @non_differentiable params(m...) # https://github.com/FluxML/Flux.jl/pull/2054
# That speeds up implicit use, and silently breaks explicit use.
@@ -101,6 +101,8 @@ Zygote._pullback(::Zygote.Context{true}, ::typeof(params), m...) = params(m), _

include("optimise/Optimise.jl") ## deprecated Module

Base.@deprecate_binding Optimiser OptimiserChain
Base.@deprecate_binding ClipValue ClipGrad
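The renamed bindings in use, as a hedged sketch (assumes Flux.jl v0.15, which re-exports these rules from Optimisers.jl):

```julia
# Sketch of the renamed optimiser bindings (assumes Flux.jl v0.15,
# which re-exports these rules from Optimisers.jl).
using Flux

# Old names, now deprecation shims:
#   opt = Flux.Optimiser(Flux.ClipValue(1f0), Adam(1f-3))
# New names: clip each gradient element to [-1, 1], then apply Adam.
opt = OptimiserChain(ClipGrad(1f0), Adam(1f-3))
```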

# TODO this friendly error should go in Optimisers.jl.
# remove after https://github.com/FluxML/Optimisers.jl/pull/181
@@ -119,9 +119,6 @@ end
### v0.16 deprecations ####################


# Enable these when 0.16 is released, and delete const ClipGrad = Optimise.ClipValue etc:
# Base.@deprecate_binding Optimiser OptimiserChain
# Base.@deprecate_binding ClipValue ClipGrad
Member Author, on lines -122 to -124:
These const definitions have already been deleted:

julia> Flux.ClipValue
ERROR: UndefVarError: `ClipValue` not defined
Stacktrace:
 [1] getproperty(x::Module, f::Symbol)
   @ Base ./Base.jl:31
 [2] top-level scope
   @ REPL[6]:1

julia> Flux.Optimiser
ERROR: UndefVarError: `Optimiser` not defined
Stacktrace:
 [1] getproperty(x::Module, f::Symbol)
   @ Base ./Base.jl:31
 [2] top-level scope
   @ REPL[7]:1


# train!(loss::Function, ps::Zygote.Params, data, opt) = throw(ArgumentError(
# """On Flux 0.16, `train!` no longer accepts implicit `Zygote.Params`.