Commit 7bc6dba

Filter out the warnings

1 parent e4220e5 commit 7bc6dba
3 files changed: 35 additions, 6 deletions

docs/Project.toml (2 additions, 0 deletions)

```diff
@@ -3,6 +3,7 @@ DeepEquilibriumNetworks = "6748aba7-0e9b-415e-a410-ae3cc0ecb334"
 Documenter = "e30172f5-a6a5-5a46-863b-614d45cd2de4"
 DocumenterCitations = "daee34ce-89f3-4625-b898-19384cb65244"
 LinearSolve = "7ed4a6bd-45f5-4d41-b270-4a48e9bafcae"
+LoggingExtras = "e6f89c97-d47a-5376-807f-9c37f3926c36"
 Lux = "b2108857-7c20-44ae-9111-449ecde12c47"
 LuxCUDA = "d0bbae9a-e099-4d5b-a835-1c6931763bda"
 MLDataUtils = "cc2ba9b6-d476-5e6d-8eaf-a92d5412d41d"
@@ -20,6 +21,7 @@ DeepEquilibriumNetworks = "2"
 Documenter = "1"
 DocumenterCitations = "1"
 LinearSolve = "2"
+LoggingExtras = "1"
 Lux = "0.5"
 LuxCUDA = "0.3"
 MLDataUtils = "0.5"
```

docs/src/tutorials/basic_mnist_deq.md (19 additions, 3 deletions)

````diff
@@ -4,7 +4,7 @@ We will train a simple Deep Equilibrium Model on MNIST. First we load a few pack
 
 ```@example basic_mnist_deq
 using DeepEquilibriumNetworks, SciMLSensitivity, Lux, NonlinearSolve, OrdinaryDiffEq,
-    Statistics, Random, Optimisers, LuxCUDA, Zygote, LinearSolve
+    Statistics, Random, Optimisers, LuxCUDA, Zygote, LinearSolve, LoggingExtras
 using MLDatasets: MNIST
 using MLDataUtils: LabelEnc, convertlabel, stratifiedobs, batchview
 
@@ -20,6 +20,18 @@ const cdev = cpu_device()
 const gdev = gpu_device()
 ```
 
+SciMLBase now emits a warning (instead of a depwarn) that pollutes the output. We can
+suppress it with the following logger:
+
+```@example basic_mnist_deq
+function remove_syms_warning(log_args)
+    return log_args.message !=
+           "The use of keyword arguments `syms`, `paramsyms` and `indepsym` for `SciMLFunction`s is deprecated. Pass `sys = SymbolCache(syms, paramsyms, indepsym)` instead."
+end
+
+filtered_logger = ActiveFilteredLogger(remove_syms_warning, global_logger())
+```
+
 We can now construct our dataloader.
 
 ```@example basic_mnist_deq
@@ -175,15 +187,19 @@ and end up using solvers like `Broyden`, but we can simply slap in any of the fa
 from NonlinearSolve.jl. Here we will use Newton-Krylov Method:
 
 ```@example basic_mnist_deq
-train_model(NewtonRaphson(; linsolve=KrylovJL_GMRES()), :regdeq)
+with_logger(filtered_logger) do
+    train_model(NewtonRaphson(; linsolve=KrylovJL_GMRES()), :regdeq)
+end
 nothing # hide
 ```
 
 We can also train a continuous DEQ by passing in an ODE solver. Here we will use `VCAB3()`,
 which tends to be quite fast for continuous Neural Network problems.
 
 ```@example basic_mnist_deq
-train_model(VCAB3(), :deq)
+with_logger(filtered_logger) do
+    train_model(VCAB3(), :deq)
+end
 nothing # hide
 ```
````

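For reference, the logger filter added in this file can be exercised on its own. A minimal sketch, with a shortened hypothetical message text standing in for the real SciMLBase warning string:

```julia
using Logging, LoggingExtras

# Drop records whose message matches the noisy text; let everything else through.
drop_noise(log_args) = log_args.message != "noisy deprecation warning"

# ActiveFilteredLogger consults the predicate for every log record before
# forwarding it to the wrapped logger (here, the current global logger).
filtered_logger = ActiveFilteredLogger(drop_noise, global_logger())

with_logger(filtered_logger) do
    @warn "noisy deprecation warning"  # filtered out
    @info "training progress"          # still printed
end
```

`with_logger` scopes the filter to the `do` block, which is why the tutorials wrap each `train_model` call rather than replacing the logger for the whole session.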
docs/src/tutorials/reduced_dim_deq.md (14 additions, 3 deletions)

````diff
@@ -6,7 +6,7 @@ same MNIST example as before, but this time we will use a reduced state size.
 
 ```@example reduced_dim_mnist
 using DeepEquilibriumNetworks, SciMLSensitivity, Lux, NonlinearSolve, OrdinaryDiffEq,
-    Statistics, Random, Optimisers, LuxCUDA, Zygote, LinearSolve
+    Statistics, Random, Optimisers, LuxCUDA, Zygote, LinearSolve, LoggingExtras
 using MLDatasets: MNIST
 using MLDataUtils: LabelEnc, convertlabel, stratifiedobs, batchview
 
@@ -16,6 +16,13 @@ ENV["DATADEPS_ALWAYS_ACCEPT"] = true
 const cdev = cpu_device()
 const gdev = gpu_device()
 
+function remove_syms_warning(log_args)
+    return log_args.message !=
+           "The use of keyword arguments `syms`, `paramsyms` and `indepsym` for `SciMLFunction`s is deprecated. Pass `sys = SymbolCache(syms, paramsyms, indepsym)` instead."
+end
+
+filtered_logger = ActiveFilteredLogger(remove_syms_warning, global_logger())
+
 function onehot(labels_raw)
     return convertlabel(LabelEnc.OneOfK, labels_raw, LabelEnc.NativeLabels(collect(0:9)))
 end
@@ -168,11 +175,15 @@ Now we can train our model. We can't use `:regdeq` here currently, but we will s
 in the future.
 
 ```@example reduced_dim_mnist
-train_model(NewtonRaphson(; linsolve=KrylovJL_GMRES()), :skipdeq)
+with_logger(filtered_logger) do
+    train_model(NewtonRaphson(; linsolve=KrylovJL_GMRES()), :skipdeq)
+end
 nothing # hide
 ```
 
 ```@example reduced_dim_mnist
-train_model(NewtonRaphson(; linsolve=KrylovJL_GMRES()), :deq)
+with_logger(filtered_logger) do
+    train_model(NewtonRaphson(; linsolve=KrylovJL_GMRES()), :deq)
+end
 nothing # hide
 ```
````

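As a usage note (not part of this commit): instead of wrapping each call in `with_logger`, the same filtered logger could be installed for the whole session via `global_logger`. A hedged sketch using the standard Logging API, with a hypothetical predicate that drops any record mentioning "deprecated":

```julia
using Logging, LoggingExtras

# Hypothetical predicate: drop any record whose message mentions "deprecated".
drop_deprecations(log_args) = !occursin("deprecated", string(log_args.message))

# `global_logger(new)` installs `new` and returns the previous logger,
# so the original can be restored afterwards.
prev = global_logger(ActiveFilteredLogger(drop_deprecations, global_logger()))

@warn "this keyword is deprecated"  # suppressed for the whole session
@info "still visible"

global_logger(prev)  # restore the original logger
```

The tutorials deliberately use the scoped `with_logger` form instead, so the filter only applies around the noisy `train_model` calls.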