
Commit 49baf61

Merge branch 'main' into improve-directory-setup

2 parents fe4dfa7 + 42aa39a

7 files changed, +15 -13 lines changed

Project.toml

Lines changed: 1 addition & 1 deletion

@@ -11,7 +11,7 @@ RuntimeGeneratedFunctions = "7e49a35a-f44a-4d26-94aa-eba1b4ca6b47"
 Symbolics = "0c5d862f-8b57-4792-8d23-62f2024744c7"
 
 [compat]
-AbstractNeuralNetworks = "0.3, 0.4"
+AbstractNeuralNetworks = "0.3, 0.4, 0.5"
 Documenter = "1.8.0"
 ForwardDiff = "0.10.38"
 GeometricMachineLearning = "0.3.7"
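The widened `[compat]` entry admits AbstractNeuralNetworks 0.5 alongside 0.3 and 0.4. As a side note, the same bound can also be set programmatically; a minimal sketch, assuming Julia 1.8 or later where `Pkg.compat` is available:

```julia
using Pkg

# Record the widened compat bound in the active project's [compat] section.
Pkg.compat("AbstractNeuralNetworks", "0.3, 0.4, 0.5")
```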

docs/src/hamiltonian_neural_network.md

Lines changed: 2 additions & 2 deletions

@@ -5,7 +5,7 @@ Here we build a Hamiltonian neural network as a symbolic neural network.
 ```julia hnn
 using SymbolicNeuralNetworks
 using GeometricMachineLearning
-using AbstractNeuralNetworks: Dense, initialparameters, UnknownArchitecture, Model
+using AbstractNeuralNetworks: Dense, UnknownArchitecture, Model
 using LinearAlgebra: norm
 using ChainRulesCore
 using KernelAbstractions
@@ -45,7 +45,7 @@ nothing # hide
 We can now train the network:
 
 ```julia hnn
-ps = NeuralNetworkParameters(initialparameters(c, T))
+ps = NeuralNetwork(c, T).params
 dl = DataLoader(z_data, hvf_analytic(z_data))
 o = Optimizer(AdamOptimizer(.01), ps)
 batch = Batch(200)
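Both hunks implement the same migration: `initialparameters` is no longer imported, and parameters come from a constructed `NeuralNetwork` instead. A minimal sketch of the old and new pattern, with stand-ins for the chain `c` and number type `T` defined earlier in that document:

```julia
using AbstractNeuralNetworks: Chain, Dense, NeuralNetwork

c = Chain(Dense(2, 1, tanh))   # stand-in for the chain built earlier in the document
T = Float64                    # stand-in for the number type

# Old pattern (removed in this commit):
# ps = NeuralNetworkParameters(initialparameters(c, T))

# New pattern: construct the network, then read off its params field.
ps = NeuralNetwork(c, T).params
```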

docs/src/symbolic_neural_networks.md

Lines changed: 1 addition & 1 deletion

@@ -6,7 +6,7 @@ We first call the symbolic neural network that only consists of one layer:
 
 ```@example snn
 using SymbolicNeuralNetworks
-using AbstractNeuralNetworks: Chain, Dense, initialparameters, params
+using AbstractNeuralNetworks: Chain, Dense, params
 
 input_dim = 2
 output_dim = 1
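With `initialparameters` dropped from the import list, the example's setup reduces to the remaining names. A minimal self-contained sketch, with the layer sizes taken from the diff and the activation assumed:

```julia
using SymbolicNeuralNetworks
using AbstractNeuralNetworks: Chain, Dense, params

input_dim = 2
output_dim = 1
c = Chain(Dense(input_dim, output_dim, tanh))  # tanh activation is an assumption
nn = SymbolicNeuralNetwork(c)                  # one-layer symbolic network
```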

src/build_function/build_function_arrays.jl

Lines changed: 1 addition & 1 deletion

@@ -83,7 +83,7 @@ Return an executable function for each entry in `eqs`. This still has to be proc
 
 ```jldoctest
 using SymbolicNeuralNetworks: function_valued_parameters, SymbolicNeuralNetwork
-using AbstractNeuralNetworks: Chain, Dense, initialparameters, NeuralNetworkParameters, params
+using AbstractNeuralNetworks: Chain, Dense, NeuralNetworkParameters, params
 import Random
 Random.seed!(123)
 
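For orientation, the docstring above says `function_valued_parameters` returns an executable function for each entry in `eqs`. A hedged sketch of the surrounding setup; the call itself is left commented because its exact signature is not shown in this hunk:

```julia
using SymbolicNeuralNetworks: function_valued_parameters, SymbolicNeuralNetwork
using AbstractNeuralNetworks: Chain, Dense
import Random
Random.seed!(123)

c = Chain(Dense(2, 1, tanh))
nn = SymbolicNeuralNetwork(c)
# Assumed usage, per the docstring: map each symbolic equation to a callable.
# fs = function_valued_parameters(eqs, nn)  # hypothetical call; `eqs` not shown here
```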

src/derivatives/jacobian.jl

Lines changed: 2 additions & 2 deletions

@@ -45,7 +45,7 @@ We can use `Jacobian` together with [`build_nn_function`](@ref):
 ```jldoctest
 using SymbolicNeuralNetworks
 using SymbolicNeuralNetworks: Jacobian, derivative
-using AbstractNeuralNetworks: Dense, Chain, initialparameters
+using AbstractNeuralNetworks: Dense, Chain, NeuralNetwork
 using Symbolics
 import Random
 
@@ -59,7 +59,7 @@ nn = SymbolicNeuralNetwork(c)
 □ = SymbolicNeuralNetworks.Jacobian(nn)
 # here we need to access the derivative and convert it into a function
 jacobian1 = build_nn_function(derivative(□), nn)
-ps = initialparameters(c, Float64)
+ps = NeuralNetwork(c, Float64).params
 input = rand(input_dim)
 # derivative
 Dtanh(x::Real) = 4 * exp(2 * x) / (1 + exp(2x)) ^ 2
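The doctest's `Dtanh` writes the derivative of `tanh` via exponentials; it is the same function as the more familiar `1 - tanh(x)^2`, which a quick check confirms:

```julia
# d/dx tanh(x), written with exponentials as in the doctest above
Dtanh(x::Real) = 4 * exp(2x) / (1 + exp(2x))^2

x = 0.3
Dtanh(x) ≈ 1 - tanh(x)^2   # true; both expressions are the derivative of tanh
```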

src/derivatives/pullback.jl

Lines changed: 6 additions & 4 deletions

@@ -8,15 +8,17 @@
 ```jldoctest
 using SymbolicNeuralNetworks
 using AbstractNeuralNetworks
+using AbstractNeuralNetworks: params
 import Random
 Random.seed!(123)
 
 c = Chain(Dense(2, 1, tanh))
-nn = SymbolicNeuralNetwork(c)
+nn = NeuralNetwork(c)
+snn = SymbolicNeuralNetwork(nn)
 loss = FeedForwardLoss()
-pb = SymbolicPullback(nn, loss)
-ps = initialparameters(c) |> NeuralNetworkParameters
-pb_values = pb(ps, nn.model, (rand(2), rand(1)))[2](1) |> typeof
+pb = SymbolicPullback(snn, loss)
+ps = params(nn)
+typeof(pb(ps, nn.model, (rand(2), rand(1)))[2](1))
 
 # output
 
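The new doctest separates the numeric network from its symbolic wrapper: `NeuralNetwork(c)` owns the parameters, `SymbolicNeuralNetwork(nn)` is built from it, and `params(nn)` replaces the old `initialparameters` call. The same pattern, restated as a plain script:

```julia
using SymbolicNeuralNetworks
using AbstractNeuralNetworks
using AbstractNeuralNetworks: params

c = Chain(Dense(2, 1, tanh))
nn = NeuralNetwork(c)             # numeric network; holds the parameters
snn = SymbolicNeuralNetwork(nn)   # symbolic wrapper built from the numeric network
pb = SymbolicPullback(snn, FeedForwardLoss())
ps = params(nn)
# evaluate the pullback on one (input, target) pair, as in the doctest
pb(ps, nn.model, (rand(2), rand(1)))
```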

test/derivatives/jacobian.jl

Lines changed: 2 additions & 2 deletions

@@ -1,6 +1,6 @@
 using Test, SymbolicNeuralNetworks
 using SymbolicNeuralNetworks: Jacobian, derivative
-using AbstractNeuralNetworks: Chain, Dense, initialparameters, NeuralNetworkParameters
+using AbstractNeuralNetworks: Chain, Dense, NeuralNetwork
 using LinearAlgebra: norm
 import Symbolics, Random, ForwardDiff
 
@@ -26,7 +26,7 @@ function test_jacobian(n::Integer, T = Float32)
 nn = SymbolicNeuralNetwork(c)
 g = Jacobian(nn)
 
-params = initialparameters(c, T) |> NeuralNetworkParameters
+params = NeuralNetwork(c, T).params
 input = rand(T, n)
 @test build_nn_function(g.output, nn)(input, params) ≈ c(input, params)
 @test build_nn_function(derivative(g), nn)(input, params) ≈ ForwardDiff.jacobian(input -> c(input, params), input)
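The test validates the symbolic Jacobian against ForwardDiff on the same chain. A minimal sketch of just the reference side, with layer sizes chosen here for illustration:

```julia
using ForwardDiff
using AbstractNeuralNetworks: Chain, Dense, NeuralNetwork

c = Chain(Dense(3, 2, tanh))           # illustrative sizes
ps = NeuralNetwork(c, Float64).params
input = rand(3)
# Reference Jacobian of the chain output with respect to its input
J = ForwardDiff.jacobian(x -> c(x, ps), input)
```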
