
Commit 8f1d10e

Merge pull request #44 from OptimalTransportNetworks/development
Development
2 parents fc14b0b + 084febf commit 8f1d10e

6 files changed: +9 / -5 lines changed

NEWS.md

Lines changed: 4 additions & 0 deletions

@@ -1,3 +1,7 @@
+ # 0.1.9
+
+ * The exact dual solution for the Hessian with `cross_good_congestion = true` does not have good numerical properties in some cases. An approximate solution is therefore now used by default, which works better for most problems. Users can set `duality = 2` to use the exact solution in the CGC case.
+
  # 0.1.8

  * Fixed dual solution with `cross_good_congestion = true`, and set the default `duality = true` in `init_parameters()`. It is highly recommended to keep `beta <= 1` in fixed labor cases to harness the dual solutions, which yield a tremendous speedup.
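For illustration, here is a minimal sketch of how the new `duality = 2` option from this release would be selected. It assumes the `init_parameters()` keyword interface documented further below; the `cross_good_congestion` keyword is an assumption, as it is not shown in this diff.

```julia
using OptimalTransportNetworks

# Default since 0.1.9: duality = true uses the approximate dual Hessian in the CGC case
param = init_parameters(cross_good_congestion = true)

# Opt back into the exact dual Hessian (may have poor numerical properties on some problems)
param_exact = init_parameters(cross_good_congestion = true, duality = 2)
```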

Project.toml

Lines changed: 1 addition & 1 deletion

@@ -1,7 +1,7 @@
  name = "OptimalTransportNetworks"
  uuid = "e2b46e68-897f-4e4e-ba36-a93c9789fd96"
  authors = ["Sebastian Krantz <sebastian.krantz@graduateinstitute.ch>"]
- version = "0.1.8"
+ version = "0.1.9"

  [deps]
  Dierckx = "39dd38d3-220a-591b-8e3c-4c3a8c710a94"

README.md

Lines changed: 1 addition & 1 deletion

@@ -31,7 +31,7 @@ This plot shows the optimal network after 200 iterations, keeping population fix

  ## Performance Notes

- * The Julia implementation does not provide hard-coded Gradients, Jacobians, and Hessians as the MATLAB implementation does for some model cases, but relies solely on JuMP's automatic differentiation. One exception is dual solutions which are implemented directly using Ipopt and hard-coded sparse hessians - they are super fast. By default `duality = true`, and if labor is fixed and `beta <= 1`, users will experience very fast dual solves. I also expect non-dual solutions to speed up when [support for detecting nonlinear subexpressions](https://github.com/jump-dev/JuMP.jl/issues/3738) will be added to JuMP.
+ * The Julia implementation for the most part does not provide hard-coded Gradients, Jacobians, and Hessians as the MATLAB implementation does for some model cases, but relies solely on JuMP's automatic differentiation. One exception is dual solutions which are implemented directly using Ipopt and hard-coded sparse Hessians - they are super fast. By default `duality = true`, and if labor is fixed and `beta <= 1`, users will experience very fast dual solves. I also expect non-dual solutions to speed up when [support for detecting nonlinear subexpressions](https://github.com/jump-dev/JuMP.jl/issues/3738) will be added to JuMP.

  * Symbolic autodifferentiation via [MathOptSymbolicAD.jl](https://github.com/lanl-ansi/MathOptSymbolicAD.jl) can also provide significant performance improvements for non-dual cases. The symbolic backend can be activated using:
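The diff context above stops right before the activation snippet in the README. For reference, here is a minimal sketch of switching a JuMP model to the symbolic backend, following MathOptSymbolicAD.jl's documented usage (not necessarily the exact snippet from this README):

```julia
using JuMP, Ipopt
import MathOptSymbolicAD

model = Model(Ipopt.Optimizer)
@variable(model, x >= 1)
@objective(model, Min, x^2 + log(x))  # toy nonlinear objective

# Replace JuMP's default AD with the symbolic differentiation backend
set_attribute(
    model,
    MOI.AutomaticDifferentiationBackend(),
    MathOptSymbolicAD.DefaultBackend(),
)
optimize!(model)
```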

docs/src/index.md

Lines changed: 1 addition & 1 deletion

@@ -26,7 +26,7 @@ This plot shows the optimal network after 200 iterations, keeping population fix

  ## Performance Notes

- * The Julia implementation does not provide hard-coded Gradients, Jacobians, and Hessians as the MATLAB implementation does for some model cases, but relies solely on JuMP's automatic differentiation. One exception is dual solutions which are implemented directly using Ipopt and hard-coded sparse hessians - they are super fast. By default `duality = true`, and if labor is fixed and `beta <= 1`, users will experience very fast dual solves. I also expect non-dual solutions to speed up when [support for detecting nonlinear subexpressions](https://github.com/jump-dev/JuMP.jl/issues/3738) will be added to JuMP.
+ * The Julia implementation for the most part does not provide hard-coded Gradients, Jacobians, and Hessians as the MATLAB implementation does for some model cases, but relies solely on JuMP's automatic differentiation. One exception is dual solutions which are implemented directly using Ipopt and hard-coded sparse Hessians - they are super fast. By default `duality = true`, and if labor is fixed and `beta <= 1`, users will experience very fast dual solves. I also expect non-dual solutions to speed up when [support for detecting nonlinear subexpressions](https://github.com/jump-dev/JuMP.jl/issues/3738) will be added to JuMP.

  * Symbolic autodifferentiation via [MathOptSymbolicAD.jl](https://github.com/lanl-ansi/MathOptSymbolicAD.jl) can also provide significant performance improvements for non-dual cases. The symbolic backend can be activated using:

src/main/init_parameters.jl

Lines changed: 1 addition & 1 deletion

@@ -20,7 +20,7 @@ Returns a `param` dict with the model parameters. These are independent of the g

  - `m::Vector{Float64}=ones(N)`: Vector of weights Nx1 in the cross congestion cost function
  - `annealing::Bool=true`: Switch for the use of annealing at the end of iterations (only if gamma > beta)
  - `verbose::Bool=true`: Switch to turn on/off text output (from Ipopt or other optimizers)
- - `duality::Bool=true`: Switch to turn on/off duality whenever available (fixed labor and beta <= 1)
+ - `duality::Bool=true`: Switch to turn on/off duality whenever available (fixed labor and beta <= 1). Note that if `cross_good_congestion == true`, setting `duality = 2` uses an exact algorithm to compute the Hessian, which, however, has not shown good numerical properties in some cases.
  - `warm_start::Bool=true`: Use the previous solution as a warm start for the next iteration
  - `kappa_min::Float64=1e-5`: Minimum value for road capacities K
  - `min_iter::Int64=20`: Minimum number of iterations

src/models/solve_allocation_by_duality_cgc.jl

Lines changed: 1 addition & 1 deletion

@@ -129,7 +129,7 @@ function hessian_duality_cgc(
  graph = auxdata.graph
  nodes = graph.nodes
  kappa = auxdata.kappa
- inexact_algo = param.duality == 2
+ inexact_algo = param.duality == true # param.duality == 2 for exact algorithm
  beta = param.beta
  m1dbeta = -1 / beta
  sigma = param.sigma
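After this change, the default `duality = true` selects the approximate (inexact) Hessian and `duality = 2` selects the exact one, in line with the NEWS entry above. A small standalone check of the comparison logic (illustrative only, not package code):

```julia
# Julia promotes `true` to 1 in numeric comparisons, so `2 == true` is false.
for duality in (true, 2)
    inexact_algo = duality == true
    println("duality = $duality  ->  inexact_algo = $inexact_algo")
end
# duality = true  ->  inexact_algo = true   (approximate Hessian, the new default)
# duality = 2     ->  inexact_algo = false  (exact Hessian)
```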
