Commit 69b5e8a

Fix some test issues

1 parent 5118245

File tree

5 files changed: +102 −8 lines changed


Project.toml

Lines changed: 1 addition & 1 deletion

```diff
@@ -16,6 +16,6 @@ UnsafeArrays = "c4a57d5a-5b31-53a6-b365-19f8c011fbd6"
 DocStringExtensions = "0.8"
 ForwardDiff = "0.10"
 MathOptInterface = "0.9"
-RobotDynamics = "0.1.3"
+RobotDynamics = "0.1.2"
 StaticArrays = "0.12"
 UnsafeArrays = "1"
```

docs/make.jl

Lines changed: 2 additions & 1 deletion

```diff
@@ -19,7 +19,8 @@ makedocs(
         "API" => [
             "cost_api.md",
             "constraint_api.md",
-            "problem.md"
+            "problem.md",
+            "nlp.md"
         ]
     ]
)
```

docs/src/nlp.md

Lines changed: 73 additions & 0 deletions (new file)
```@meta
CurrentModule = TrajectoryOptimization
```

# Converting to an NLP

```@contents
Pages = ["nlp.md"]
```

Trajectory optimization problems are really just nonlinear programs (NLPs), and a handful of
high-quality NLP solvers exist, such as Ipopt, Snopt, and KNITRO. TrajectoryOptimization
provides an interface with the methods needed to pass a trajectory optimization problem to a
general-purpose NLP solver. In the NLP formulation, the states and controls at every knot
point are concatenated into a single large vector of decision variables, and the cost Hessian
and constraint Jacobians are represented as large, sparse matrices.
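As a concrete illustration of this stacking, the following standalone sketch packs a state/control trajectory into one decision vector. The names `pack`, `X`, `U`, and `Z` are illustrative only, not the package's API:

```julia
# Hypothetical sketch: concatenate states and controls at every knot point
# into a single decision vector Z = [x1; u1; x2; u2; ...; xN].
n, m, N = 4, 2, 11            # state dim, control dim, number of knot points
NN = N*n + (N - 1)*m          # total number of decision variables

function pack(X, U)
    Z = Float64[]
    for k in 1:length(U)
        append!(Z, X[k])      # state at knot point k
        append!(Z, U[k])      # control at knot point k
    end
    append!(Z, X[end])        # terminal state (no terminal control)
    return Z
end

X = [rand(n) for k in 1:N]
U = [rand(m) for k in 1:N-1]
Z = pack(X, U)
length(Z) == NN               # true: 11*4 + 10*2 = 64
```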
## Important Types
Below is the documentation for the types used to represent a trajectory optimization problem
as an NLP:

```@docs
NLPData
NLPConstraintSet
QuadraticViewCost
ViewKnotPoint
TrajData
NLPTraj
```

## The `TrajOptNLP` type
The most important type is the [`TrajOptNLP`](@ref), a single struct that provides all the
methods required to evaluate the trajectory optimization problem as an NLP.

```@docs
TrajOptNLP
```

### Interface
Use the following methods on a `TrajOptNLP` `nlp`. Unless otherwise noted, `Z` is a single
vector of `NN` decision variables (where `NN` is the total number of states and controls across
all knot points).
```@docs
eval_f
grad_f!
hess_f!
hess_f_structure
eval_c!
jac_c!
jacobian_structure
hess_L!
```
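The "structure" methods (`hess_f_structure`, `jacobian_structure`) return a matrix with the same sparsity pattern as the real Hessian or Jacobian, which NLP solvers use to preallocate storage. A minimal standalone sketch of such a structure matrix, using the stdlib `SparseArrays` (hypothetical helper, not the package's code):

```julia
using SparseArrays

# Build a "structure" matrix: same sparsity pattern as the real matrix,
# with each nonzero entry holding its index into the vector of nonzeros.
function structure_matrix(rows::Vector{Int}, cols::Vector{Int}, m::Int, n::Int)
    vals = collect(1:length(rows))    # number the nonzeros 1..nD
    return sparse(rows, cols, vals, m, n)
end

# A 2x3 Jacobian with nonzeros at (1,1), (1,2), and (2,3):
D = structure_matrix([1, 1, 2], [1, 2, 3], 2, 3)
# D[1,1] == 1, D[1,2] == 2, D[2,3] == 3; all other entries are structural zeros
```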

The following methods are useful for retrieving information typically required by an NLP
solver:
```@docs
primal_bounds!
constraint_type
```

## MathOptInterface
A `TrajOptNLP` can be used to set up a `MathOptInterface.AbstractOptimizer` to solve
the trajectory optimization problem. For example, if we want to use Ipopt and have already
set up our `TrajOptNLP`, we can solve it using `build_MOI!(nlp, optimizer)`:

```julia
using Ipopt
using MathOptInterface
nlp = TrajOptNLP(...)  # assume this is already set up
optimizer = Ipopt.Optimizer()
TrajectoryOptimization.build_MOI!(nlp, optimizer)
MathOptInterface.optimize!(optimizer)
```

src/constraint_list.jl

Lines changed: 2 additions & 2 deletions

```diff
@@ -124,7 +124,7 @@ end

 # Iteration
 Base.iterate(cons::ConstraintList) = length(cons) == 0 ? nothing : (cons[1], 1)
-Base.iterate(cons::ConstraintList, i) = i < length(cons) ? (cons[i+1], i+1) : nothing
+Base.iterate(cons::ConstraintList, i::Int) = i < length(cons) ? (cons[i+1], i+1) : nothing
 @inline Base.length(cons::ConstraintList) = length(cons.constraints)
 Base.IteratorSize(::ConstraintList) = Base.HasLength()
 Base.IteratorEltype(::ConstraintList) = Base.HasEltype()
@@ -150,7 +150,7 @@ end
     num_constraints(::TrajOptNLP)

 Return a vector of length `N` constaining the total number of constraint values at each
-knot point.
+knot point.
 """
 @inline num_constraints(cons::ConstraintList) = cons.p
```
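The change above tightens the second `Base.iterate` method to a typed state (`i::Int`), which is the usual way to avoid method ambiguities in Julia's iteration protocol. A minimal standalone sketch of the same two-method pattern on a hypothetical container (illustrative, not the package's code):

```julia
# Minimal implementation of Julia's iteration protocol, mirroring the
# ConstraintList methods in the diff above: one method starts iteration,
# the second advances it with an explicitly typed Int state.
struct MyList
    items::Vector{Symbol}
end
Base.iterate(l::MyList) = isempty(l.items) ? nothing : (l.items[1], 1)
Base.iterate(l::MyList, i::Int) = i < length(l.items) ? (l.items[i+1], i+1) : nothing
Base.length(l::MyList) = length(l.items)

collect(MyList([:a, :b, :c]))  # yields the elements :a, :b, :c in order
```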

src/nlp.jl

Lines changed: 24 additions & 4 deletions

```diff
@@ -1,4 +1,20 @@
 #--- NLPData
+"""
+Holds all the required data structures for evaluating a trajectory optimization problem as
+an NLP. It represents the cost gradient, Hessian, constraints, and constraint Jacobians
+as large, sparse arrays, as applicable.
+
+# Constructors
+    NLPData(G, g, zL, zU, D, d, λ)
+    NLPData(G, g, zL, zU, D, d, λ, v, r, c)
+    NLPData(NN, P, [nD])  # suggested constructor
+
+where `G` and `g` are the cost function Hessian and gradient of size `(NN,NN)` and `(NN,)`,
+`zL` and `zU` are the lower and upper bounds on the `NN` primal variables,
+`D` and `d` are the constraint Jacobian and violation of size `(P,NN)` and `(P,)`, and
+`v`, `r`, `c` are the values, rows, and columns of the non-zero elements of the constraint
+Jacobian, all of length `nD`.
+"""
 mutable struct NLPData{T}
     G::SparseMatrixCSC{T,Int}
     g::Vector{T}
@@ -49,6 +65,8 @@ end
     NLPConstraintSet{T}

 Constraint set that updates views to the NLP constraint vector and Jacobian.
+
+The views can be reset to new arrays using `reset_views!(::NLPConstraintSet, ::NLPData)`.
 """
 struct NLPConstraintSet{T} <: AbstractConstraintSet
     convals::Vector{ConVal}
@@ -375,7 +393,7 @@ mutable struct NLPOpts{T}
 end

 function NLPOpts(;
-        reset_views::Bool = false
+        reset_views::Bool = false
 )
     NLPOpts{Float64}(reset_views)
 end
@@ -493,7 +511,8 @@ end
 """
     grad_f!(nlp::TrajOptNLP, Z, g)

-Evaluate the gradient of the cost function
+Evaluate the gradient of the cost function for the vector of decision variables `Z`, storing
+the result in the vector `g`.
 """
 function grad_f!(nlp::TrajOptNLP, Z=get_primals(nlp), g=nlp.data.g)
     N = num_knotpoints(nlp)
@@ -513,7 +532,8 @@ end
 """
     hess_f!(nlp::TrajOptNLP, Z, G)

-Evaluate the hessian of the cost function `G`.
+Evaluate the hessian of the cost function for the vector of decision variables `Z`,
+storing the result in `G`, a sparse matrix.
 """
 function hess_f!(nlp::TrajOptNLP, Z=get_primals(nlp), G=nlp.data.G)
     N = num_knotpoints(nlp)
@@ -623,7 +643,7 @@ function jac_c!(nlp::TrajOptNLP, Z=get_primals(nlp), C::AbstractArray=nlp.data.D
 end

 """
-    jac_structure(nlp::TrajOptNLP)
+    jacobian_structure(nlp::TrajOptNLP)

 Returns a sparse matrix `D` of the same size as the constraint Jacobian, corresponding to
 the sparsity pattern of the constraint Jacobian. Additionally, `D[i,j]` is either zero or
```
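The `(v, r, c)` triplets mentioned in the `NLPData` docstring are the standard coordinate (COO) representation of a sparse matrix's nonzeros. A minimal standalone sketch of round-tripping between the triplets and a sparse Jacobian, using the stdlib `SparseArrays` (illustrative values, not the package's code):

```julia
using SparseArrays

# Hypothetical 3x3 Jacobian with nD = 4 nonzeros, stored as COO triplets.
r = [1, 1, 2, 3]          # rows of the nonzero entries
c = [1, 2, 2, 3]          # columns of the nonzero entries
v = [4.0, 1.0, 5.0, 9.0]  # values of the nonzero entries

D = sparse(r, c, v, 3, 3)   # assemble the (P, NN) = (3, 3) sparse Jacobian

# Recover the triplets from the sparse matrix (column-major order)
r2, c2, v2 = findnz(D)
```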
