Dakota complains about analytic gradients when specified #143

Answered by jadamstephens
k-ingles asked this question in Q&A

Dakota has an optimization feature called the active set vector. On each evaluation it asks your driver only for the information that the method (optpp_q_newton) currently needs, which may be any combination of the function value, the gradient, and the Hessian of each response.

See here for more information.
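To make the mechanism concrete: Dakota encodes the request for each response as an active set vector (ASV) code, where bit 1 asks for the value, bit 2 for the gradient, and bit 4 for the Hessian. Below is a minimal sketch of how a driver might branch on that code; the objective and gradient functions are placeholder stand-ins, not anything from this discussion.

```python
# Minimal sketch of ASV handling in an analysis driver (illustrative only).
# Dakota's ASV bits: 1 = function value, 2 = gradient, 4 = Hessian.

def objective(x):                 # placeholder model: f(x) = sum(x_i^2)
    return sum(xi * xi for xi in x)

def gradient(x):                  # its analytic gradient: 2*x
    return [2.0 * xi for xi in x]

def evaluate(x, asv_code):
    """Return only the data Dakota requested for this evaluation."""
    result = {}
    if asv_code & 1:              # bit 1 set: value requested
        result["fn"] = objective(x)
    if asv_code & 2:              # bit 2 set: gradient requested
        result["grad"] = gradient(x)
    # bit 4 (Hessian) is omitted here; a quasi-Newton method builds its own
    # Hessian approximation and does not request analytic Hessians.
    return result

if __name__ == "__main__":
    print(evaluate([1.0, 2.0], 3))   # ASV = 3 -> value and gradient
```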

You may want to disable the active set vector using the interface keywords deactivate active_set_vector, which will cause Dakota to request and expect everything you said your driver can return - function values and analytic gradients - for every evaluation.
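For reference, here is a hypothetical fragment of a Dakota input file showing where those keywords would sit; the driver name and the surrounding study setup are placeholders, not taken from this discussion.

```
method
  optpp_q_newton

interface
  fork
    analysis_drivers = 'my_driver'   # hypothetical driver name
  deactivate active_set_vector

responses
  objective_functions = 1
  analytic_gradients
  no_hessians
```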

Answer selected by k-ingles