Commit ccd8043 (parent 3009307)

New global optimization functionality, bugfix for trust region radius when scaling bounds

9 files changed: +175 −18 lines
docs/advanced.rst (4 additions, 2 deletions)

@@ -4,7 +4,7 @@ This section describes different optional user parameters available in Py-BOBYQA

 In the last section (:doc:`userguide`), we introduced :code:`pybobyqa.solve()`, which has the optional input :code:`user_params`. This is a Python dictionary of user parameters. We will now go through the settings which can be changed in this way. More details are available in the paper [CFMR2018]_.

-The default values, used if no override is given, in some cases vary depending on whether :code:`objfun` has stochastic noise; that is, whether evaluating :code:`objfun(x)` several times at the same :code:`x` gives the same result or not. Whether or not this is the case is determined by the :code:`objfun_has_noise` input to :code:`pybobyqa.solve()` (and not by inspecting :code:`objfun`, for instance).
+The default values, used if no override is given, in some cases vary depending on whether :code:`objfun` has stochastic noise; that is, whether evaluating :code:`objfun(x)` several times at the same :code:`x` gives the same result or not. Whether or not this is the case is determined by the :code:`objfun_has_noise` input to :code:`pybobyqa.solve()` (and not by inspecting :code:`objfun`, for instance). Similarly, the default values depend on the input flag :code:`seek_global_minimum`, i.e. whether a global minimum is desired.

 General Algorithm Parameters
 ----------------------------

@@ -59,8 +59,10 @@ Interpolation Management

 Multiple Restarts
 -----------------
-* :code:`restarts.use_restarts` - Whether to do restarts when :math:`\rho_k` reaches :math:`\rho_{end}`, or (optionally) when all points are within noise level of :math:`f(x_k)`. Default is :code:`False` for smooth problems or :code:`True` for noisy problems.
+* :code:`restarts.use_restarts` - Whether to do restarts when :math:`\rho_k` reaches :math:`\rho_{end}`, or (optionally) when all points are within noise level of :math:`f(x_k)`. Default is :code:`False` for smooth problems or :code:`True` for noisy problems or when seeking a global minimum.
 * :code:`restarts.max_unsuccessful_restarts` - Maximum number of consecutive unsuccessful restarts allowed (i.e. restarts which did not reduce the objective further). Default is 10.
+* :code:`restarts.max_unsuccessful_restarts_total` - Maximum total number of unsuccessful restarts allowed. Default is 20 when seeking a global minimum, otherwise it is :code:`maxfun` (i.e. not restricted).
+* :code:`restarts.rhobeg_scale_after_unsuccessful_restart` - Factor to increase :math:`\rho_{beg}` by after unsuccessful restarts. Default is 1.1 when seeking a global minimum, otherwise it is 1.
 * :code:`restarts.rhoend_scale` - Factor to reduce :math:`\rho_{end}` by with each restart. Default is 1.
 * :code:`restarts.use_soft_restarts` - Whether to use soft or hard restarts. Default is :code:`True`.
 * :code:`restarts.soft.num_geom_steps` - For soft restarts, the number of points to move. Default is 3.
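Editor's note: taken together, the two new restart parameters imply geometric growth of :math:`\rho_{beg}` across unsuccessful restarts when seeking a global minimum. A minimal sketch of the arithmetic (illustrative names, not Py-BOBYQA source):

```python
# Illustrative sketch (not Py-BOBYQA source): with the defaults above,
# each unsuccessful restart multiplies rho_beg by 1.1, so by the time
# the default cap of 20 total unsuccessful restarts is reached, the
# starting radius has grown by a factor of 1.1**20 (roughly 6.7x),
# letting later runs explore further from the incumbent minimum.
def rhobeg_after_restarts(rhobeg, n_unsuccessful, scale=1.1):
    """Starting trust region radius after n unsuccessful restarts."""
    return rhobeg * scale ** n_unsuccessful

print(round(rhobeg_after_restarts(1.0, 20), 2))  # -> 6.73
```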

docs/history.rst (6 additions, 0 deletions)

@@ -15,3 +15,9 @@ Version 1.0.2 (20 Jun 2018)
 * Extra optional input :code:`args` which passes through arguments for :code:`objfun` (pull request from `logangrado <https://github.com/logangrado>`_).
 * Bug fixes: default parameters for reduced initialization cost regime, returning correct value from safety steps, retrieving dependencies during installation.

+Version 1.1a0 (13 Jul 2018)
+---------------------------
+* Extra parameters to control the trust region radius over multiple restarts, designed for global optimization.
+* New input flag :code:`seek_global_minimum` to set sensible default parameters for global optimization. New example script to demonstrate this functionality.
+* Bug fix: default trust region radius when scaling variables within bounds.
docs/index.rst (2 additions, 0 deletions)

@@ -25,6 +25,8 @@ Full details of the Py-BOBYQA algorithm are given in our paper: C. Cartis, J. Fi

 If you are interested in solving least-squares minimization problems, you may wish to try `DFO-LS <https://github.com/numericalalgorithmsgroup/dfols>`_, which has the same features as Py-BOBYQA (plus some more), and exploits the least-squares problem structure, so performs better on such problems.

+**New feature: global optimization (July 2018)!** Py-BOBYQA now has a heuristic for global optimization (see :doc:`userguide` for details). As this is a heuristic, there are no guarantees it will find a global minimum, but it is more likely to escape local minima if there are better values nearby.
+
 Py-BOBYQA is released under the GNU General Public License. Please `contact NAG <http://www.nag.com/content/worldwide-contact-information>`_ for alternative licensing.

 .. toctree::

docs/userguide.rst (84 additions, 1 deletion)

@@ -68,19 +68,21 @@ The :code:`solve` function has several optional arguments which the user may pro

     pybobyqa.solve(objfun, x0, args=(), bounds=None, npt=None,
                    rhobeg=None, rhoend=1e-8, maxfun=None, nsamples=None,
                    user_params=None, objfun_has_noise=False,
+                   seek_global_minimum=False,
                    scaling_within_bounds=False)

 These arguments are:

 * :code:`args` - a tuple of extra arguments passed to the objective function.
 * :code:`bounds` - a tuple :code:`(lower, upper)` with the vectors :math:`a` and :math:`b` of lower and upper bounds on :math:`x` (default is :math:`a_i=-10^{20}` and :math:`b_i=10^{20}`). To set bounds for either :code:`lower` or :code:`upper`, but not both, pass a tuple :code:`(lower, None)` or :code:`(None, upper)`.
 * :code:`npt` - the number of interpolation points to use (default is :code:`2*len(x0)+1`). Py-BOBYQA requires :code:`n+1 <= npt <= (n+1)*(n+2)/2` for a problem with :code:`len(x0)=n`. Larger values are particularly useful for noisy problems.
-* :code:`rhobeg` - the initial value of the trust region radius (default is :math:`0.1\max(\|x_0\|_{\infty}, 1)`).
+* :code:`rhobeg` - the initial value of the trust region radius (default is 0.1 if :code:`scaling_within_bounds=True`, otherwise :math:`0.1\max(\|x_0\|_{\infty}, 1)`).
 * :code:`rhoend` - minimum allowed value of trust region radius, which determines when a successful termination occurs (default is :math:`10^{-8}`).
 * :code:`maxfun` - the maximum number of objective evaluations the algorithm may request (default is :math:`\min(100(n+1),1000)`).
 * :code:`nsamples` - a Python function :code:`nsamples(delta, rho, iter, nrestarts)` which returns the number of times to evaluate :code:`objfun` at a given point. This is only applicable for objectives with stochastic noise, when averaging multiple evaluations at the same point produces a more accurate value. The input parameters are the trust region radius (:code:`delta`), the lower bound on the trust region radius (:code:`rho`), how many iterations the algorithm has been running for (:code:`iter`), and how many restarts have been performed (:code:`nrestarts`). Default is no averaging (i.e. :code:`nsamples(delta, rho, iter, nrestarts)=1`).
 * :code:`user_params` - a Python dictionary :code:`{'param1': val1, 'param2':val2, ...}` of optional parameters. A full list of available options is given in the next section :doc:`advanced`.
 * :code:`objfun_has_noise` - a flag to indicate whether or not :code:`objfun` has stochastic noise; i.e. will calling :code:`objfun(x)` multiple times at the same value of :code:`x` give different results? This is used to set some sensible default parameters (including using multiple restarts), all of which can be overridden by the values provided in :code:`user_params`.
+* :code:`seek_global_minimum` - a flag to indicate whether to search for a global minimum, rather than a local minimum. This is used to set some sensible default parameters (including using multiple restarts), all of which can be overridden by the values provided in :code:`user_params`. If :code:`True`, both upper and lower bounds must be set.
 * :code:`scaling_within_bounds` - a flag to indicate whether the algorithm should internally shift and scale the entries of :code:`x` so that the bounds become :math:`0 \leq x \leq 1`. This is useful if you are setting :code:`bounds` and the bounds have different orders of magnitude. If :code:`scaling_within_bounds=True`, the values of :code:`rhobeg` and :code:`rhoend` apply to the *shifted* variables.

 In general when using optimization software, it is good practice to scale your variables so that moving each by a given amount has approximately the same impact on the objective function.
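Editor's note: the shift-and-scale described for :code:`scaling_within_bounds` is an affine map onto the unit box. A standalone sketch (illustrative, not Py-BOBYQA's internal code):

```python
import numpy as np

# Illustrative sketch of the scaling_within_bounds idea (not
# Py-BOBYQA's internal code): map x in [lower, upper] to [0, 1]
# componentwise, and back again.
def to_unit_box(x, lower, upper):
    return (x - lower) / (upper - lower)

def from_unit_box(y, lower, upper):
    return lower + y * (upper - lower)

# Bounds with very different orders of magnitude per variable:
lower = np.array([-30.0, 0.0])
upper = np.array([30.0, 1000.0])
x = np.array([15.0, 250.0])
y = to_unit_box(x, lower, upper)  # y = [0.75, 0.25]
```

In the scaled space a single trust region radius is sensible for every variable, which is why the default :code:`rhobeg` changes to a flat 0.1 when :code:`scaling_within_bounds=True`.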
@@ -305,6 +307,87 @@ This time, we find the true solution, and better estimates of the gradient and H

 ******************************

+Example: Global Optimization
+----------------------------
+The following example shows how to use the global optimization features of Py-BOBYQA. Here, we try to minimize the Freudenstein and Roth function (problem 2 in J.J. Moré, B.S. Garbow and K.E. Hillstrom, Testing Unconstrained Optimization Software, *ACM Trans. Math. Software* 7:1 (1981), 17-41). This function has two local minima, one of which is global.
+
+.. code-block:: python
+
+    # Py-BOBYQA example: globally minimize the Freudenstein and Roth function
+    from __future__ import print_function
+    import numpy as np
+    import pybobyqa
+
+    # Define the objective function
+    # This function has a local minimum f = 48.98
+    # at x = np.array([11.41, -0.8968])
+    # and a global minimum f = 0 at x = np.array([5.0, 4.0])
+    def freudenstein_roth(x):
+        r1 = -13.0 + x[0] + ((5.0 - x[1]) * x[1] - 2.0) * x[1]
+        r2 = -29.0 + x[0] + ((1.0 + x[1]) * x[1] - 14.0) * x[1]
+        return r1 ** 2 + r2 ** 2
+
+    # Define the starting point
+    x0 = np.array([5.0, -20.0])
+
+    # Define bounds (required for global optimization)
+    lower = np.array([-30.0, -30.0])
+    upper = np.array([30.0, 30.0])
+
+    # Set random seed (for reproducibility)
+    np.random.seed(0)
+
+    print("First run - search for local minimum only")
+    print("")
+    soln = pybobyqa.solve(freudenstein_roth, x0, maxfun=500,
+                          bounds=(lower, upper))
+    print(soln)
+
+    print("")
+    print("")
+
+    print("Second run - search for global minimum")
+    print("")
+    soln = pybobyqa.solve(freudenstein_roth, x0, maxfun=500,
+                          bounds=(lower, upper),
+                          seek_global_minimum=True)
+    print(soln)
+
+The output of this is:
+
+.. code-block:: none
+
+    First run - search for local minimum only
+
+    ****** Py-BOBYQA Results ******
+    Solution xmin = [ 11.41277906  -0.89680525]
+    Objective value f(xmin) = 48.98425368
+    Needed 203 objective evaluations (at 203 points)
+    Approximate gradient = [ -1.61348180e-06  -3.61662651e-07]
+    Approximate Hessian = [[ 132.10265455  -45.5426821 ]
+                           [ -45.5426821   976.15808779]]
+    Exit flag = 0
+    Success: rho has reached rhoend
+    ******************************
+
+
+    Second run - search for global minimum
+
+    ****** Py-BOBYQA Results ******
+    Solution xmin = [ 5.  4.]
+    Objective value f(xmin) = 9.734692105e-19
+    Needed 500 objective evaluations (at 500 points)
+    Did a total of 4 runs
+    Approximate gradient = [  4.28964221e-08   4.58344260e-07]
+    Approximate Hessian = [[  4.06992486    61.15006935]
+                           [  61.15006935  3728.06826545]]
+    Exit flag = 1
+    Warning (max evals): Objective has been called MAXFUN times
+    ******************************
+
+As we can see, the :code:`seek_global_minimum` flag helped Py-BOBYQA escape the local minimum from the first run, and find the global minimum.
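Editor's note: the two minima quoted in the example can be checked directly, without the solver (a standalone sketch using the same objective):

```python
# Standalone check of the two minima quoted in the example above for
# the Freudenstein and Roth function.
def freudenstein_roth(x):
    r1 = -13.0 + x[0] + ((5.0 - x[1]) * x[1] - 2.0) * x[1]
    r2 = -29.0 + x[0] + ((1.0 + x[1]) * x[1] - 14.0) * x[1]
    return r1 ** 2 + r2 ** 2

print(freudenstein_roth([5.0, 4.0]))  # -> 0.0 (global minimum)
print(round(freudenstein_roth([11.41277906, -0.89680525]), 2))  # -> 48.98 (local minimum)
```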
 References
 ----------

examples/global_opt_demo.py (35 additions, 0 deletions)

@@ -0,0 +1,35 @@
+# Py-BOBYQA example: globally minimize the Freudenstein and Roth function
+from __future__ import print_function
+import numpy as np
+import pybobyqa
+
+# Define the objective function
+# This function has a local minimum f = 48.98 at x = np.array([11.41, -0.8968])
+# and a global minimum f = 0 at x = np.array([5.0, 4.0])
+def freudenstein_roth(x):
+    r1 = -13.0 + x[0] + ((5.0 - x[1]) * x[1] - 2.0) * x[1]
+    r2 = -29.0 + x[0] + ((1.0 + x[1]) * x[1] - 14.0) * x[1]
+    return r1 ** 2 + r2 ** 2
+
+# Define the starting point
+x0 = np.array([5.0, -20.0])
+
+# Define bounds (required for global optimization)
+lower = np.array([-30.0, -30.0])
+upper = np.array([30.0, 30.0])
+
+# Set random seed (for reproducibility)
+np.random.seed(0)
+
+print("First run - search for local minimum only")
+print("")
+soln = pybobyqa.solve(freudenstein_roth, x0, maxfun=500, bounds=(lower, upper))
+print(soln)
+
+print("")
+print("")
+
+print("Second run - search for global minimum")
+print("")
+soln = pybobyqa.solve(freudenstein_roth, x0, maxfun=500, bounds=(lower, upper), seek_global_minimum=True)
+print(soln)

pybobyqa/controller.py (9 additions, 2 deletions)

@@ -119,6 +119,7 @@ def __init__(self, objfun, x0, args, f0, f0_nsamples, xl, xu, npt, rhobeg, rhoen
         self.last_successful_iter = 0  # when ||d|| >= rho
         # For counting the number of soft restarts
         self.last_successful_run = 0
+        self.total_unsuccessful_restarts = 0
         self.last_run_fopt = f0
         self.scaling_changes = scaling_changes

@@ -465,16 +466,22 @@ def soft_restart(self, number_of_samples, nruns_so_far, params, x_in_abs_coords_
         # A successful run is one where we reduced fopt
         if self.model.fopt() < self.last_run_fopt:
             self.last_successful_run = nruns_so_far
+        else:
+            # Unsuccessful run
+            self.total_unsuccessful_restarts += 1
+            self.rhobeg = self.rhobeg * params("restarts.rhobeg_scale_after_unsuccessful_restart")
         self.last_run_fopt = self.model.fopt()

         ok_to_do_restart = (nruns_so_far - self.last_successful_run < params("restarts.max_unsuccessful_restarts")) and \
-                           (self.nf < self.maxfun)
+                           (self.nf < self.maxfun) and self.total_unsuccessful_restarts < params("restarts.max_unsuccessful_restarts_total")

         if not ok_to_do_restart:
             # last outputs are (exit_flag, exit_str, return_to_new_tr_iteration)
             exit_info = ExitInformation(EXIT_MAXFUN_WARNING, "Objective has been called MAXFUN times")
             if nruns_so_far - self.last_successful_run >= params("restarts.max_unsuccessful_restarts"):
-                exit_info = ExitInformation(EXIT_SUCCESS, "Reached maximum number of unsuccessful restarts")
+                exit_info = ExitInformation(EXIT_SUCCESS, "Reached maximum number of consecutive unsuccessful restarts")
+            elif self.total_unsuccessful_restarts >= params("restarts.max_unsuccessful_restarts_total"):
+                exit_info = ExitInformation(EXIT_SUCCESS, "Reached maximum total number of unsuccessful restarts")
             return exit_info

         # Save current best point and the one the user wanted to save too
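Editor's note: the restart gate above combines three stopping criteria. A simplified standalone sketch (hypothetical names, not the actual controller code):

```python
# Simplified sketch of the restart gate in soft_restart (hypothetical,
# not the actual controller code): a restart is allowed only while
# (1) consecutive unsuccessful restarts stay under their cap,
# (2) the evaluation budget is not exhausted, and
# (3) total unsuccessful restarts stay under their cap.
def ok_to_restart(nruns_so_far, last_successful_run, nf, maxfun,
                  total_unsuccessful, max_consecutive=10, max_total=20):
    return (nruns_so_far - last_successful_run < max_consecutive
            and nf < maxfun
            and total_unsuccessful < max_total)

# Only 5 consecutive failures and budget remaining, but the total cap
# of 20 has been hit, so no further restart is permitted:
print(ok_to_restart(30, 25, 400, 500, 20))  # -> False
```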

pybobyqa/params.py (8 additions, 2 deletions)

@@ -32,7 +32,7 @@


 class ParameterList(object):
-    def __init__(self, n, npt, maxfun, objfun_has_noise=False):
+    def __init__(self, n, npt, maxfun, objfun_has_noise=False, seek_global_minimum=False):
         self.params = {}
         # Rounding error constant (for shifting base)
         self.params["general.rounding_error_constant"] = 0.1  # 0.1 in DFBOLS, 1e-3 in BOBYQA

@@ -71,8 +71,10 @@ def __init__(self, n, npt, maxfun, objfun_has_noise=False):
         self.params["noise.multiplicative_noise_level"] = None
         self.params["noise.additive_noise_level"] = None
         # Restarts
-        self.params["restarts.use_restarts"] = True if objfun_has_noise else False
+        self.params["restarts.use_restarts"] = True if (objfun_has_noise or seek_global_minimum) else False
         self.params["restarts.max_unsuccessful_restarts"] = 10
+        self.params["restarts.max_unsuccessful_restarts_total"] = 20 if seek_global_minimum else maxfun  # i.e. not used (in line with previous behaviour)
+        self.params["restarts.rhobeg_scale_after_unsuccessful_restart"] = 1.1 if seek_global_minimum else 1.0
         self.params["restarts.rhoend_scale"] = 1.0  # how much to decrease rhoend by after each restart
         self.params["restarts.use_soft_restarts"] = True
         self.params["restarts.soft.num_geom_steps"] = 3

@@ -161,6 +163,10 @@ def param_type(self, key, npt):
             type_str, nonetype_ok, lower, upper = 'bool', False, None, None
         elif key == "restarts.max_unsuccessful_restarts":
             type_str, nonetype_ok, lower, upper = 'int', False, 0, None
+        elif key == "restarts.max_unsuccessful_restarts_total":
+            type_str, nonetype_ok, lower, upper = 'int', False, 0, None
+        elif key == "restarts.rhobeg_scale_after_unsuccessful_restart":
+            type_str, nonetype_ok, lower, upper = 'float', False, 0.0, None
         elif key == "restarts.rhoend_scale":
             type_str, nonetype_ok, lower, upper = 'float', False, 0.0, None
         elif key == "restarts.use_soft_restarts":
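Editor's note: the mode-dependent restart defaults set in `ParameterList.__init__` can be summarised as a small pure function. A hypothetical sketch mirroring the values in this diff (names illustrative, not part of pybobyqa's API):

```python
# Hypothetical helper mirroring the restart defaults set above
# (illustrative only, not part of pybobyqa's API).
def restart_defaults(maxfun, objfun_has_noise=False, seek_global_minimum=False):
    return {
        "restarts.use_restarts": objfun_has_noise or seek_global_minimum,
        "restarts.max_unsuccessful_restarts": 10,
        "restarts.max_unsuccessful_restarts_total": 20 if seek_global_minimum else maxfun,
        "restarts.rhobeg_scale_after_unsuccessful_restart": 1.1 if seek_global_minimum else 1.0,
    }

# Global-minimum mode turns restarts on and caps total failures at 20;
# the default mode leaves the total cap at maxfun, i.e. unrestricted.
d = restart_defaults(1000, seek_global_minimum=True)
print(d["restarts.use_restarts"], d["restarts.max_unsuccessful_restarts_total"])  # -> True 20
```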
