
Commit 9384cc0

Merge branch 'pytorch:main' into main
2 parents b5127ca + 4632c36 commit 9384cc0

22 files changed: +612 −3666 lines changed

.github/workflows/nightly.yml

Lines changed: 1 addition & 2 deletions
@@ -75,9 +75,8 @@ jobs:
       contents: write

   run_tutorials:
-    name: Run tutorials without smoke test on latest PyTorch / GPyTorch / Ax
+    name: Run tutorials without smoke test on latest PyTorch / GPyTorch
     uses: ./.github/workflows/reusable_tutorials.yml
     with:
       smoke_test: false
       use_stable_pytorch_gpytorch: false
-      use_stable_ax: false

.github/workflows/publish_website.yml

Lines changed: 0 additions & 3 deletions
@@ -47,11 +47,8 @@ jobs:
       - name: Install dependencies
         env:
           ALLOW_LATEST_GPYTORCH_LINOP: true
-          ALLOW_BOTORCH_LATEST: true  # Allow Ax to install w/ new BoTorch release.
         run: |
           uv pip install ."[dev, tutorials]"
-          # There may not be a compatible Ax uv pip version, so we use the development version.
-          uv pip install git+https://github.com/facebook/Ax.git
       - if: ${{ inputs.new_version }}
         name: Create new docusaurus version
         run: |

.github/workflows/reusable_tutorials.yml

Lines changed: 0 additions & 20 deletions
@@ -9,9 +9,6 @@ on:
       use_stable_pytorch_gpytorch:
         required: true
         type: boolean
-      use_stable_ax:
-        required: true
-        type: boolean
   workflow_call:
     inputs:
       smoke_test:
@@ -22,10 +19,6 @@ on:
         required: false
         type: boolean
         default: false
-      use_stable_ax:
-        required: false
-        type: boolean
-        default: false

 jobs:
   tutorials:
@@ -66,19 +59,6 @@ jobs:
           ALLOW_LATEST_GPYTORCH_LINOP: true
         run: |
           uv pip install .[tutorials]
-      - if: ${{ !inputs.use_stable_ax }}
-        name: Install latest Ax
-        env:
-          # This is so Ax's setup doesn't install a pinned BoTorch version.
-          ALLOW_BOTORCH_LATEST: true
-        run: |
-          uv pip install git+https://github.com/facebook/Ax.git
-      - if: ${{ inputs.use_stable_ax }}
-        name: Install stable Ax
-        env:
-          ALLOW_BOTORCH_LATEST: true
-        run: |
-          uv pip install ax-platform --no-binary ax-platform
       - if: ${{ inputs.smoke_test }}
         name: Run tutorials with smoke test
         run: |

.github/workflows/test_stable.yml

Lines changed: 3 additions & 5 deletions
@@ -52,18 +52,16 @@ jobs:
         run: |
           pytest -ra test_community/ --cov botorch_community/ --cov-report term-missing --cov-report xml:botorch_community_cov.xml

-  run_tutorials_stable_w_latest_ax:
-    name: Run tutorials without smoke test on min req. versions of PyTorch & GPyTorch and latest Ax
+  run_tutorials_stable:
+    name: Run tutorials without smoke test on min req. versions of PyTorch & GPyTorch
     uses: ./.github/workflows/reusable_tutorials.yml
     with:
       smoke_test: false
       use_stable_pytorch_gpytorch: true
-      use_stable_ax: false

   run_tutorials_stable_smoke_test:
-    name: Run tutorials with smoke test on min req. versions of PyTorch & GPyTorch and latest Ax
+    name: Run tutorials with smoke test on min req. versions of PyTorch & GPyTorch
     uses: ./.github/workflows/reusable_tutorials.yml
     with:
       smoke_test: true
       use_stable_pytorch_gpytorch: true
-      use_stable_ax: false

.github/workflows/tutorials_smoke_test.yml

Lines changed: 1 addition & 2 deletions
@@ -10,9 +10,8 @@ on:

 jobs:
   run_tutorials_with_smoke_test:
-    name: Run tutorials with smoke test on latest PyTorch / GPyTorch / Ax
+    name: Run tutorials with smoke test on latest PyTorch / GPyTorch
     uses: ./.github/workflows/reusable_tutorials.yml
     with:
       smoke_test: true
       use_stable_pytorch_gpytorch: false
-      use_stable_ax: false

CHANGELOG.md

Lines changed: 51 additions & 0 deletions
@@ -2,6 +2,57 @@

 The release log for BoTorch.

+## [0.14.0] -- May 6, 2025
+
+#### Highlights
+* Prior Fitted Network (PFN) surrogate model integration (#2784).
+* Variational Bayesian last-layer models as surrogate `Model`s (#2754).
+* Probabilities of feasibility for classifier-based constraints in acquisition functions (#2776).
+
+#### New Features
+* Helper for evaluating feasibility of candidate points (#2733).
+* Check for feasibility in `gen_candidates_scipy` and error out for infeasible candidates (#2737).
+* Return a feasible candidate if there is one and `return_best_only=True` (#2778).
+* Allow for observation noise without a provided `evaluation_mask` in `ModelListGP` (#2735).
+* Implement incremental `qLogNEI` via an `incremental` argument to `qLogNoisyExpectedImprovement` (#2760).
+* Add utility for computing AIC/BIC/MLL from a model (#2785).
+* New test functions:
+  * Multi-fidelity test functions with discrete fidelities (#2796).
+  * Keane bump function (#2802).
+  * Mixed Ackley test function (#2830).
+  * LABS test function (#2832).
+* Add parameter types to test functions to support problems defined in mixed / discrete spaces (#2809).
+* Add input validation to test functions (#2829).
+* Add `[q]LogProbabilityOfFeasibility` acquisition functions (#2815).
+
+#### Bug Fixes
+* Remove hard-coded `dtype` from `best_f` buffers (#2725).
+* Fix `dtype/nan` issue in `StratifiedStandardize` (#2757).
+* Properly handle observed noise in `AdditiveMapSaasSingleTaskGP` with outcome transforms (#2763).
+* Do not count STOPPED (due to specified budget) as a model fitting failure (#2767).
+* Ensure that `initialize_q_batch` always includes the maximum value when called in batch mode (#2773).
+* Fix posterior with observation noise in batched MTGP models (#2782).
+* Detach tensor in `gen_candidates_scipy` to avoid test failure due to new warning (#2797).
+* Fix batch computation in Pivoted Cholesky (#2823).
+
+#### Other Changes
+* Add optimal values for synthetic constrained optimization problems (#2730).
+* Update `max_hv` and reference point for Penicillin problem (#2771).
+* Add optimal value to SpeedReducer problem (#2799).
+* Update `nonlinear_constraint_is_feasible` to return a boolean tensor (#2731).
+* Restructure sampling methods for info-theoretic acquisition functions (#2753).
+* Prune baseline points in `qLogNEI` by default (#2762).
+* Misc updates to MES-based acquisition functions (#2769).
+* Pass option to reset submodules in train method for fully Bayesian models (#2783).
+* Put outcome transforms into train mode in model constructors (#2817).
+* `LogEI`: select `cache_root` based on model support (#2820).
+* Remove Ax dependency from BoTorch tutorials and reference Ax tutorials instead (#2839).
+
+#### Deprecations and removals
+* Remove deprecated `gp_sampling` module (#2768).
+* Remove `qMultiObjectiveMaxValueEntropy` acquisition function (#2800).
+* Remove model converters (#2801).
+
 ## [0.13.0] -- Feb 3, 2025
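The AIC/BIC utility noted in the changelog (#2785) is based on the standard information-criterion formulas. As a plain-Python sketch of the underlying math only (this is not BoTorch's actual API; function names here are illustrative):

```python
import math


def aic(log_likelihood: float, num_params: int) -> float:
    """Akaike information criterion: 2k - 2*logL (lower is better)."""
    return 2 * num_params - 2 * log_likelihood


def bic(log_likelihood: float, num_params: int, num_obs: int) -> float:
    """Bayesian information criterion: k*ln(n) - 2*logL (lower is better)."""
    return num_params * math.log(num_obs) - 2 * log_likelihood


# Example: a model with logL = -42.0, 3 hyperparameters, 20 observations.
print(aic(-42.0, 3))      # 90.0
print(bic(-42.0, 3, 20))  # ~92.99
```

BIC penalizes parameters more heavily than AIC once n > e^2 ≈ 7.4 observations, which is why the two criteria can rank models differently.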

botorch_community/models/utils/prior_fitted_network.py

Lines changed: 1 addition & 5 deletions
@@ -28,16 +28,12 @@ class ModelPaths(Enum):

     pfns4bo_hebo = (
         "https://github.com/automl/PFNs4BO/raw/refs/heads/main/pfns4bo"
-        "/final_models/hebo_morebudget_9_unused_features_3_userpriorperdim2_8.pt.gz"
+        "/final_models/model_hebo_morebudget_9_unused_features_3.pt.gz"
     )
     pfns4bo_bnn = (
         "https://github.com/automl/PFNs4BO/raw/refs/heads/main/pfns4bo"
         "/final_models/model_sampled_warp_simple_mlp_for_hpob_46.pt.gz"
     )
-    pfns4bo_hebo_userprior = (
-        "https://github.com/automl/PFNs4BO/raw/refs/heads/main/pfns4bo"
-        "/final_models/hebo_morebudget_9_unused_features_3_userpriorperdim2_8.pt.gz"
-    )


 def download_model(

docs/botorch_and_ax.md

Lines changed: 2 additions & 2 deletions
@@ -42,8 +42,8 @@ of surrogate model, or a new type of acquisition function, but leave the rest of
 the Bayesian Optimization loop untouched. It is then straightforward to plug
 your custom BoTorch model or acquisition function into Ax to take advantage of
 Ax's various loop control APIs, as well as its powerful automated metadata
-management, data storage, etc. See the
-[Using a custom BoTorch model in Ax](tutorials/custom_botorch_model_in_ax)
+management, data storage, etc. See Ax's
+[Modular BoTorch](https://ax.dev/docs/tutorials/modular_botorch/)
 tutorial for more on how to do this.

docs/models.md

Lines changed: 3 additions & 3 deletions
@@ -168,9 +168,9 @@ configurable model class whose implementation is difficult to understand.

 Instead, we advocate that users implement their own models to cover more
 specialized use cases. The light-weight nature of BoTorch's Model API makes this
-easy to do. See the
-[Using a custom BoTorch model in Ax](tutorials/custom_botorch_model_in_ax)
-tutorial for an example.
+easy to do. See Ax's
+[Modular BoTorch tutorial](https://ax.dev/docs/tutorials/modular_botorch/)
+for an example, and for how to use such a custom model in Ax.

 The BoTorch `Model` interface is light-weight and easy to extend. The only
 requirement for using BoTorch's Monte-Carlo based acquisition functions is that

docs/tutorials/index.mdx

Lines changed: 18 additions & 13 deletions
@@ -4,27 +4,32 @@ title: BoTorch Tutorials
 The tutorials here will help you understand and use BoTorch in
 your own work. They assume that you are familiar with both
 Bayesian optimization (BO) and PyTorch.
-
-If you are new to BO, we recommend you start with the
+* If you are new to BO, we recommend you start with the
 [Ax docs](https://ax.dev/docs/bayesopt) and the
 following
 [tutorial paper](https://arxiv.org/abs/1807.02811).
-
-If you are new to PyTorch, the easiest way to get started is
+* If you are new to PyTorch, the easiest way to get started is
 with the [What is PyTorch?](https://pytorch.org/tutorials/beginner/blitz/tensor_tutorial.html#sphx-glr-beginner-blitz-tensor-tutorial-py)
 tutorial.

-The BoTorch tutorials are grouped into the following four areas.
-

 <h4>Using BoTorch with Ax</h4>
-These tutorials give you an overview of how to leverage
-[Ax](https://ax.dev), a platform for sequential
-experimentation, in order to simplify the management of your BO
-loop. Doing so can help you focus on the main aspects of BO
-(models, acquisition functions, optimization of acquisition
-functions), rather than tedious loop control. See our
-[Documentation](/docs/botorch_and_ax)
+_For practitioners_ who are interested in running experiments
+to optimize various objectives using Bayesian optimization,
+we recommend using [Ax](https://ax.dev) rather than BoTorch.
+Ax provides a user-friendly interface for
+experiment configuration and orchestration, while choosing an
+appropriate Bayesian optimization algorithm to optimize the
+given objective, following BoTorch best practices.
+
+_For researchers_ who are interested in running experiments with
+their custom BoTorch models and acquisition functions,
+Ax's Modular BoTorch Interface offers a convenient
+way to leverage custom BoTorch objects while utilizing
+Ax's experiment configuration and orchestration. Check out the
+[Modular BoTorch tutorial](https://ax.dev/docs/tutorials/modular_botorch/)
+to learn how to use custom BoTorch objects in Ax!
+See [this documentation](/docs/botorch_and_ax)
 for additional information.

 <h4>Full Optimization Loops</h4>

setup.py

Lines changed: 0 additions & 2 deletions
@@ -17,10 +17,8 @@
 TEST_REQUIRES = ["pytest", "pytest-cov", "requests"]
 FMT_REQUIRES = ["flake8", "ufmt", "flake8-docstrings"]
 TUTORIALS_REQUIRES = [
-    "ax-platform",
     "cma",
     "jupyter",
-    "kaleido",
     "matplotlib",
     "memory_profiler",
     "papermill",

tutorials/bo_with_warped_gp/bo_with_warped_gp.ipynb

Lines changed: 1 addition & 1 deletion
@@ -8,7 +8,7 @@
     "\n",
     "In this tutorial, we illustrate how to use learned input warping functions for robust Bayesian Optimization when the outcome may be a non-stationary function. When the lengthscales are non-stationary in the raw input space, learning a warping function that maps raw inputs to a warped space where the lengthscales are stationary can be useful, because standard stationary kernels can then be used to effectively model the function.\n",
     "\n",
-    "In general, for a relatively simple setup (like this one), we recommend using [Ax](https://ax.dev), since this will simplify your setup (including the amount of code you need to write) considerably. See the [Using BoTorch with Ax](/docs/tutorials/custom_botorch_model_in_ax) tutorial. To use input warping with `MODULAR_BOTORCH`, we can pass the `warp_tf`, constructed as below, by adding `input_transform=warp_tf` argument to the `Surrogate(...)` call. \n",
+    "In general, for a relatively simple setup (like this one), we recommend using [Ax](https://ax.dev), since this will simplify your setup (including the amount of code you need to write) considerably. See Ax's [Modular BoTorch tutorial](https://ax.dev/docs/tutorials/modular_botorch/). To use input warping with `MODULAR_BOTORCH`, we can pass `warp_tf`, constructed as below, by adding an `input_transform=warp_tf` argument to the `Surrogate(...)` call. \n",
     "\n",
     "We use a Kumaraswamy CDF as the class of input warping functions and learn the concentration parameters ($a>0$ and $b>0$). Kumaraswamy CDFs are quite flexible and map inputs in [0, 1] to outputs in [0, 1]. This work follows the Beta CDF input warping proposed by Snoek et al., but replaces the Beta distribution with the Kumaraswamy distribution, which has a *differentiable* and closed-form CDF. \n",
     " \n",

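As background for the warped-GP tutorial cell above: the Kumaraswamy CDF it mentions has the closed form K(x; a, b) = 1 - (1 - x^a)^b on [0, 1]. A minimal plain-Python sketch of that formula (independent of the BoTorch input transform the tutorial actually uses):

```python
def kumaraswamy_cdf(x: float, a: float, b: float) -> float:
    """Closed-form Kumaraswamy CDF: K(x; a, b) = 1 - (1 - x**a)**b for x in [0, 1]."""
    if not 0.0 <= x <= 1.0:
        raise ValueError("x must lie in [0, 1]")
    return 1.0 - (1.0 - x**a) ** b


# a = b = 1 recovers the identity warp; other values bend the input space.
print(kumaraswamy_cdf(0.5, 1.0, 1.0))  # 0.5
print(kumaraswamy_cdf(0.5, 2.0, 5.0))  # stretches the lower end of [0, 1]
```

Because this CDF is differentiable in a and b (unlike the Beta CDF, which has no closed form), the concentration parameters can be learned jointly with the GP hyperparameters by gradient descent, which is the point the tutorial makes.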
tutorials/closed_loop_botorch_only/closed_loop_botorch_only.ipynb

Lines changed: 1 addition & 1 deletion
@@ -11,7 +11,7 @@
     "\n",
     "In this tutorial, we illustrate how to implement a simple Bayesian Optimization (BO) closed loop in BoTorch.\n",
     "\n",
-    "In general, we recommend for a relatively simple setup (like this one) to use Ax, since this will simplify your setup (including the amount of code you need to write) considerably. See the [Using BoTorch with Ax](/docs/tutorials/custom_botorch_model_in_ax) tutorial.\n",
+    "In general, for a relatively simple setup (like this one), we recommend using Ax, since this will simplify your setup (including the amount of code you need to write) considerably. See Ax's [Modular BoTorch tutorial](https://ax.dev/docs/tutorials/modular_botorch/).\n",
     "\n",
     "However, you may want to do things that are not easily supported in Ax at this time (like running high-dimensional BO using a VAE+GP model that you jointly train on high-dimensional input data). If you find yourself in such a situation, you will need to write your own optimization loop, as we do in this tutorial.\n",
     "\n",

tutorials/constrained_multi_objective_bo/constrained_multi_objective_bo.ipynb

Lines changed: 1 addition & 1 deletion
@@ -11,7 +11,7 @@
     "\n",
     "In this tutorial, we illustrate how to implement a constrained multi-objective (MO) Bayesian Optimization (BO) closed loop in BoTorch.\n",
     "\n",
-    "In general, we recommend using [Ax](https://ax.dev) for a simple BO setup like this one, since this will simplify your setup (including the amount of code you need to write) considerably. See [here](https://ax.dev/docs/tutorials/multiobjective_optimization.html) for an Ax tutorial on MOBO. If desired, you can use a custom BoTorch model in Ax, following the [Using BoTorch with Ax](/docs/tutorials/custom_botorch_model_in_ax) tutorial. Given a `MultiObjective`, Ax will default to the $q$NEHVI acquisiton function. If desired, this can also be customized by adding `\"botorch_acqf_class\": <desired_botorch_acquisition_function_class>,` to the `model_kwargs`.\n",
+    "In general, we recommend using [Ax](https://ax.dev) for a simple BO setup like this one, since this will simplify your setup (including the amount of code you need to write) considerably. See [here](https://ax.dev/docs/tutorials/multiobjective_optimization.html) for an Ax tutorial on MOBO. If desired, you can use a custom BoTorch model in Ax, following Ax's [Modular BoTorch tutorial](https://ax.dev/docs/tutorials/modular_botorch/). Given a `MultiObjective`, Ax will default to the $q$NEHVI acquisition function. If desired, this can also be customized by adding `\"botorch_acqf_class\": <desired_botorch_acquisition_function_class>,` to the `model_kwargs`.\n",
     "\n",
     "We use the parallel ParEGO ($q$ParEGO) [1] and parallel Noisy Expected Hypervolume Improvement ($q$NEHVI) [2] acquisition functions to optimize a synthetic C2-DTLZ2 test function with $M=2$ objectives, $V=1$ constraint, and $d=4$ parameters. The two objectives are\n",
     "$$f_1(\mathbf x) = (1+ g(\mathbf x_M))\cos\big(\frac{\pi}{2}x_1\big)$$\n",

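For reference, the objective quoted in the tutorial cell above follows the standard DTLZ2 formulas, where g(x_M) = Σ (x_i − 0.5)² is summed over the trailing inputs. A plain-Python sketch of the unconstrained M=2 case (the tutorial's C2-DTLZ2 additionally imposes a constraint and is handled by BoTorch's own test-problem implementation; this is just the math):

```python
import math


def dtlz2_m2(x: list[float]) -> tuple[float, float]:
    """Standard DTLZ2 objectives for M=2 on [0, 1]^d.

    g is summed over the last d-1 inputs; on the Pareto front g = 0,
    so the objectives trace the unit quarter-circle f1^2 + f2^2 = 1.
    """
    g = sum((xi - 0.5) ** 2 for xi in x[1:])
    f1 = (1 + g) * math.cos(math.pi / 2 * x[0])
    f2 = (1 + g) * math.sin(math.pi / 2 * x[0])
    return f1, f2


# With the trailing inputs at 0.5, g = 0 and the point lies on the front.
f1, f2 = dtlz2_m2([0.3, 0.5, 0.5, 0.5])
print(f1**2 + f2**2)  # 1.0 (up to floating point)
```

Moving any trailing input away from 0.5 increases g and pushes the objective vector radially outward, which is what makes the distance to the unit circle a useful convergence diagnostic for this problem.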