
Commit 0171470

Balandat authored and facebook-github-bot committed
Fix broken links pointing to the now-deleted custom_botorch_model_in_ax tutorial (#2840)
Summary: #2839 deleted this tutorial and left behind a number of broken links, including some that broke the website build. This updates other tutorials and the docs to fix this.

Pull Request resolved: #2840
Reviewed By: esantorella
Differential Revision: D74247879
Pulled By: Balandat
fbshipit-source-id: d7f77a984b16eb251306da3e6094bde9c169eaf4
1 parent 1d4e14a commit 0171470

File tree

10 files changed (+13 −13 lines)


docs/botorch_and_ax.md (2 additions & 2 deletions)

@@ -42,8 +42,8 @@ of surrogate model, or a new type of acquisition function, but leave the rest of
 the the Bayesian Optimization loop untouched. It is then straightforward to plug
 your custom BoTorch model or acquisition function into Ax to take advantage of
 Ax's various loop control APIs, as well as its powerful automated metadata
-management, data storage, etc. See the
-[Using a custom BoTorch model in Ax](tutorials/custom_botorch_model_in_ax)
+management, data storage, etc. See Ax's
+[Modular BoTorch tutorial](https://ax.dev/docs/tutorials/modular_botorch/)
 tutorial for more on how to do this.

docs/models.md (3 additions & 3 deletions)

@@ -168,9 +168,9 @@ configurable model class whose implementation is difficult to understand.
 
 Instead, we advocate that users implement their own models to cover more
 specialized use cases. The light-weight nature of BoTorch's Model API makes this
-easy to do. See the
-[Using a custom BoTorch model in Ax](tutorials/custom_botorch_model_in_ax)
-tutorial for an example.
+easy to do. See Ax's
+[Modular BoTorch tutorial](https://ax.dev/docs/tutorials/modular_botorch/)
+tutorial for an example for this and how to use such a custom model in Ax.
 
 The BoTorch `Model` interface is light-weight and easy to extend. The only
 requirement for using BoTorch's Monte-Carlo based acquisition functions is that

tutorials/bo_with_warped_gp/bo_with_warped_gp.ipynb (1 addition & 1 deletion)

@@ -8,7 +8,7 @@
 "\n",
 "In this tutorial, we illustrate how to use learned input warping functions for robust Bayesian Optimization when the outcome may be non-stationary functions. When the lengthscales are non-stationarity in the raw input space, learning a warping function that maps raw inputs to a warped space where the lengthscales are stationary can be useful, because then standard stationary kernels can be used to effectively model the function.\n",
 "\n",
-"In general, for a relatively simple setup (like this one), we recommend using [Ax](https://ax.dev), since this will simplify your setup (including the amount of code you need to write) considerably. See the [Using BoTorch with Ax](/docs/tutorials/custom_botorch_model_in_ax) tutorial. To use input warping with `MODULAR_BOTORCH`, we can pass the `warp_tf`, constructed as below, by adding `input_transform=warp_tf` argument to the `Surrogate(...)` call. \n",
+"In general, for a relatively simple setup (like this one), we recommend using [Ax](https://ax.dev), since this will simplify your setup (including the amount of code you need to write) considerably. See Ax's [Modular BoTorch tutorial](https://ax.dev/docs/tutorials/modular_botorch/) tutorial. To use input warping with `MODULAR_BOTORCH`, we can pass the `warp_tf`, constructed as below, by adding `input_transform=warp_tf` argument to the `Surrogate(...)` call. \n",
 "\n",
 "We consider use a Kumaraswamy CDF as the class of input warping function and learn the concentration parameters ($a>0$ and $b>0$). Kumaraswamy CDFs are quite flexible and map inputs in [0, 1] to outputs in [0, 1]. This work follows the Beta CDF input warping proposed by Snoek et al., but replaces the Beta distribution Kumaraswamy distribution, which has a *differentiable* and closed-form CDF. \n",
 " \n",

tutorials/closed_loop_botorch_only/closed_loop_botorch_only.ipynb (1 addition & 1 deletion)

@@ -11,7 +11,7 @@
 "\n",
 "In this tutorial, we illustrate how to implement a simple Bayesian Optimization (BO) closed loop in BoTorch.\n",
 "\n",
-"In general, we recommend for a relatively simple setup (like this one) to use Ax, since this will simplify your setup (including the amount of code you need to write) considerably. See the [Using BoTorch with Ax](/docs/tutorials/custom_botorch_model_in_ax) tutorial.\n",
+"In general, we recommend for a relatively simple setup (like this one) to use Ax, since this will simplify your setup (including the amount of code you need to write) considerably. See Ax's [Modular BoTorch tutorial](https://ax.dev/docs/tutorials/modular_botorch/) tutorial.\n",
 "\n",
 "However, you may want to do things that are not easily supported in Ax at this time (like running high-dimensional BO using a VAE+GP model that you jointly train on high-dimensional input data). If you find yourself in such a situation, you will need to write your own optimization loop, as we do in this tutorial.\n",
 "\n",

tutorials/constrained_multi_objective_bo/constrained_multi_objective_bo.ipynb (1 addition & 1 deletion)

@@ -11,7 +11,7 @@
 "\n",
 "In this tutorial, we illustrate how to implement a constrained multi-objective (MO) Bayesian Optimization (BO) closed loop in BoTorch.\n",
 "\n",
-"In general, we recommend using [Ax](https://ax.dev) for a simple BO setup like this one, since this will simplify your setup (including the amount of code you need to write) considerably. See [here](https://ax.dev/docs/tutorials/multiobjective_optimization.html) for an Ax tutorial on MOBO. If desired, you can use a custom BoTorch model in Ax, following the [Using BoTorch with Ax](/docs/tutorials/custom_botorch_model_in_ax) tutorial. Given a `MultiObjective`, Ax will default to the $q$NEHVI acquisiton function. If desired, this can also be customized by adding `\"botorch_acqf_class\": <desired_botorch_acquisition_function_class>,` to the `model_kwargs`.\n",
+"In general, we recommend using [Ax](https://ax.dev) for a simple BO setup like this one, since this will simplify your setup (including the amount of code you need to write) considerably. See [here](https://ax.dev/docs/tutorials/multiobjective_optimization.html) for an Ax tutorial on MOBO. If desired, you can use a custom BoTorch model in Ax, following Ax's [Modular BoTorch tutorial](https://ax.dev/docs/tutorials/modular_botorch/) tutorial. Given a `MultiObjective`, Ax will default to the $q$NEHVI acquisiton function. If desired, this can also be customized by adding `\"botorch_acqf_class\": <desired_botorch_acquisition_function_class>,` to the `model_kwargs`.\n",
 "\n",
 "We use the parallel ParEGO ($q$ParEGO) [1] and parallel Noisy Expected Hypervolume Improvement ($q$NEHVI) [2] acquisition functions to optimize a synthetic C2-DTLZ2 test function with $M=2$ objectives, $V=1$ constraint, and $d=4$ parameters. The two objectives are\n",
 "$$f_1(\\mathbf x) = (1+ g(\\mathbf x_M))\\cos\\big(\\frac{\\pi}{2}x_1\\big)$$\n",

tutorials/custom_model/custom_model.ipynb (1 addition & 1 deletion)

@@ -10,7 +10,7 @@
 "- Posterior samples (using Pyro)\n",
 "- Ensemble of ML predictions\n",
 "\n",
-"This tutorial differs from the [Using a custom BoTorch model with Ax](https://botorch.org/docs/tutorials/custom_botorch_model_in_ax) tutorial by focusing more on authoring a new model that is compatible with the BoTorch and less on integrating a custom model with Ax's `botorch_modular` API."
+"This tutorial differs from the [Modular BoTorch tutorial](https://ax.dev/docs/tutorials/modular_botorch/) Ax tutorial by focusing more on authoring a new model that is compatible with the BoTorch and less on integrating a custom model with Ax's modular BoTorch model API."
 ]
 },
 {

tutorials/fit_model_with_torch_optimizer/fit_model_with_torch_optimizer.ipynb (1 addition & 1 deletion)

@@ -282,7 +282,7 @@
 "source": [
 "### Interfacing with Ax\n",
 "\n",
-"It is simple to package up a custom optimizer loop like the one above and use it within Ax. As described in the [Using BoTorch with Ax tutorial](/docs/tutorials/custom_botorch_model_in_ax), this requires defining a custom `model_constructor` callable that can then be passed to the `get_botorch` factory function."
+"It is simple to package up a custom optimizer loop like the one above and use it within Ax. As described in Ax's [Modular BoTorch tutorial](https://ax.dev/docs/tutorials/modular_botorch/) tutorial, this requires defining a custom `model_constructor` callable that can then be passed to the `get_botorch` factory function."
 ]
 },
 {

tutorials/max_value_entropy/max_value_entropy.ipynb (1 addition & 1 deletion)

@@ -8,7 +8,7 @@
 "\n",
 "Max-value entropy search (MES) acquisition function quantifies the information gain about the maximum of a black-box function by observing this black-box function $f$ at the candidate set $\\{\\textbf{x}\\}$ (see [1, 2]). BoTorch provides implementations of the MES acquisition function and its multi-fidelity (MF) version with support for trace observations. In this tutorial, we explain at a high level how the MES acquisition function works, its implementation in BoTorch and how to use the MES acquisition function to query the next point in the optimization process. \n",
 "\n",
-"In general, we recommend using [Ax](https://ax.dev) for a simple BO setup like this one, since this will simplify your setup (including the amount of code you need to write) considerably. You can use a custom BoTorch model and acquisition function in Ax, following the [Using BoTorch with Ax](/docs/tutorials/custom_botorch_model_in_ax) tutorial. To use the MES acquisition function, it is sufficient to add `\"botorch_acqf_class\": qMaxValueEntropy,` to `model_kwargs`. The linked tutorial shows how to use a custom BoTorch model. If you'd like to let Ax choose which model to use based on the properties of the search space, you can skip the `surrogate` argument in `model_kwargs`.\n",
+"In general, we recommend using [Ax](https://ax.dev) for a simple BO setup like this one, since this will simplify your setup (including the amount of code you need to write) considerably. You can use a custom BoTorch model and acquisition function in Ax, following the Ax's [Modular BoTorch tutorial](https://ax.dev/docs/tutorials/modular_botorch/) tutorial. To use the MES acquisition function, it is sufficient to add `\"botorch_acqf_class\": qMaxValueEntropy,` to `model_kwargs`. The linked tutorial shows how to use a custom BoTorch model. If you'd like to let Ax choose which model to use based on the properties of the search space, you can skip the `surrogate` argument in `model_kwargs`.\n",
 "\n",
 "### 1. MES acquisition function for $q=1$ with noisy observation\n",
 "For illustrative purposes, we focus in this section on the non-q-batch-mode case ($q=1$). We also assume that the evaluation of the black-box function is noisy. Let us first introduce some notation: \n",

tutorials/multi_objective_bo/multi_objective_bo.ipynb (1 addition & 1 deletion)

@@ -11,7 +11,7 @@
 "\n",
 "In this tutorial, we illustrate how to implement a simple multi-objective (MO) Bayesian Optimization (BO) closed loop in BoTorch.\n",
 "\n",
-"In general, we recommend using [Ax](https://ax.dev) for a simple BO setup like this one, since this will simplify your setup (including the amount of code you need to write) considerably. See [here](https://ax.dev/docs/tutorials/multiobjective_optimization.html) for an Ax tutorial on MOBO. If desired, you can use a custom BoTorch model in Ax, following the [Using BoTorch with Ax](/docs/tutorials/custom_botorch_model_in_ax) tutorial. Given a `MultiObjective`, Ax will default to the $q$NEHVI acquisiton function. If desired, this can also be customized by adding `\"botorch_acqf_class\": <desired_botorch_acquisition_function_class>,` to the `model_kwargs`.\n",
+"In general, we recommend using [Ax](https://ax.dev) for a simple BO setup like this one, since this will simplify your setup (including the amount of code you need to write) considerably. See [here](https://ax.dev/docs/tutorials/multiobjective_optimization.html) for an Ax tutorial on MOBO. If desired, you can use a custom BoTorch model in Ax, following Ax's [Modular BoTorch tutorial](https://ax.dev/docs/tutorials/modular_botorch/) tutorial. Given a `MultiObjective`, Ax will default to the $q$NEHVI acquisiton function. If desired, this can also be customized by adding `\"botorch_acqf_class\": <desired_botorch_acquisition_function_class>,` to the `model_kwargs`.\n",
 "\n",
 "We use the parallel ParEGO ($q$ParEGO) [1], parallel Expected Hypervolume Improvement ($q$EHVI) [1], and parallel Noisy Expected Hypervolume Improvement ($q$NEHVI) [2] acquisition functions to optimize a synthetic BraninCurrin problem test function with additive Gaussian observation noise over a 2-parameter search space [0,1]^2. See `botorch/test_functions/multi_objective.py` for details on BraninCurrin. The noise standard deviations are 15.19 and 0.63 for each objective, respectively.\n",
 "\n",

tutorials/one_shot_kg/one_shot_kg.ipynb (1 addition & 1 deletion)

@@ -15,7 +15,7 @@
 "$$\n",
 "where $\\xi \\sim \\mathcal{P}(f(x') \\mid \\mathcal{D} \\cup \\mathcal{D}_{\\mathbf{x}})$ is the posterior at $x'$ conditioned on $\\mathcal{D}_{\\mathbf{x}}$, the (random) dataset observed at $\\mathbf{x}$, and $\\mu := \\max_{x}\\mathbb{E}[g(f(x)) \\mid \\mathcal{D}]$.\n",
 "\n",
-"In general, we recommend using [Ax](https://ax.dev) for a simple BO setup like this one, since this will simplify your setup (including the amount of code you need to write) considerably. You can use a custom BoTorch model and acquisition function in Ax, following the [Using BoTorch with Ax](/docs/tutorials/custom_botorch_model_in_ax) tutorial. To use the KG acquisition function, it is sufficient to add `\"botorch_acqf_class\": qKnowledgeGradient,` to `model_kwargs`. The linked tutorial shows how to use a custom BoTorch model. If you'd like to let Ax choose which model to use based on the properties of the search space, you can skip the `surrogate` argument in `model_kwargs`.\n",
+"In general, we recommend using [Ax](https://ax.dev) for a simple BO setup like this one, since this will simplify your setup (including the amount of code you need to write) considerably. You can use a custom BoTorch model and acquisition function in Ax, following Ax's [Modular BoTorch tutorial](https://ax.dev/docs/tutorials/modular_botorch/) tutorial. To use the KG acquisition function, it is sufficient to add `\"botorch_acqf_class\": qKnowledgeGradient,` to `model_kwargs`. The linked tutorial shows how to use a custom BoTorch model. If you'd like to let Ax choose which model to use based on the properties of the search space, you can skip the `surrogate` argument in `model_kwargs`.\n",
 "\n",
 "\n",
 "#### Optimizing KG\n",
