Commit a010c72
Merge pull request #78 from GPflow/r0.1 (r0.1.0)
2 parents 6b03e30 + 2d0937a commit a010c72

File tree
7 files changed (+27 −20 lines changed)

CONTRIBUTING.md
Lines changed: 2 additions & 2 deletions

@@ -28,11 +28,11 @@ GPflowOpt currently tries to keep up with the GPflow master, though at some poin
 Changing the minimum required version of TensorFlow that we're compatible with requires a few tasks:
 - update versions in `setup.py`
 - update versions used on travis via `.travis.yml`
-- update version ussed by readthedocs.org via `docsrequire.txt`
+- update version used by readthedocs.org via `docsrequire.txt`
 - Increment the GPflowOpt version (see below).

 ## Version numbering
-The main purpose of versioning GPflowOpt is user convenience: to keep the number of releases down, we try to combine seversal PRs into one increment. As we work towards something that we might call 1.0, including changes to the GPflowOpt API. Minor version bumps (X.1) are used for updates to follow a new GPflow or TensorFlow API, or introduce incremental new features.
+The main purpose of versioning GPflowOpt is user convenience: to keep the number of releases down, we try to combine several PRs into one increment.
 When incrementing the version number, the following tasks are required:
 - Update the version in `GPflowOpt/_version.py`
 - Add a note to `RELEASE.md`

README.md
Lines changed: 13 additions & 8 deletions

@@ -1,12 +1,17 @@
-This is pre-release software, and as such is lacking testing, documentation and much functionality.
-
-You can find the project proposal at: https://github.com/GPflow/GPflow/issues/397
-
-If you're interested in helping shape GPflowOpt, let us know via an issue on this repo.
-
 # GPflowOpt
-Bayesian Optimization using GPflow
+GPflowOpt is a python package for Bayesian Optimization using [GPflow](https://github.com/GPflow/GPflow), and uses [TensorFlow](http://www.tensorflow.org). It was [initiated](https://github.com/GPflow/GPflow/issues/397) and is currently maintained by [Joachim van der Herten](http://sumo.intec.ugent.be/members?q=jvanderherten) and [Ivo Couckuyt](http://sumo.intec.ugent.be/icouckuy). The full list of contributors (in alphabetical order) is Ivo Couckuyt, Tom Dhaene, James Hensman, Nicolas Knudde, Alexander G. de G. Matthews and Joachim van der Herten. Special thanks also to all [GPflow contributors](http://github.com/GPflow/GPflow/graphs/contributors) as this package would not be able to exist without their effort.

 [![Build Status](https://travis-ci.org/GPflow/GPflowOpt.svg?branch=master)](https://travis-ci.org/GPflow/GPflowOpt)
 [![Coverage Status](https://codecov.io/gh/GPflow/GPflowOpt/branch/master/graph/badge.svg)](https://codecov.io/gh/GPflow/GPflowOpt)
-[![Documentation Status](https://readthedocs.org/projects/gpflowopt/badge/?version=latest)](http://gpflowopt.readthedocs.io/en/latest/?badge=latest)
+[![Documentation Status](https://readthedocs.org/projects/gpflowopt/badge/?version=latest)](http://gpflowopt.readthedocs.io/en/latest/?badge=latest)
+
+# Install
+
+The easiest way to install GPflowOpt involves cloning this repository and running
+```
+pip install . --process-dependency-links
+```
+in the source directory. This also installs all required dependencies (including TensorFlow, if needed). For more detailed installation instructions, see the [documentation](https://gpflowopt.readthedocs.io/en/latest/intro.html#install).
+
+# Contributing
+If you are interested in contributing to this open source project, contact us through an issue on this repository. For more information, see the [notes for contributors](contributing.md).

RELEASE.md
Lines changed: 2 additions & 1 deletion

@@ -1 +1,2 @@
-Pre-release
+# Release 0.1.0
+Initial release of GPflowOpt

doc/source/intro.rst
Lines changed: 4 additions & 6 deletions

@@ -6,12 +6,11 @@ Introduction
 It makes use of TensorFlow for computation of acquisition functions, to offer scalability, and avoid implementation of gradients.
 The package was created, and is currently maintained by `Joachim van der Herten <http://sumo.intec.ugent.be/jvanderherten>`_ and `Ivo Couckuyt <http://sumo.intec.ugent.be/icouckuy>`_

-Currently the software is pre-release and under construction, hence it lacks a lot of functionality and testing. This documentation
-is also incomplete and under development. The project is open source: if you feel you have some relevant skills and are interested in
+The project is open source: if you feel you have some relevant skills and are interested in
 contributing then please contact us on `GitHub <https://github.com/GPflow/GPflowOpt>`_ by opening an issue or pull request.

 Install
---------
+-------

 1. Install package

 A straightforward way to install GPflowOpt is to clone its repository and run
@@ -32,9 +31,8 @@ GPflowOpt is a pure python library so you could just add it to your python path.
 The tests require some additional dependencies that need to be installed first with
 ``pip install -e .[test]``. Afterwards the tests can be run with ``python setup.py test``.

-Similarly, to build the documentation,
-first install the extra dependencies with ``pip install -e .[docs]``.
-Then proceed with ``python setup.py build_sphinx``.
+Similarly, to build the documentation, first install the extra dependencies with
+``pip install -e .[docs]``. Then proceed with ``python setup.py build_sphinx``.

 Getting started
 ---------------

gpflowopt/_version.py
Lines changed: 1 addition & 1 deletion

@@ -12,4 +12,4 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.

-__version__ = "pre-release" # pragma: no cover
+__version__ = "0.1.0" # pragma: no cover

gpflowopt/acquisition/acquisition.py
Lines changed: 2 additions & 1 deletion

@@ -129,7 +129,8 @@ def enable_scaling(self, domain):
 Enables and configures the :class:`.DataScaler` objects wrapping the GP models.

 Sets the _needs_setup attribute to True so the contained models are optimized and :meth:`setup` is run again
-right before evaluating the :class:`Acquisition` function.
+right before evaluating the :class:`Acquisition` function. Note that the models are modified directly and
+references to them outside of the object will also point to scaled instances.

 :param domain: :class:`.Domain` object, the input transform of the data scalers is configured as a transform
 from domain to the unit cube with the same dimensionality.
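The caveat added to this docstring (models are modified in place, so references held outside the object also see the scaled instances) comes down to ordinary Python aliasing. A minimal sketch, where `ToyModel` and `scale_inputs_inplace` are hypothetical stand-ins and not part of the GPflowOpt API:

```python
# Hypothetical stand-ins illustrating in-place scaling; not GPflowOpt API.
class ToyModel:
    def __init__(self, X):
        self.X = X  # training inputs

def scale_inputs_inplace(model, lower, upper):
    """Rescale the model's inputs to the unit cube, modifying the model directly."""
    model.X = [(x - lower) / (upper - lower) for x in model.X]

outside_ref = ToyModel(X=[2.0, 4.0, 6.0])
container = {"model": outside_ref}  # e.g. an acquisition object holding the model

scale_inputs_inplace(container["model"], lower=0.0, upper=10.0)

# The reference held outside the container now also points at scaled data.
print(outside_ref.X)  # [0.2, 0.4, 0.6]
```

Because only one model object exists, there is no "unscaled copy" left behind; this is exactly the behaviour the added note warns about.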

gpflowopt/bo.py
Lines changed: 3 additions & 1 deletion

@@ -74,7 +74,9 @@ def __init__(self, domain, acquisition, optimizer=None, initial=None, scaling=Tr
 the points as specified by the design.
 :param bool scaling: (boolean, default true) if set to true, the outputs are normalized, and the inputs are
 scaled to a unit cube. This only affects model training: calls to acquisition.data, as well as
-returned optima are unscaled (see :class:`~.DataScaler` for more details.)
+returned optima are unscaled (see :class:`~.DataScaler` for more details.). Note, the models contained by
+acquisition are modified directly, and so the references to the model outside of BayesianOptimizer now point
+to scaled models.
 :param int hyper_draws: (optional) Enable marginalization of model hyperparameters. By default, point estimates are
 used. If this parameter set to n, n hyperparameter draws from the likelihood distribution
 are obtained using Hamiltonian MC.
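The scaling behaviour this docstring describes — inputs mapped to a unit cube and outputs normalized for model training, while data and optima reported to the user stay in the original domain — can be sketched in plain Python. The helper names below are illustrative, not the actual `DataScaler` API:

```python
# Illustrative sketch of unit-cube input scaling and output normalization;
# function names are hypothetical, not the DataScaler API.
import statistics

def to_unit_cube(X, lower, upper):
    return [[(x - l) / (u - l) for x, l, u in zip(row, lower, upper)] for row in X]

def from_unit_cube(X, lower, upper):
    return [[x * (u - l) + l for x, l, u in zip(row, lower, upper)] for row in X]

def normalize(y):
    mu, sigma = statistics.mean(y), statistics.stdev(y)
    return [(v - mu) / sigma for v in y], (mu, sigma)

def denormalize(y, stats):
    mu, sigma = stats
    return [v * sigma + mu for v in y]

lower, upper = [0.0, -5.0], [10.0, 5.0]   # a 2D domain
X = [[2.0, 0.0], [8.0, 5.0]]              # evaluated points
y = [1.0, 3.0]                            # observed outputs

Xs = to_unit_cube(X, lower, upper)        # what the model trains on
ys, stats = normalize(y)                  # zero-mean, unit-variance targets

# Reported data and optima are transformed back to the original domain.
assert from_unit_cube(Xs, lower, upper) == X
assert [round(v, 9) for v in denormalize(ys, stats)] == y
```

The round trip is what makes the scaling invisible to callers: only the models train on the transformed data.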
