Commit 3aa1054

DOC Add link to Early Stopping example in Gradient Boosting (scikit-learn#27025)
Co-authored-by: Adrin Jalali <adrin.jalali@gmail.com>
1 parent 751ccc0 commit 3aa1054

6 files changed: +16 −10 lines

doc/modules/ensemble.rst

Lines changed: 4 additions & 2 deletions
@@ -740,8 +740,10 @@ of ``learning_rate`` require larger numbers of weak learners to maintain
 a constant training error. Empirical evidence suggests that small
 values of ``learning_rate`` favor better test error. [HTF]_
 recommend to set the learning rate to a small constant
-(e.g. ``learning_rate <= 0.1``) and choose ``n_estimators`` by early
-stopping. For a more detailed discussion of the interaction between
+(e.g. ``learning_rate <= 0.1``) and choose ``n_estimators`` large enough
+that early stopping applies,
+see :ref:`sphx_glr_auto_examples_ensemble_plot_gradient_boosting_early_stopping.py`
+for a more detailed discussion of the interaction between
 ``learning_rate`` and ``n_estimators`` see [R2007]_.
 
 Subsampling
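The recommendation above (keep ``learning_rate`` small and give ``n_estimators`` a large budget so that early stopping decides the actual number of stages) can be sketched roughly as follows. This snippet is not part of the commit; the dataset and the specific values (``learning_rate=0.05``, ``n_estimators=1000``, ``n_iter_no_change=10``) are illustrative assumptions.

# Illustrative sketch (not part of this commit): a small learning_rate with a
# generous n_estimators budget, letting n_iter_no_change stop training early.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = GradientBoostingClassifier(
    learning_rate=0.05,        # small constant, as recommended
    n_estimators=1000,         # large budget; early stopping picks the actual number
    n_iter_no_change=10,       # stop if the validation score stalls for 10 iterations
    validation_fraction=0.1,   # held-out fraction used to monitor the score
    tol=1e-4,
    random_state=0,
)
clf.fit(X_train, y_train)

# n_estimators_ is the number of boosting stages actually fitted.
print(clf.n_estimators_, clf.score(X_test, y_test))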

examples/ensemble/plot_gradient_boosting_early_stopping.py

Lines changed: 1 addition & 1 deletion
@@ -1,6 +1,6 @@
 """
 ===================================
-Early stopping of Gradient Boosting
+Early stopping in Gradient Boosting
 ===================================
 
 Gradient boosting is an ensembling technique where several weak learners

sklearn/ensemble/_gb.py

Lines changed: 4 additions & 0 deletions
@@ -1282,6 +1282,8 @@ class GradientBoostingClassifier(ClassifierMixin, BaseGradientBoosting):
         improving in all of the previous ``n_iter_no_change`` numbers of
         iterations. The split is stratified.
         Values must be in the range `[1, inf)`.
+        See
+        :ref:`sphx_glr_auto_examples_ensemble_plot_gradient_boosting_early_stopping.py`.
 
         .. versionadded:: 0.20
 
@@ -1891,6 +1893,8 @@ class GradientBoostingRegressor(RegressorMixin, BaseGradientBoosting):
         improving in all of the previous ``n_iter_no_change`` numbers of
         iterations.
         Values must be in the range `[1, inf)`.
+        See
+        :ref:`sphx_glr_auto_examples_ensemble_plot_gradient_boosting_early_stopping.py`.
 
         .. versionadded:: 0.20
 
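As a rough illustration of the ``n_iter_no_change`` behaviour documented above (again not part of the commit; the data and parameter values are assumptions), fitting the same regressor with and without the parameter shows how ``n_estimators_`` shrinks once the internal validation score stalls.

# Illustrative sketch: without n_iter_no_change every requested stage is fitted;
# with it, the internal validation split can end training early.
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor

X, y = make_regression(n_samples=2000, noise=10.0, random_state=0)

full = GradientBoostingRegressor(n_estimators=500, random_state=0).fit(X, y)
early = GradientBoostingRegressor(
    n_estimators=500, n_iter_no_change=5, validation_fraction=0.1, random_state=0
).fit(X, y)

print(full.n_estimators_)   # 500: every requested stage was fitted
print(early.n_estimators_)  # typically much smaller once the validation score stalls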

sklearn/linear_model/_passive_aggressive.py

Lines changed: 3 additions & 3 deletions
@@ -35,11 +35,11 @@ class PassiveAggressiveClassifier(BaseSGDClassifier):
         .. versionadded:: 0.19
 
     early_stopping : bool, default=False
-        Whether to use early stopping to terminate training when validation.
+        Whether to use early stopping to terminate training when validation
         score is not improving. If set to True, it will automatically set aside
         a stratified fraction of training data as validation and terminate
-        training when validation score is not improving by at least tol for
-        n_iter_no_change consecutive epochs.
+        training when validation score is not improving by at least `tol` for
+        `n_iter_no_change` consecutive epochs.
 
         .. versionadded:: 0.20
 
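For reference, the ``early_stopping`` flag described in this docstring is used as below. This is an illustrative sketch only, not part of the commit; the dataset and parameter values are assumptions.

# Illustrative sketch: early_stopping holds out a stratified validation fraction
# and stops once the score has not improved by tol for n_iter_no_change epochs.
from sklearn.datasets import make_classification
from sklearn.linear_model import PassiveAggressiveClassifier

X, y = make_classification(n_samples=1000, random_state=0)

clf = PassiveAggressiveClassifier(
    early_stopping=True,
    validation_fraction=0.1,
    n_iter_no_change=5,
    tol=1e-3,
    max_iter=1000,
    random_state=0,
)
clf.fit(X, y)
print(clf.n_iter_)  # epochs actually run before early stopping kicked in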

sklearn/linear_model/_perceptron.py

Lines changed: 3 additions & 3 deletions
@@ -68,11 +68,11 @@ class Perceptron(BaseSGDClassifier):
         See :term:`Glossary <random_state>`.
 
     early_stopping : bool, default=False
-        Whether to use early stopping to terminate training when validation.
+        Whether to use early stopping to terminate training when validation
         score is not improving. If set to True, it will automatically set aside
         a stratified fraction of training data as validation and terminate
-        training when validation score is not improving by at least tol for
-        n_iter_no_change consecutive epochs.
+        training when validation score is not improving by at least `tol` for
+        `n_iter_no_change` consecutive epochs.
 
         .. versionadded:: 0.20
 
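The Perceptron change is identical. A quick way to see the effect of the flag (illustrative only; data and values are assumptions) is to compare the number of epochs run with and without it.

# Illustrative sketch: comparing epochs run with and without early stopping.
from sklearn.datasets import make_classification
from sklearn.linear_model import Perceptron

X, y = make_classification(n_samples=1000, random_state=0)

plain = Perceptron(max_iter=1000, tol=1e-3, random_state=0).fit(X, y)
early = Perceptron(
    max_iter=1000, tol=1e-3, early_stopping=True,
    validation_fraction=0.1, n_iter_no_change=5, random_state=0,
).fit(X, y)

# Without early_stopping the stopping criterion watches the training loss;
# with it, the held-out validation score decides when to stop.
print(plain.n_iter_, early.n_iter_)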

sklearn/neural_network/_multilayer_perceptron.py

Lines changed: 1 addition & 1 deletion
@@ -887,7 +887,7 @@ class MLPClassifier(ClassifierMixin, BaseMultilayerPerceptron):
         Whether to use early stopping to terminate training when validation
         score is not improving. If set to true, it will automatically set
         aside 10% of training data as validation and terminate training when
-        validation score is not improving by at least tol for
+        validation score is not improving by at least ``tol`` for
         ``n_iter_no_change`` consecutive epochs. The split is stratified,
         except in a multilabel setting.
         If early stopping is False, then the training stops when the training
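The same ``tol`` / ``n_iter_no_change`` interplay applies to MLPClassifier; a minimal sketch (not part of the commit, with assumed data and parameter values):

# Illustrative sketch: MLPClassifier stops once the validation score has not
# improved by at least tol for n_iter_no_change consecutive epochs.
from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=1000, random_state=0)

clf = MLPClassifier(
    hidden_layer_sizes=(50,),
    early_stopping=True,       # sets aside 10% of the training data by default
    validation_fraction=0.1,
    n_iter_no_change=10,
    tol=1e-4,
    max_iter=500,
    random_state=0,
)
clf.fit(X, y)
print(clf.n_iter_, clf.best_validation_score_)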
