
Commit 5dbb8f5

ENH replace loss module Gradient boosting (scikit-learn#26278)
1 parent 51ca717 commit 5dbb8f5

File tree

6 files changed (+728, -1489 lines)


doc/common_pitfalls.rst

Lines changed: 1 addition & 1 deletion
@@ -211,7 +211,7 @@ method is used during fitting and predicting::
     >>> from sklearn.model_selection import cross_val_score
     >>> scores = cross_val_score(pipeline, X, y)
     >>> print(f"Mean accuracy: {scores.mean():.2f}+/-{scores.std():.2f}")
-    Mean accuracy: 0.45+/-0.07
+    Mean accuracy: 0.46+/-0.07
 
 How to avoid data leakage
 -------------------------
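The doctest in this hunk relies on a ``pipeline``, ``X`` and ``y`` defined earlier in common_pitfalls.rst, which are not part of the diff. Below is a rough, self-contained sketch of that kind of setup, assuming random features and labels and a feature-selection plus gradient-boosting pipeline; the estimator presumably involves GradientBoostingClassifier, since its expected accuracy is what this commit shifts from 0.45 to 0.46. The shapes and parameters here are illustrative guesses, and the printed score is not guaranteed to match the documentation.

# Hedged sketch only: the real pipeline/X/y live earlier in common_pitfalls.rst.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.feature_selection import SelectKBest
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.RandomState(42)
X = rng.standard_normal((200, 10_000))  # random high-dimensional features (assumed sizes)
y = rng.choice(2, size=200)             # random binary labels, so accuracy hovers near chance

# Feature selection happens inside the pipeline, so each CV fold selects on its
# own training split and no information leaks from the test split.
pipeline = make_pipeline(SelectKBest(k=25), GradientBoostingClassifier(random_state=1))
scores = cross_val_score(pipeline, X, y)
print(f"Mean accuracy: {scores.mean():.2f}+/-{scores.std():.2f}")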

doc/whats_new/v1.4.rst

Lines changed: 5 additions & 0 deletions
@@ -150,6 +150,11 @@ Changelog
   :pr:`13649` by :user:`Samuel Ronsin <samronsin>`,
   initiated by :user:`Patrick O'Reilly <pat-oreilly>`.
 
+- |Efficiency| :class:`ensemble.GradientBoostingClassifier` is faster,
+  for binary and in particular for multiclass problems thanks to the private loss
+  function module.
+  :pr:`26278` by :user:`Christian Lorentzen <lorentzenchr>`.
+
 - |Efficiency| Improves runtime and memory usage for
   :class:`ensemble.GradientBoostingClassifier` and
   :class:`ensemble.GradientBoostingRegressor` when trained on sparse data.
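The new changelog entry credits the speed-up to the private loss function module. As a rough illustration of the kind of workload it refers to, here is a minimal multiclass fit with GradientBoostingClassifier; the dataset shape and hyperparameters are arbitrary illustrative choices, not taken from the commit.

# Minimal sketch of a multiclass gradient-boosting fit, the case the changelog
# highlights as benefiting most; sizes below are arbitrary.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier

X, y = make_classification(
    n_samples=5_000, n_features=20, n_informative=10, n_classes=5, random_state=0
)
clf = GradientBoostingClassifier(n_estimators=100, random_state=0)
clf.fit(X, y)
print(f"training accuracy: {clf.score(X, y):.2f}")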
