Commit f1273b7

DOC Add dropdowns to 1.10 decision trees (scikit-learn#26699)
1 parent aae5837 commit f1273b7

1 file changed: +29 -7 lines changed

doc/modules/tree.rst

Lines changed: 29 additions & 7 deletions
@@ -146,6 +146,10 @@ Once trained, you can plot the tree with the :func:`plot_tree` function::
     :scale: 75
     :align: center

+|details-start|
+**Alternative ways to export trees**
+|details-split|
+
 We can also export the tree in `Graphviz
 <https://www.graphviz.org/>`_ format using the :func:`export_graphviz`
 exporter. If you use the `conda <https://conda.io>`_ package manager, the graphviz binaries
@@ -212,6 +216,8 @@ of external libraries and is more compact:
     | | |--- class: 2
     <BLANKLINE>

+|details-end|
+
 .. topic:: Examples:

     * :ref:`sphx_glr_auto_examples_tree_plot_iris_dtc.py`
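
As a quick local check of the two exporters this dropdown now groups, a minimal sketch (the iris dataset and ``max_depth`` value here are only illustrative)::

    from sklearn.datasets import load_iris
    from sklearn.tree import DecisionTreeClassifier, export_graphviz, export_text

    iris = load_iris()
    clf = DecisionTreeClassifier(max_depth=2, random_state=0).fit(iris.data, iris.target)

    # Graphviz export: with out_file=None the DOT source is returned as a
    # string, to be rendered with the graphviz binaries or Python bindings.
    dot_source = export_graphviz(clf, out_file=None, feature_names=iris.feature_names)

    # Text export: needs no external libraries and is more compact.
    print(export_text(clf, feature_names=iris.feature_names))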
@@ -281,7 +287,6 @@ of shape ``(n_samples, n_outputs)`` then the resulting estimator will:
   * Output a list of n_output arrays of class probabilities upon
     ``predict_proba``.

-
 The use of multi-output trees for regression is demonstrated in
 :ref:`sphx_glr_auto_examples_tree_plot_tree_regression_multioutput.py`. In this example, the input
 X is a single real value and the outputs Y are the sine and cosine of X.
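
The multi-output setup described in this hunk can be reproduced in a few lines; the sketch below mirrors the sine/cosine example (sample count and depth are arbitrary)::

    import numpy as np
    from sklearn.tree import DecisionTreeRegressor

    rng = np.random.RandomState(0)
    X = np.sort(5 * rng.rand(80, 1), axis=0)    # a single real-valued input
    y = np.column_stack([np.sin(X).ravel(),     # Y has shape (n_samples, 2):
                         np.cos(X).ravel()])    # the sine and cosine of X

    reg = DecisionTreeRegressor(max_depth=5).fit(X, y)
    print(reg.predict([[1.0]]).shape)           # (1, 2): one column per output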
@@ -303,16 +308,20 @@ the lower half of those faces.

 .. topic:: Examples:

-  * :ref:`sphx_glr_auto_examples_tree_plot_tree_regression_multioutput.py`
-  * :ref:`sphx_glr_auto_examples_miscellaneous_plot_multioutput_face_completion.py`
+    * :ref:`sphx_glr_auto_examples_tree_plot_tree_regression_multioutput.py`
+    * :ref:`sphx_glr_auto_examples_miscellaneous_plot_multioutput_face_completion.py`

-.. topic:: References:
+|details-start|
+**References**
+|details-split|

 * M. Dumont et al, `Fast multi-class image annotation with random subwindows
   and multiple output randomized trees
   <http://www.montefiore.ulg.ac.be/services/stochastic/pubs/2009/DMWG09/dumont-visapp09-shortpaper.pdf>`_, International Conference on
   Computer Vision Theory and Applications 2009

+|details-end|
+
 .. _tree_complexity:

 Complexity
@@ -403,6 +412,10 @@ Tree algorithms: ID3, C4.5, C5.0 and CART
 What are all the various decision tree algorithms and how do they differ
 from each other? Which one is implemented in scikit-learn?

+|details-start|
+**Various decision tree algorithms**
+|details-split|
+
 ID3_ (Iterative Dichotomiser 3) was developed in 1986 by Ross Quinlan.
 The algorithm creates a multiway tree, finding for each node (i.e. in
 a greedy manner) the categorical feature that will yield the largest
@@ -428,6 +441,8 @@ it differs in that it supports numerical target variables (regression) and
 does not compute rule sets. CART constructs binary trees using the feature
 and threshold that yield the largest information gain at each node.

+|details-end|
+
 scikit-learn uses an optimized version of the CART algorithm; however, the
 scikit-learn implementation does not support categorical variables for now.
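
The binary-tree property of CART noted in this hunk is easy to verify on a fitted scikit-learn estimator; a small sketch (dataset choice is incidental)::

    from sklearn.datasets import load_iris
    from sklearn.tree import DecisionTreeClassifier

    X, y = load_iris(return_X_y=True)
    clf = DecisionTreeClassifier(random_state=0).fit(X, y)

    # In the fitted tree a node is either a leaf (-1 in both child arrays)
    # or has exactly two children: CART never produces multiway splits.
    left, right = clf.tree_.children_left, clf.tree_.children_right
    assert all((l == -1) == (r == -1) for l, r in zip(left, right))
    print(clf.tree_.node_count, "nodes, every split binary")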

@@ -500,8 +515,9 @@ Log Loss or Entropy:

     H(Q_m) = - \sum_k p_{mk} \log(p_{mk})

-
-.. note::
+|details-start|
+Shannon entropy:
+|details-split|

 The entropy criterion computes the Shannon entropy of the possible classes. It
 takes the class frequencies of the training data points that reached a given
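
A worked instance of the formula above, for a node whose class proportions are 0.4/0.4/0.2 (the proportions are made up for illustration)::

    import numpy as np

    p = np.array([0.4, 0.4, 0.2])    # p_mk: class proportions at node m
    H = -np.sum(p * np.log(p))       # H(Q_m) = -sum_k p_mk log(p_mk)
    print(round(H, 3))               # 1.055 (natural log)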
@@ -531,6 +547,8 @@ Log Loss or Entropy:

     \mathrm{LL}(D, T) = \sum_{m \in T} \frac{n_m}{n} H(Q_m)

+|details-end|
+
 Regression criteria
 -------------------
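
Since LL(D, T) is just the node entropies weighted by n_m / n, the ``"entropy"`` and ``"log_loss"`` criteria select the same splits in scikit-learn (``"log_loss"`` requires scikit-learn >= 1.1); a quick check::

    from sklearn.datasets import load_iris
    from sklearn.tree import DecisionTreeClassifier

    X, y = load_iris(return_X_y=True)
    # Both criteria minimize the same weighted entropy, so the trees agree.
    for criterion in ("entropy", "log_loss"):
        clf = DecisionTreeClassifier(criterion=criterion, random_state=0).fit(X, y)
        print(criterion, clf.get_depth(), clf.tree_.node_count)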

@@ -671,7 +689,9 @@ be pruned. This process stops when the pruned tree's minimal

 * :ref:`sphx_glr_auto_examples_tree_plot_cost_complexity_pruning.py`

-.. topic:: References:
+|details-start|
+**References**
+|details-split|

 .. [BRE] L. Breiman, J. Friedman, R. Olshen, and C. Stone. Classification
     and Regression Trees. Wadsworth, Belmont, CA, 1984.
@@ -685,3 +705,5 @@ be pruned. This process stops when the pruned tree's minimal

 * T. Hastie, R. Tibshirani and J. Friedman. Elements of Statistical
   Learning, Springer, 2009.
+
+|details-end|
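
For anyone trying the pruning section referenced above locally, a minimal cost-complexity pruning sketch (the dataset and the choice of alpha are illustrative)::

    from sklearn.datasets import load_breast_cancer
    from sklearn.tree import DecisionTreeClassifier

    X, y = load_breast_cancer(return_X_y=True)

    # Effective alphas at which subtrees get pruned, with total leaf impurities.
    path = DecisionTreeClassifier(random_state=0).cost_complexity_pruning_path(X, y)

    # Refit with one of them; larger ccp_alpha values prune more aggressively.
    pruned = DecisionTreeClassifier(random_state=0,
                                    ccp_alpha=path.ccp_alphas[-2]).fit(X, y)
    print(pruned.tree_.node_count)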
