
Commit 6c6f926

minor fixes to doc
1 parent 3c4a807 commit 6c6f926

6 files changed: +6 -6 lines changed

src/metrics/mod.rs

Lines changed: 1 addition & 1 deletion
@@ -4,7 +4,7 @@
 //! In a feedback loop you build your model first, then you get feedback from metrics, improve it and repeat until your model achieve desirable performance.
 //! Evaluation metrics helps to explain the performance of a model and compare models based on an objective criterion.
 //!
-//! Choosing the right metric is crucial while evaluating machine learning models. In smartcore you will find metrics for these classes of ML models:
+//! Choosing the right metric is crucial while evaluating machine learning models. In `smartcore` you will find metrics for these classes of ML models:
 //!
 //! * [Classification metrics](struct.ClassificationMetrics.html)
 //! * [Regression metrics](struct.RegressionMetrics.html)
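
The module docs in this hunk describe evaluation metrics as comparing predictions against known labels. As a conceptual sketch only (plain Rust, not smartcore's `ClassificationMetrics` API, whose constructors this commit does not show), here is what a classification metric such as accuracy computes:

```rust
// Conceptual sketch of a classification metric: the fraction of predictions
// that match the true labels. Not tied to smartcore's metrics API.
fn accuracy(y_true: &[i32], y_pred: &[i32]) -> f64 {
    assert_eq!(y_true.len(), y_pred.len());
    let correct = y_true
        .iter()
        .zip(y_pred.iter())
        .filter(|(t, p)| t == p)
        .count();
    correct as f64 / y_true.len() as f64
}

fn main() {
    let y_true = [0, 1, 1, 0, 1];
    let y_pred = [0, 1, 0, 0, 1];
    println!("accuracy = {}", accuracy(&y_true, &y_pred)); // 0.8
}
```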

src/model_selection/mod.rs

Lines changed: 1 addition & 1 deletion
@@ -7,7 +7,7 @@
 //! Splitting data into multiple subsets helps us to find the right combination of hyperparameters, estimate model performance and choose the right model for
 //! the data.
 //!
-//! In smartcore a random split into training and test sets can be quickly computed with the [train_test_split](./fn.train_test_split.html) helper function.
+//! In `smartcore` a random split into training and test sets can be quickly computed with the [train_test_split](./fn.train_test_split.html) helper function.
 //!
 //! ```
 //! use smartcore::linalg::basic::matrix::DenseMatrix;
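
A minimal sketch of calling the `train_test_split` helper referenced in this hunk, assuming the 0.3-style API implied by the doc's own `use smartcore::linalg::basic::matrix::DenseMatrix;` import; whether `from_2d_array` returns a `Result` and whether the trailing seed argument exists vary between releases, so treat those details as assumptions:

```rust
use smartcore::linalg::basic::matrix::DenseMatrix;
use smartcore::model_selection::train_test_split;

fn main() {
    // Toy feature matrix and targets; some releases require `.unwrap()` here
    // because `from_2d_array` returns a `Result`.
    let x = DenseMatrix::from_2d_array(&[
        &[1.0, 2.0],
        &[3.0, 4.0],
        &[5.0, 6.0],
        &[7.0, 8.0],
        &[9.0, 10.0],
    ]);
    let y: Vec<f64> = vec![1.0, 2.0, 3.0, 4.0, 5.0];

    // Hold out 20% of the rows for testing; older releases omit the `Option<u64>` seed.
    let (_x_train, _x_test, y_train, y_test) = train_test_split(&x, &y, 0.2, true, None);

    println!("train: {} rows, test: {} rows", y_train.len(), y_test.len());
}
```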

src/neighbors/knn_classifier.rs

Lines changed: 1 addition & 1 deletion
@@ -1,6 +1,6 @@
 //! # K Nearest Neighbors Classifier
 //!
-//! smartcore relies on 2 backend algorithms to speedup KNN queries:
+//! `smartcore` relies on 2 backend algorithms to speedup KNN queries:
 //! * [`LinearSearch`](../../algorithm/neighbour/linear_search/index.html)
 //! * [`CoverTree`](../../algorithm/neighbour/cover_tree/index.html)
 //!
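
Since the hunk lists the two query backends, here is a sketch of selecting one when fitting the classifier. The `with_k` / `with_algorithm` builder names and the `KNNAlgorithmName` path follow the crate's parameter-builder convention but are assumptions to verify against the release in use:

```rust
use smartcore::algorithm::neighbour::KNNAlgorithmName;
use smartcore::linalg::basic::matrix::DenseMatrix;
use smartcore::neighbors::knn_classifier::{KNNClassifier, KNNClassifierParameters};

fn main() {
    let x = DenseMatrix::from_2d_array(&[
        &[1.0, 1.0],
        &[1.0, 2.0],
        &[2.0, 1.0],
        &[8.0, 8.0],
        &[9.0, 8.0],
        &[8.0, 9.0],
    ]);
    let y: Vec<i32> = vec![0, 0, 0, 1, 1, 1];

    // CoverTree is the tree-based backend; LinearSearch is the brute-force baseline.
    let knn = KNNClassifier::fit(
        &x,
        &y,
        KNNClassifierParameters::default()
            .with_k(3)
            .with_algorithm(KNNAlgorithmName::CoverTree),
    )
    .unwrap();

    let y_hat = knn.predict(&x).unwrap();
    println!("{:?}", y_hat);
}
```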

src/svm/mod.rs

Lines changed: 1 addition & 1 deletion
@@ -9,7 +9,7 @@
 //! SVM is memory efficient since it uses only a subset of training data to find a decision boundary. This subset is called support vectors.
 //!
 //! In SVM distance between a data point and the support vectors is defined by the kernel function.
-//! smartcore supports multiple kernel functions but you can always define a new kernel function by implementing the `Kernel` trait. Not all functions can be a kernel.
+//! `smartcore` supports multiple kernel functions but you can always define a new kernel function by implementing the `Kernel` trait. Not all functions can be a kernel.
 //! Building a new kernel requires a good mathematical understanding of the [Mercer theorem](https://en.wikipedia.org/wiki/Mercer%27s_theorem)
 //! that gives necessary and sufficient condition for a function to be a kernel function.
 //!
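
The paragraph above frames a kernel as the similarity measure between a data point and the support vectors. As a conceptual illustration only (not the crate's `Kernel` trait, whose exact signature this commit does not show), here is the RBF kernel, one of the standard kernel functions, written as a plain function:

```rust
// Conceptual sketch: the RBF kernel maps a squared Euclidean distance into a
// similarity in (0, 1], with `gamma` controlling how quickly similarity decays.
fn rbf_kernel(x: &[f64], y: &[f64], gamma: f64) -> f64 {
    let squared_distance: f64 = x
        .iter()
        .zip(y.iter())
        .map(|(a, b)| (a - b) * (a - b))
        .sum();
    (-gamma * squared_distance).exp()
}

fn main() {
    // Identical points have similarity 1.0; distant points decay toward 0.0.
    println!("{}", rbf_kernel(&[1.0, 2.0], &[1.0, 2.0], 0.5)); // 1.0
    println!("{}", rbf_kernel(&[1.0, 2.0], &[4.0, 6.0], 0.5)); // ~3.7e-6
}
```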

src/svm/svc.rs

Lines changed: 1 addition & 1 deletion
@@ -20,7 +20,7 @@
 //!
 //! Where \\( m \\) is a number of training samples, \\( y_i \\) is a label value (either 1 or -1) and \\(\langle\vec{w}, \vec{x}_i \rangle + b\\) is a decision boundary.
 //!
-//! To solve this optimization problem, smartcore uses an [approximate SVM solver](https://leon.bottou.org/projects/lasvm).
+//! To solve this optimization problem, `smartcore` uses an [approximate SVM solver](https://leon.bottou.org/projects/lasvm).
 //! The optimizer reaches accuracies similar to that of a real SVM after performing two passes through the training examples. You can choose the number of passes
 //! through the data that the algorithm takes by changing the `epoch` parameter of the classifier.
 //!
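
To make the decision boundary in this hunk concrete, here is a small sketch, independent of smartcore's `SVC` type, of classifying a point by the sign of ⟨w, x⟩ + b with labels 1 and -1 as described above:

```rust
// Conceptual sketch of the decision rule referenced above: a fitted linear SVM
// labels a point by the sign of <w, x> + b, producing either 1 or -1.
fn decision(w: &[f64], b: f64, x: &[f64]) -> i32 {
    let score: f64 = w.iter().zip(x.iter()).map(|(wi, xi)| wi * xi).sum::<f64>() + b;
    if score >= 0.0 {
        1
    } else {
        -1
    }
}

fn main() {
    let (w, b) = (vec![1.0, -1.0], 0.5);
    println!("{}", decision(&w, b, &[2.0, 1.0])); // 2.0 - 1.0 + 0.5 =  1.5 ->  1
    println!("{}", decision(&w, b, &[0.0, 2.0])); // 0.0 - 2.0 + 0.5 = -1.5 -> -1
}
```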

src/tree/decision_tree_regressor.rs

Lines changed: 1 addition & 1 deletion
@@ -11,7 +11,7 @@
 //!
 //! where \\(\hat{y}_{Rk}\\) is the mean response for the training observations withing region _k_.
 //!
-//! smartcore uses recursive binary splitting approach to build \\(R_1, R_2, ..., R_K\\) regions. The approach begins at the top of the tree and then successively splits the predictor space
+//! `smartcore` uses recursive binary splitting approach to build \\(R_1, R_2, ..., R_K\\) regions. The approach begins at the top of the tree and then successively splits the predictor space
 //! one predictor at a time. At each step of the tree-building process, the best split is made at that particular step, rather than looking ahead and picking a split that will lead to a better
 //! tree in some future step.
 //!
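
To illustrate the greedy step described in this hunk (a sketch of the idea, not smartcore's internal tree builder), here is one round of binary splitting for a single predictor: every candidate threshold is scored by the squared error of the two regions it creates, with each region predicting its mean response:

```rust
// Sum of squared errors of a region that predicts its mean response.
fn sse(ys: &[f64]) -> f64 {
    if ys.is_empty() {
        return 0.0;
    }
    let mean = ys.iter().sum::<f64>() / ys.len() as f64;
    ys.iter().map(|y| (y - mean).powi(2)).sum()
}

// One greedy step of recursive binary splitting: pick the threshold on a single
// predictor that minimizes the combined SSE of the left and right regions.
fn best_split(x: &[f64], y: &[f64]) -> (f64, f64) {
    let mut best_threshold = f64::NAN;
    let mut best_cost = f64::INFINITY;
    for &t in x {
        let left: Vec<f64> = x.iter().zip(y).filter(|(xi, _)| **xi <= t).map(|(_, yi)| *yi).collect();
        let right: Vec<f64> = x.iter().zip(y).filter(|(xi, _)| **xi > t).map(|(_, yi)| *yi).collect();
        let cost = sse(&left) + sse(&right);
        if cost < best_cost {
            best_cost = cost;
            best_threshold = t;
        }
    }
    (best_threshold, best_cost)
}

fn main() {
    let x = [1.0, 2.0, 3.0, 10.0, 11.0, 12.0];
    let y = [1.1, 0.9, 1.0, 5.2, 4.8, 5.0];
    let (threshold, cost) = best_split(&x, &y);
    // The greedy split lands between the two clusters (threshold 3.0 here).
    println!("split at x <= {threshold}, cost {cost:.3}");
}
```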

0 commit comments
