@@ -897,8 +897,11 @@ following cost function:
 .. math::
    :name: regularized-logistic-loss

-   \min_{w} C \sum_{i=1}^n \left(-y_i \log(\hat{p}(X_i)) - (1 - y_i) \log(1 - \hat{p}(X_i))\right) + r(w).
+   \min_{w} C \sum_{i=1}^n s_i \left(-y_i \log(\hat{p}(X_i)) - (1 - y_i) \log(1 - \hat{p}(X_i))\right) + r(w),
+
+where :math:`s_i` corresponds to the weight assigned by the user to a
+specific training sample (the vector :math:`s` is formed by element-wise
+multiplication of the class weights and sample weights).

 We currently provide four choices for the regularization term :math:`r(w)` via
 the `penalty` argument:
@@ -920,6 +923,11 @@ controls the strength of :math:`\ell_1` regularization vs. :math:`\ell_2`
 regularization. Elastic-Net is equivalent to :math:`\ell_1` when
 :math:`\rho = 1` and equivalent to :math:`\ell_2` when :math:`\rho = 0`.

+Note that the scale of the class weights and the sample weights will influence
+the optimization problem. For instance, multiplying the sample weights by a
+constant :math:`b > 0` is equivalent to multiplying the (inverse) regularization
+strength `C` by :math:`b`.
+
 Multinomial Case
 ----------------
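The equivalence described in the added note can be checked empirically. The sketch below (assuming scikit-learn's `LogisticRegression`, which accepts `C` and a `sample_weight` argument to `fit`) fits two models: one with all sample weights multiplied by a constant `b`, and one with `C` multiplied by `b` instead. Up to solver tolerance, the two objectives are identical, so the fitted coefficients should agree:

```python
# Sketch: scaling every sample weight by b > 0 yields the same objective
# as scaling the inverse regularization strength C by b, so the two fits
# should recover (numerically) the same coefficients.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=200, random_state=0)
w = np.ones(len(y))  # uniform sample weights
b = 5.0

# Model 1: weights scaled by b, default C
clf_scaled_weights = LogisticRegression(C=1.0, tol=1e-8, max_iter=10_000)
clf_scaled_weights.fit(X, y, sample_weight=b * w)

# Model 2: C scaled by b, unscaled weights
clf_scaled_C = LogisticRegression(C=b, tol=1e-8, max_iter=10_000)
clf_scaled_C.fit(X, y, sample_weight=w)

print(np.allclose(clf_scaled_weights.coef_, clf_scaled_C.coef_, atol=1e-3))
```

This follows directly from the cost function above: :math:`C \sum_i (b\,s_i)\,\ell_i + r(w) = (bC) \sum_i s_i\,\ell_i + r(w)`.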