
Commit 1dca7dc

Update the name of the decorrelation function
1 parent 5bdedb8 commit 1dca7dc

30 files changed: +249 -249 lines changed

README.md

Lines changed: 6 additions & 6 deletions
@@ -1,12 +1,12 @@
-# Heuristic Multidimensional Correlation Analysis: Goal-Driven Spatial Transformation Matrices
+# Iterative Decorrelation Analysis (IDeA) and the Unit of Measurement Preserving Spatial Transformation Matrices (UPSTM)

 ![](images/paste-706F2F78.png)

-Fig. 1. The weights ($w_j^i$) of the GDSTM matrix (**W**) are estimated by the HMCA algorithm.
+Fig. 1. The weights ($w_j^i$) of the UPSTM matrix (**W**) are estimated by the IDeA algorithm.

 \_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_

-Many multidimensional/multimodality data sets contain continuous features that are co-linear, correlated or have some association between them. The goal of spatial transformations is to find a set of [latent variables](https://en.wikipedia.org/wiki/Latent_and_observable_variables) with minimum data correlation; hence downstream data analysis be simplified. Common data transformation matrices include statistically driven approaches such as [principal component analysis](https://en.wikipedia.org/wiki/Principal_component_analysis) (PCA), [explanatory factor analysis](https://en.wikipedia.org/wiki/Exploratory_factor_analysis) (EFA), and [canonical-correlation analysis](https://en.wikipedia.org/wiki/Canonical_correlation) (CCA). An heuristic alternative for these two statistical approaches is the heuristic-multidimensional correlation analysis (HMCA). The main advantage of the heuristic approach is that it is driven by specific requirements for the output generated. The specific requirements are:
+Many multidimensional/multimodality data sets contain continuous features that are co-linear, correlated, or otherwise associated. The goal of spatial transformations is to find a set of [latent variables](https://en.wikipedia.org/wiki/Latent_and_observable_variables) with minimal data correlation, so that downstream data analysis is simplified. Common data transformations are statistically driven approaches such as [principal component analysis](https://en.wikipedia.org/wiki/Principal_component_analysis) (PCA), [exploratory factor analysis](https://en.wikipedia.org/wiki/Exploratory_factor_analysis) (EFA), and [canonical-correlation analysis](https://en.wikipedia.org/wiki/Canonical_correlation) (CCA). An algorithmic alternative to these statistical approaches is Iterative Decorrelation Analysis (IDeA). The main advantage of the iterative approach is that it is driven by specific requirements on the output. The specific requirements are:

 1. All output variables $Q=(q_1,...q_n)$ have a parent input variable $X=(x_1,...x_n)$ (See Fig 1.)
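As an illustrative aside, requirement 1 together with the Fig. 1 caption implies a linear construction of the latent variables; the indexing convention below is an assumption, and the actual weights and sparsity pattern of **W** are exactly what IDeA estimates:

$$
q_i = \sum_{j=1}^{n} w_j^i \, x_j, \qquad \mathbf{Q} = \mathbf{X}\,\mathbf{W}
$$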

@@ -48,7 +48,7 @@ library("FRESA.CAD")
 data('iris')

 ## HMCA Decorrelation at 0.25 threshold, pearson and fast estimation
-irisDecor <- GDSTMDecorrelation(iris,thr=0.25)
+irisDecor <- IDeA(iris,thr=0.25)

 ### Print the latent variables
print(getLatentCoefficients(irisDecor))
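For context, a minimal end-to-end sketch of the renamed workflow; it assumes only the FRESA.CAD calls that appear in this commit (IDeA(), predictDecorrelate(), getLatentCoefficients()) plus an illustrative train/test split:

```r
library("FRESA.CAD")

data('iris')

## Illustrative split: hold out one third of the rows as a test set
set.seed(42)
testIdx  <- sample(nrow(iris), nrow(iris) %/% 3)
trainSet <- iris[-testIdx, ]
testSet  <- iris[testIdx, ]

## Learn the decorrelation (UPSTM) on the training rows at the 0.25 threshold
irisDecor <- IDeA(trainSet, thr = 0.25)

## Apply the learned transformation to the held-out rows
irisTestDecor <- predictDecorrelate(irisDecor, testSet)

## Inspect the latent-variable definitions estimated by IDeA
print(getLatentCoefficients(irisDecor))
```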
@@ -77,9 +77,9 @@ This repository show some examples of the **FRESA.CAD::GDSTMDecorrelation(), FRE
 - **irisexample.R** showcase the effect of the HMCA algorithm on the iris data set.

 - Here an example of the output
-- ![](images/paste-8B4C5746.png)
+- ![](images/paste-AB4FBF9C.png)

-- ![](images/paste-AF234B49.png)
+- ![](images/paste-913BE963.png)

 - **ParkisonAnalysis_TrainTest.Rmd** is a demo shows the use of GDSTM and BSWiMS to gain insight of the features associated with a relevant outcome. Highlight process and functions that will aid authors to discern and statistically describe the relevant features associated with an specific outcome.

RMD/COVID_19_TrainTest.Rmd

Lines changed: 11 additions & 11 deletions
@@ -21,17 +21,17 @@ knitr::opts_chunk$set(collapse = TRUE, warning = FALSE, message = FALSE,comment

 ```

-# Effect of GDSTM-Based Decorrelation on Feature Discovery
+# Effect of UPSTM-Based Decorrelation on Feature Discovery

-Here I showcase of to use BSWiMS feature selection/modeling function coupled with Goal Driven Sparse Transformation Matrix (GDSTM) as a pre-processing step to decorrelate highly correlated features. The aim(s) are:
+Here I showcase how to use the BSWiMS feature selection/modeling function coupled with the Unit of Measurement Preserving Spatial Transformation Matrix (UPSTM) as a pre-processing step to decorrelate highly correlated features. The aims are:

 1. To improve model performance by uncovering the hidden information between correlated features.

 2. To simplify the interpretation of the machine learning models.

 This demo will use:

-- *FRESA.CAD::GDSTMDecorrelation()*. For Decorrelation of Multidimensional data sets
+- *FRESA.CAD::IDeA()*. For Decorrelation of Multidimensional data sets

 - *FRESA.CAD::getDerivedCoefficients()*. For the extraction of the model of the newly discovered of decorrelated features.

@@ -131,15 +131,15 @@ pander::pander(table(testSet$PCR_result))

 #### Decorrelation: Training and Testing Sets Creation

-I compute a decorrelated version of the training and testing sets using the *GDSTMDecorrelation()* function of FRESA.CAD. The first decorrelation will be driven by features associated with the outcome. The second decorrelation will find the GDSTM without the outcome restriction.
+I compute a decorrelated version of the training and testing sets using the *IDeA()* function of FRESA.CAD. The first decorrelation will be driven by features associated with the outcome. The second decorrelation will find the UPSTM without the outcome restriction.

 ```{r results = "asis", warning = FALSE, dpi=600, fig.height= 6.0, fig.width= 8.0}
-## The GDSTM transformation driven by the Outcome
-deTrain <- GDSTMDecorrelation(trainSet,Outcome="PCR_result",thr=0.8,verbose = TRUE)
+## The UPSTM transformation driven by the Outcome
+deTrain <- IDeA(trainSet,Outcome="PCR_result",thr=0.8,verbose = TRUE)
 deTest <- predictDecorrelate(deTrain,testSet)

-## The GDSTM transformation without outcome
-deTrainU <- GDSTMDecorrelation(trainSet,thr=0.8,verbose = TRUE)
+## The UPSTM transformation without outcome
+deTrainU <- IDeA(trainSet,thr=0.8,verbose = TRUE)
 deTestU <- predictDecorrelate(deTrainU,testSet)

```
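As a hedged follow-up to the chunk above, the linear definitions of the newly created decorrelated features can be listed with the accessor shown in the repository README; the exact return structure is an assumption here:

```r
## Hedged sketch: list the latent-variable formulas learned on the training set
## (deTrain comes from the chunk above; one entry per decorrelated feature is assumed)
latentDefs <- getLatentCoefficients(deTrain)
length(latentDefs)
print(head(latentDefs, 3))
```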
@@ -155,7 +155,7 @@ gplots::heatmap.2(abs(cormat),
 scale = "none",
 mar = c(10,10),
 col=rev(heat.colors(5)),
-main = "Test Set Correlation after GDSTM",
+main = "Test Set Correlation after UPSTM",
 cexRow = 0.35,
 cexCol = 0.35,
 key.title=NA,
@@ -198,7 +198,7 @@ cvBSWiMSDeCor <- randomCV(COVID_19_MS,
 DECOR.control=list(Outcome="PCR_result",thr=0.8)
 )

-bpDecor <- predictionStats_binary(cvBSWiMSDeCor$medianTest,"BSWiMS Outcome-Driven GDSTM",cex=0.60)
+bpDecor <- predictionStats_binary(cvBSWiMSDeCor$medianTest,"BSWiMS Outcome-Driven UPSTM",cex=0.60)
 pander::pander(bpDecor$CM.analysis$tab)
 pander::pander(bpDecor$accc)
 pander::pander(bpDecor$aucs)
@@ -228,7 +228,7 @@ cvBSWiMSDeCorU <- randomCV(COVID_19_MS,
 DECOR.control=list(thr=0.8)
 )

-bpDecorU <- predictionStats_binary(cvBSWiMSDeCorU$medianTest,"BSWiMS Data Driven GDSTM",cex=0.60)
+bpDecorU <- predictionStats_binary(cvBSWiMSDeCorU$medianTest,"BSWiMS Data Driven UPSTM",cex=0.60)
 pander::pander(bpDecorU$CM.analysis$tab)
 pander::pander(bpDecorU$accc)
 pander::pander(bpDecorU$aucs)

RMD/FDeA_ML_testing_ARCENE.Rmd

Lines changed: 4 additions & 4 deletions
@@ -1,13 +1,13 @@
 ---
-title: 'Filtered Fit: FDeA and the GDSTM'
+title: 'Filtered Fit: FDeA and the UPSTM'
 output:
   html_document:
     df_print: paged
 ---

-## Filtered ML fit and the GDSTM with FRESA.CAD
+## Filtered ML fit and the UPSTM with FRESA.CAD

-Here we make use of the **FRESA.CAD::filteredfit()** function to train ML models with and without GDSTM on the ARCENE data set.
+Here we make use of the **FRESA.CAD::filteredfit()** function to train ML models with and without UPSTM on the ARCENE data set.

 > Isabelle Guyon, Steve R. Gunn, Asa Ben-Hur, Gideon Dror, 2004. Result analysis of the NIPS 2003 feature selection challenge. In: NIPS. [$$Web Link$$](http://books.nips.cc/papers/files/nips17/NIPS2004_0194.pdf). *from: <https://archive.ics.uci.edu/ml/datasets/Arcene>*
 >
@@ -138,7 +138,7 @@ pander::pander(psRaw$aucs)

 psDecor <- predictionStats_binary(cbind(datasetframe_test$Labels,
 predict(mLASSODecor,datasetframe_test)),
-"LASSO after GDSTM",cex=0.75)
+"LASSO after UPSTM",cex=0.75)
 pander::pander(psDecor$aucs)

RMD/FDeA_ML_testing_DARWIN.Rmd

Lines changed: 11 additions & 11 deletions
@@ -1,5 +1,5 @@
 ---
-title: "FDeA and the GDSTM: DARWIN Data Set"
+title: "FDeA and the UPSTM: DARWIN Data Set"
 output:
   html_document:
     df_print: paged
@@ -14,17 +14,17 @@ knitr::opts_chunk$set(collapse = TRUE, warning = FALSE, message = FALSE,comment

 ```

-# Effect of GDSTM-Based Decorrelation on Feature Discovery: The DARWIN Evaluation
+# Effect of UPSTM-Based Decorrelation on Feature Discovery: The DARWIN Evaluation

-Here I showcase of to use BSWiMS feature selection/modeling function coupled with Goal Driven Sparse Transformation Matrix (GDSTM) as a pre-processing step to decorrelate highly correlated features. The aim(s) are:
+Here I showcase how to use the BSWiMS feature selection/modeling function coupled with the Unit of Measurement Preserving Spatial Transformation Matrix (UPSTM) as a pre-processing step to decorrelate highly correlated features. The aims are:

 1. To improve model performance by uncovering the hidden information between correlated features.

 2. To simplify the interpretation of the machine learning models.

 This demo will use:

-- FRESA.CAD::GDSTMDecorrelation(). For Decorrelation of Multidimensional data sets
+- FRESA.CAD::IDeA(). For Decorrelation of Multidimensional data sets

 - FRESA.CAD::getDerivedCoefficients(). For the extraction of the model of the newly discovered of decorrelated features.

@@ -141,15 +141,15 @@ pander::pander(table(testSet$class))

 #### Decorrelation: Training and Testing Sets Creation

-I compute a decorrelated version of the training and testing sets using the *GDSTMDecorrelation()* function of FRESA.CAD. The first decorrelation will be driven by features associated with the outcome. The second decorrelation will find the GDSTM without the outcome restriction.
+I compute a decorrelated version of the training and testing sets using the *IDeA()* function of FRESA.CAD. The first decorrelation will be driven by features associated with the outcome. The second decorrelation will find the UPSTM without the outcome restriction.

 ```{r results = "asis", warning = FALSE, dpi=600, fig.height= 6.0, fig.width= 8.0}
-## The GDSTM transformation driven by the Outcome
-deTrain <- GDSTMDecorrelation(trainSet,Outcome="class",thr=0.8,verbose = TRUE,skipRelaxed=FALSE)
+## The UPSTM transformation driven by the Outcome
+deTrain <- IDeA(trainSet,Outcome="class",thr=0.8,verbose = TRUE,skipRelaxed=FALSE)
 deTest <- predictDecorrelate(deTrain,testSet)

-## The GDSTM transformation without outcome
-deTrainU <- GDSTMDecorrelation(trainSet,thr=0.8,verbose = TRUE,skipRelaxed=FALSE)
+## The UPSTM transformation without outcome
+deTrainU <- IDeA(trainSet,thr=0.8,verbose = TRUE,skipRelaxed=FALSE)
 deTestU <- predictDecorrelate(deTrainU,testSet)

```
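The cormat object visualized in the next hunk is built in a part of the file that this diff does not show; a hedged sketch of one plausible way to compute it from the decorrelated test set (the column selection and correlation method are assumptions):

```r
## Hedged sketch (not the file's exact code): correlate only the numeric
## columns of the decorrelated DARWIN test set produced above
numericCols <- sapply(deTest, is.numeric)
cormat <- cor(deTest[, numericCols])   ## Pearson correlation by default
```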
@@ -166,7 +166,7 @@ gplots::heatmap.2(abs(cormat),
 scale = "none",
 mar = c(10,10),
 col=rev(heat.colors(5)),
-main = "Test Set Correlation after GDSTM",
+main = "Test Set Correlation after UPSTM",
 cexRow = 0.45,
 cexCol = 0.45,
 key.title=NA,
@@ -209,7 +209,7 @@ cvBSWiMSDeCor <- randomCV(DARWIN,
 DECOR.control=list(Outcome="class",thr=0.8,skipRelaxed=FALSE)
 )

-bpDecor <- predictionStats_binary(cvBSWiMSDeCor$medianTest,"Outcome-Driven GDSTM",cex=0.60)
+bpDecor <- predictionStats_binary(cvBSWiMSDeCor$medianTest,"Outcome-Driven UPSTM",cex=0.60)
 pander::pander(bpDecor$CM.analysis$tab)
 pander::pander(bpDecor$accc)
 pander::pander(bpDecor$aucs)

RMD/FDeA_ML_testing_sonar.Rmd

Lines changed: 27 additions & 27 deletions
@@ -1,15 +1,15 @@
 ---
-title: "FDeA and the GDSTM: Sonar Tests"
+title: "FDeA and the UPSTM: Sonar Tests"
 output:
   html_document:
     df_print: paged
 editor_options:
   chunk_output_type: console
 ---

-## Filtered ML fit and the GDSTM with FRESA.CAD
+## Filtered ML fit and the UPSTM with FRESA.CAD

-Here we make use of the **FRESA.CAD::filteredfit()** function to train ML models with and without GDSTM.
+Here we make use of the **FRESA.CAD::filteredfit()** function to train ML models with and without UPSTM.

 Naive-Bayes (NB) and LASSO models are used in this demo.

@@ -214,13 +214,13 @@ pander::pander(psPCA$aucs)
 AllRocAUC <- rbind(AllRocAUC,psPCA$aucs)

 psDecor <- predictionStats_binary(cbind(classoutcomes,prDecor),
-"NB GDSTM",cex=0.75)
+"NB UPSTM",cex=0.75)
 pander::pander(psDecor$aucs)
 AllRocAUC <- rbind(AllRocAUC,psDecor$aucs);


 psDecor2 <- predictionStats_binary(cbind(classoutcomes,prDecor2),
-"NB GDSTM Spearman",cex=0.75)
+"NB UPSTM Spearman",cex=0.75)
 pander::pander(psDecor2$aucs)
 AllRocAUC <- rbind(AllRocAUC,psDecor2$aucs);

@@ -239,14 +239,14 @@ AllRocAUC <- rbind(AllRocAUC,psPCA$aucs)

 psDecor <- predictionStats_binary(cbind(classoutcomes,
 predict(mLASSODecor,datasetframe_test)),
-"LASSO GDSTM",cex=0.75)
+"LASSO UPSTM",cex=0.75)
 pander::pander(psDecor$aucs)
 AllRocAUC <- rbind(AllRocAUC,psDecor$aucs);


 psDecor2 <- predictionStats_binary(cbind(classoutcomes,
 predict(mLASSODecor2,datasetframe_test)),
-"LASSO GDSTM Spearman",cex=0.75)
+"LASSO UPSTM Spearman",cex=0.75)
 pander::pander(psDecor2$aucs)
 AllRocAUC <- rbind(AllRocAUC,psDecor2$aucs);

@@ -256,8 +256,8 @@ AllRocAUC <- rbind(AllRocAUC,psDecor2$aucs);

 ```{r results = "asis", warning = FALSE, dpi=600, fig.height= 6.0, fig.width= 8.0}

-rownames(AllRocAUC) <- c("NB:Raw","NB:PCA","NB:GDSTM_P","NB:GDSTM_S",
-"LASSO:Raw","LASSO:PCA","LASSO:GDSTM_P","LASSO:GDSTM_S")
+rownames(AllRocAUC) <- c("NB:Raw","NB:PCA","NB:UPSTM_P","NB:UPSTM_S",
+"LASSO:Raw","LASSO:PCA","LASSO:UPSTM_P","LASSO:UPSTM_S")
 pander::pander(AllRocAUC)
 bpROCAUC <- barPlotCiError(as.matrix(AllRocAUC),
 metricname = "ROCAUC",
@@ -273,33 +273,33 @@ bpROCAUC <- barPlotCiError(as.matrix(AllRocAUC),

 ```

-## Visualization of GDSTM
+## Visualization of UPSTM

-The GDSTM is stored in the filteredFit() object. Hence, we can analyze and display the matrix.
+The UPSTM is stored in the filteredFit() object. Hence, we can analyze and display the matrix.

 ```{r results = "asis", warning = FALSE, dpi=600, fig.height= 6.0, fig.width= 8.0}

-gplots::heatmap.2(mNBDecor$GDSTM,
+gplots::heatmap.2(mNBDecor$UPSTM,
 trace = "none",
 mar = c(10,10),
 col=rev(heat.colors(7)),
-main = paste("GDSTM Matrix (Pearson, LM):",studyName),
+main = paste("UPSTM Matrix (Pearson, LM):",studyName),
 cexRow = 0.7,
 cexCol = 0.7,
 key.title=NA,
 key.xlab="beta",
-xlab="GDSTM Feature", ylab="Input Feature")
+xlab="UPSTM Feature", ylab="Input Feature")

-gplots::heatmap.2(mNBDecor2$GDSTM,
+gplots::heatmap.2(mNBDecor2$UPSTM,
 trace = "none",
 mar = c(10,10),
 col=rev(heat.colors(7)),
-main = paste("GDSTM Matrix (Spearman, RLM):",studyName),
+main = paste("UPSTM Matrix (Spearman, RLM):",studyName),
 cexRow = 0.7,
 cexCol = 0.7,
 key.title=NA,
 key.xlab="beta",
-xlab="GDSTM Feature", ylab="Input Feature")
+xlab="UPSTM Feature", ylab="Input Feature")
```
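A small, hedged follow-up sketch, assuming mNBDecor$UPSTM is a plain numeric matrix as the heatmap.2() calls above imply (the row/column orientation is also an assumption):

```r
## Hedged sketch: quantify how sparse the fitted UPSTM is, beyond the heatmaps
W <- mNBDecor$UPSTM
dim(W)                    ## assumed: input features by transformed features
mean(W != 0)              ## fraction of non-zero weights in the transformation
sum(colSums(W != 0) > 1)  ## transformed features that mix more than one input
```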

 ## Repeated Holdout Cross-Validation
@@ -378,8 +378,8 @@ The Aggregated Test Results
 par(mfrow=c(2,2))
 bpraw <- predictionStats_binary(cvNBRaw$testPredictions,"NB RAW",cex=0.70)
 bpPCA <- predictionStats_binary(cvNBPCA$testPredictions,"NB PCA",cex=0.70)
-bpdecor <- predictionStats_binary(cvNBDecor$testPredictions,"NB GDSTM",cex=0.70)
-bpdecorC <- predictionStats_binary(cvNBDecorC$testPredictions,"NB GDSTM Outcome Driven",cex=0.70)
+bpdecor <- predictionStats_binary(cvNBDecor$testPredictions,"NB UPSTM",cex=0.70)
+bpdecorC <- predictionStats_binary(cvNBDecorC$testPredictions,"NB UPSTM Outcome Driven",cex=0.70)

 pander::pander(bpraw$aucs)
 pander::pander(bpPCA$aucs)
@@ -590,18 +590,18 @@ pander::pander(psPCA$aucs)
 AllRocAUC <- rbind(AllRocAUC,psPCA$aucs)

 psDecor <- predictionStats_binary(cbind(classoutcomes,prDecor),
-"NB GDSTM",cex=0.75)
+"NB UPSTM",cex=0.75)
 pander::pander(psDecor$aucs)
 AllRocAUC <- rbind(AllRocAUC,psDecor$aucs);


 psDecor2 <- predictionStats_binary(cbind(classoutcomes,prDecor2),
-"NB GDSTM Spearman",cex=0.75)
+"NB UPSTM Spearman",cex=0.75)
 pander::pander(psDecor2$aucs)
 AllRocAUC <- rbind(AllRocAUC,psDecor2$aucs);

 psDecorD <- predictionStats_binary(cbind(classoutcomes,prDecorD),
-"NB GDSTMD Spearman",cex=0.75)
+"NB UPSTMD Spearman",cex=0.75)
 pander::pander(psDecorD$aucs)
 AllRocAUC <- rbind(AllRocAUC,psDecorD$aucs);

@@ -620,19 +620,19 @@ AllRocAUC <- rbind(AllRocAUC,psPCA$aucs)

 psDecor <- predictionStats_binary(cbind(classoutcomes,
 predict(mLASSODecor,datasetframe_test)),
-"LASSO GDSTM",cex=0.75)
+"LASSO UPSTM",cex=0.75)
 pander::pander(psDecor$aucs)
 AllRocAUC <- rbind(AllRocAUC,psDecor$aucs);

 psDecorD <- predictionStats_binary(cbind(classoutcomes,
 predict(mLASSODecorD,datasetframe_test)),
-"LASSO GDSTMD",cex=0.75)
+"LASSO UPSTMD",cex=0.75)
 pander::pander(psDecorD$aucs)
 AllRocAUC <- rbind(AllRocAUC,psDecorD$aucs);

 psDecor2 <- predictionStats_binary(cbind(classoutcomes,
 predict(mLASSODecor2,datasetframe_test)),
-"LASSO GDSTM Spearman",cex=0.75)
+"LASSO UPSTM Spearman",cex=0.75)
 pander::pander(psDecor2$aucs)
 AllRocAUC <- rbind(AllRocAUC,psDecor2$aucs);

@@ -642,8 +642,8 @@ AllRocAUC <- rbind(AllRocAUC,psDecor2$aucs);

 ```{r results = "asis", warning = FALSE, dpi=600, fig.height= 6.0, fig.width= 8.0}

-rownames(AllRocAUC) <- c("NB:Raw","NB:PCA","NB:GDSTM_P","NB:GDSTMD_P","NB:GDSTM_S",
-"LASSO:Raw","LASSO:PCA","LASSO:GDSTM_P","LASSO:GDSTMD_P","LASSO:GDSTM_S")
+rownames(AllRocAUC) <- c("NB:Raw","NB:PCA","NB:UPSTM_P","NB:UPSTMD_P","NB:UPSTM_S",
+"LASSO:Raw","LASSO:PCA","LASSO:UPSTM_P","LASSO:UPSTMD_P","LASSO:UPSTM_S")
 pander::pander(AllRocAUC)
 bpROCAUC <- barPlotCiError(as.matrix(AllRocAUC),
 metricname = "ROCAUC",
