Commit f466462

Remove optimizers (#103)

* Changes
* tests/optimization now passes
* Remove _optimizer.py
* Fix the config class for early-stoppable recommenders.
* Rename max_epoch -> train_epochs for the recommenders' parameter.
* Fix tests
* Move the optimizer code under recommenders/
* Fix examples
* Improve the example; fix an import failure

1 parent af09fbf commit f466462
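
The core API change described above: the standalone optuna-backed `*Optimizer` classes are removed, and hyperparameter tuning moves onto the recommender classes themselves. A minimal before/after sketch based on the README diff in this commit, assuming `X_train` and `evaluator` have been prepared as in the README's earlier steps (the `IALSRecommender` import path is an assumption, not shown in this diff):

```Python
# Before this commit (removed API):
#   from irspack import IALSOptimizer
#   optimizer = IALSOptimizer(X_train, evaluator)
#   best_params, trial_dfs = optimizer.optimize(n_trials=20)

# After this commit, tuning is invoked on the recommender class itself.
from irspack import IALSRecommender  # import path assumed

best_params, trial_dfs = IALSRecommender.tune(X_train, evaluator, n_trials=20)

# Per the README, the best ndcg@10 over the trials lands around 0.43 ~ 0.45.
print(trial_dfs["ndcg@10"].max())
```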


48 files changed: +3325 additions, −3983 deletions

.gitignore

Lines changed: 1 addition & 0 deletions

@@ -27,3 +27,4 @@ pybind11
 docs/build/
 docs/source/api_reference/
 .cache/
+.coverage

Readme.md

Lines changed: 4 additions & 7 deletions

@@ -82,7 +82,7 @@ recommender.learn()
 recommender.get_score_remove_seen([0])
 ```
 
-## Step 2. Evaluate on a validation set
+## Step 2. Evaluation on a validation set
 
 To evaluate the performance of a recommenderm we have to split the dataset to train and validation sets:
 
@@ -120,15 +120,12 @@ This will print something like
 }
 ```
 
-## Step 3. Optimize the Hyperparameter
+## Step 3. Hyperparameter optimization
 
-Now that we can evaluate the recommenders' performance against the validation set, we can use [optuna](https://github.com/optuna/optuna)-backed hyperparameter optimizer.
+Now that we can evaluate the recommenders' performance against the validation set, we can use [optuna](https://github.com/optuna/optuna)-backed hyperparameter optimization.
 
 ```Python
-from irspack import IALSOptimizer
-
-optimizer = IALSOptimizer(X_train, evaluator)
-best_params, trial_dfs = optimizer.optimize(n_trials=20)
+best_params, trial_dfs = IALSRecommender.tune(X_train, evaluator, n_trials=20)
 
 # maximal ndcg around 0.43 ~ 0.45
 trial_dfs["ndcg@10"].max()
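
For context, a hypothetical follow-up to the updated README snippet above: refit a recommender on the training matrix using the tuned parameters, then score as in the README's Step 1. This is a sketch under assumptions, since neither the `IALSRecommender` import path nor passing `best_params` as constructor keyword arguments is shown in this diff:

```Python
from irspack import IALSRecommender  # import path assumed

# Optuna-backed tuning, as introduced by this commit.
best_params, trial_dfs = IALSRecommender.tune(X_train, evaluator, n_trials=20)

# Refit on the training matrix with the best parameters found.
# Forwarding best_params as keyword arguments is an assumption about the constructor.
recommender = IALSRecommender(X_train, **best_params)
recommender.learn()

# Score items for user 0, excluding already-seen ones (as in the README above).
scores = recommender.get_score_remove_seen([0])
```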

docs/source/api_reference.rst

Lines changed: 0 additions & 36 deletions

@@ -57,42 +57,6 @@ which requires ``jax``, ``jaxlib``, ``dm-haiku``, and ``optax``:
 
    MultVAERecommender
 
-
-.. currentmodule:: irspack.optimizers
-
-Optimizers
------------
-.. autosummary::
-   :toctree: api_reference
-   :nosignatures:
-
-   BaseOptimizer
-   TopPopOptimizer
-   IALSOptimizer
-   P3alphaOptimizer
-   RP3betaOptimizer
-   TruncatedSVDOptimizer
-   CosineKNNOptimizer
-   AsymmetricCosineKNNOptimizer
-   JaccardKNNOptimizer
-   TverskyIndexKNNOptimizer
-   CosineUserKNNOptimizer
-   AsymmetricCosineUserKNNOptimizer
-   SLIMOptimizer
-   DenseSLIMOptimizer
-   MultVAEOptimizer
-   get_optimizer_class
-
-
-Autopilot
----------
-.. autosummary::
-   :toctree: api_reference
-   :nosignatures:
-
-   autopilot
-
-
 .. currentmodule:: irspack.split
 
 Split Functions
