Commit 7f73d88

Update README.md
1 parent c8fcd5b commit 7f73d88

File tree

1 file changed: +2 −3 lines


README.md

Lines changed: 2 additions & 3 deletions
@@ -510,8 +510,7 @@ Our model achieves the following performance:
 > 📋 Include a table of results from your paper, and link back to the leaderboard for clarity and context. If your main result is a figure, include that figure and link to the command or notebook to reproduce it.

 ## Future work
-- Since the huge size of the para dataset compared to the sizes of the sst and sts datasets leads to overfitting, enlarging the sst and sts datasets should reduce the possibility of overfitting.
-This could be achieved by generating more (true) data for the sst and sts datasets, which is possible by adding an additional task.
+- Since the huge size of the para dataset compared to the sizes of the sst and sts datasets leads to overfitting, enlarging the sst and sts datasets should reduce the possibility of overfitting. This could be achieved by generating more (true) data for the sst and sts datasets, which is possible by adding an additional task; see issue #60 for more details.
 - Give the other losses different weights,
 - with or without combined losses,
 - maybe based on the dev_acc performance in the previous epoch.
@@ -543,4 +542,4 @@ python -u multitask_classifier.py --use_gpu --option finetune --optimizer "soph
 To train the Sophia model with weighted loss and separate paraphrasing training, run:
 ```
 python -u multitask_classifier.py --use_gpu --option finetune --optimizer "sophiag" --epochs 5 --weights True --para_sep True --hidden_dropout_prob_para 0 --hidden_dropout_prob_sst 0 --hidden_dropout_prob_sts 0 --lr_para 1e-05 --lr_sst 1e-05 --lr_sts 1e-05 --batch_size 64 --weight_decay_para 0.1267 --weight_decay_sst 0.2302 --weight_decay_sts 0.1384 --rho_para 0.0417 --rho_sst 0.0449 --rho_sts 0.0315 --comment weighted_loss_without_dropout
-```
+```
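The future-work idea of giving the task losses different weights based on the previous epoch's dev accuracy could be sketched as below. The three-task setup (para, sst, sts) mirrors the README, but the function names and the specific weighting scheme are illustrative assumptions, not the repository's implementation:

```python
# Illustrative sketch (not the repository's code): combine per-task losses
# with weights derived from the previous epoch's dev accuracies, so that
# tasks the model currently performs worse on contribute more to the total loss.

def accuracy_based_weights(dev_accs):
    """Map last-epoch dev accuracies to loss weights.

    A task with lower accuracy gets a higher weight; weights are
    normalized to sum to the number of tasks so the overall loss
    scale stays comparable across epochs.
    """
    errors = {task: 1.0 - acc for task, acc in dev_accs.items()}
    total = sum(errors.values()) or 1.0  # avoid division by zero at 100% accuracy
    n = len(errors)
    return {task: n * err / total for task, err in errors.items()}


def combined_loss(task_losses, weights):
    """Weighted sum of the per-task losses."""
    return sum(weights[task] * loss for task, loss in task_losses.items())


# Hypothetical dev accuracies from the previous epoch for the three tasks:
dev_accs = {"para": 0.85, "sst": 0.50, "sts": 0.40}
weights = accuracy_based_weights(dev_accs)

# Hypothetical per-task losses for the current batch:
loss = combined_loss({"para": 2.0, "sst": 1.5, "sts": 1.0}, weights)
```

Under this scheme the sts task, having the lowest dev accuracy, receives the largest weight, which is one simple way to realize the "weights based on dev_acc performance in the previous epoch" idea above.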
