Tips for Unbalanced Datasets #548
Unanswered
lwachowiak asked this question in Q&A
Replies: 1 comment
I had better experiences with weighting the loss (weighting either the samples themselves or the classes), or changing to another loss such as the focal loss.
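The two ideas in this reply can be sketched in plain NumPy. The helper names below (`class_weights`, `focal_loss`) are illustrative, not from any library; in practice a framework's built-in hook would be used instead (e.g. PyTorch's `CrossEntropyLoss(weight=...)` accepts per-class weights directly).

```python
import numpy as np

def class_weights(labels):
    """Inverse-frequency class weights: rare classes get larger weights.
    Normalized so the weights average to 1 across classes."""
    classes, counts = np.unique(labels, return_counts=True)
    w = counts.sum() / (len(classes) * counts)
    return dict(zip(classes, w))

def focal_loss(probs, labels, gamma=2.0, weights=None):
    """Mean focal loss: (1 - p_t)^gamma * -log(p_t).
    gamma=0 reduces to ordinary cross-entropy; larger gamma
    down-weights examples the model already classifies confidently.
    `weights` may be a per-class weight dict for additional balancing."""
    p_t = probs[np.arange(len(labels)), labels]   # probability of the true class
    p_t = np.clip(p_t, 1e-12, 1.0)                # guard against log(0)
    loss = (1.0 - p_t) ** gamma * -np.log(p_t)
    if weights is not None:
        loss = loss * np.array([weights[c] for c in labels])
    return loss.mean()
```

With an 84/11/5 split like the one in the question, inverse-frequency weighting makes errors on class C roughly 17 times as expensive as errors on class A, which counteracts the majority-class bias in the gradient.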
I am working with an unbalanced dataset (class A 84% of samples, class B 11%, class C 5%).
So far, the accuracy of the trained model does not exceed that of a simple majority classifier by much. In your experience, what sampling techniques led to improved results? My training set is also not huge (about 4k samples, generated with a sliding window).
Did you have success with techniques like oversampling, undersampling, or SMOTE?
Did a certain model type perform best with highly unbalanced classes?
Thanks for any tips in advance!
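For reference, the simplest of the techniques asked about above, random oversampling, can be sketched in a few lines of NumPy (the helper name `random_oversample` is illustrative; SMOTE and under-sampling variants are provided ready-made by the imbalanced-learn library):

```python
import numpy as np

def random_oversample(X, y, seed=0):
    """Resample minority classes with replacement until every class
    matches the majority-class count. Returns shuffled copies."""
    rng = np.random.default_rng(seed)
    classes, counts = np.unique(y, return_counts=True)
    target = counts.max()
    idx = []
    for c in classes:
        c_idx = np.flatnonzero(y == c)
        idx.append(c_idx)                 # keep all original samples
        extra = target - len(c_idx)
        if extra > 0:                     # duplicate minority samples at random
            idx.append(rng.choice(c_idx, size=extra, replace=True))
    idx = np.concatenate(idx)
    rng.shuffle(idx)
    return X[idx], y[idx]
```

One caveat worth noting for sliding-window data: resample only after the train/validation split, otherwise duplicated (and, with overlapping windows, near-identical) samples can leak into the validation set and inflate the reported accuracy.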