Why fine-tune BERT instead of using it for feature extraction? #59
Unanswered
manohar1989 asked this question in Q&A
Replies: 1 comment, 4 replies
-
Hey there, do you mind sharing the link where you're reading this, so I can learn more too? There are many different models of BERT on TensorFlow Hub. Though it seems like you've tried it and it's working?
-
I searched TF Hub and found that BERT is supposed to be fine-tuned, with all parameters updated, but I did not understand why. The features, i.e. the bidirectional representations of words, should remain the same, at least for English.
I tried it both with and without freezing the layers, and the output with the layers frozen seems better for my use case.
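For anyone wanting to compare the two setups, here is a minimal sketch using a TF2 BERT SavedModel from TensorFlow Hub. The specific model handles are just examples and the dense head is a placeholder; the key point is that `trainable=False` on `hub.KerasLayer` freezes the encoder (feature extraction), while `trainable=True` fine-tunes all of its weights:

```python
import tensorflow as tf
import tensorflow_hub as hub
import tensorflow_text  # noqa: F401 -- registers ops the BERT preprocessor needs

# Example TF Hub handles; any matching BERT preprocessor/encoder pair works.
PREPROCESS_HANDLE = "https://tfhub.dev/tensorflow/bert_en_uncased_preprocess/3"
ENCODER_HANDLE = "https://tfhub.dev/tensorflow/bert_en_uncased_L-12_H-768_A-12/4"

def build_model(freeze_bert: bool) -> tf.keras.Model:
    """Binary classifier on top of BERT.

    freeze_bert=True  -> feature extraction: BERT weights stay fixed,
                         only the dense head below is trained.
    freeze_bert=False -> fine-tuning: every BERT parameter is updated.
    """
    text = tf.keras.layers.Input(shape=(), dtype=tf.string, name="text")
    preprocessed = hub.KerasLayer(PREPROCESS_HANDLE)(text)
    encoder = hub.KerasLayer(ENCODER_HANDLE, trainable=not freeze_bert)
    pooled = encoder(preprocessed)["pooled_output"]  # [batch, 768] embedding
    logits = tf.keras.layers.Dense(1, name="classifier")(pooled)
    return tf.keras.Model(text, logits)

# Frozen encoder: fast and cheap, and acts as a strong regularizer.
frozen = build_model(freeze_bert=True)

# Fine-tuned encoder: usually trained with a small learning rate
# (e.g. 2e-5 to 5e-5) so the pre-trained weights are not wrecked.
finetuned = build_model(freeze_bert=False)
```

Which one wins is empirical: freezing regularizes strongly, which can help on small datasets (possibly why the frozen variant worked better here), while fine-tuning lets the representations adapt to the task and usually helps when there is enough labeled data.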