The results of UniFew on few-shot tasks are very impressive. The current UniFew uses a UnifiedQA pretrained model (T5- or BART-based). Because our pretrained models are BERT-based, I have a couple of questions:
Do you have plans to support UniFew with BERT (or RoBERTa) pretrained models?
If not, what would be required to make UniFew work with BERT-like pretrained models?
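For context, here is a rough sketch of my understanding of the generative QA setup UnifiedQA provides (this is not from the UniFew codebase; the checkpoint name and prompt format are my assumptions). An encoder-only model like BERT cannot generate the answer text this way, so presumably some scoring or classification head would be needed instead:

```python
# Minimal sketch of the seq2seq QA formulation UnifiedQA relies on.
# Checkpoint name and prompt format are assumptions, not taken from UniFew.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "allenai/unifiedqa-t5-base"  # assumed UnifiedQA checkpoint on the HF Hub
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# UnifiedQA-style input: question, answer options, and context in one string.
prompt = (
    "what is the sentiment of this review? \\n "
    "(a) positive (b) negative \\n "
    "the movie was a delight from start to finish."
)

inputs = tokenizer(prompt, return_tensors="pt")
# The answer is generated as free text, which an encoder-only model like
# BERT cannot do without an additional decoder or scoring mechanism.
output_ids = model.generate(**inputs, max_new_tokens=10)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```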
Thanks,
Liwei