🔬 Research: Transformers with JAX and Flax - T5, BERT, RoBERTa and more #5167
Unanswered
8bitmp3 asked this question in Show and tell
Just wanted to highlight some work by others, in case anyone has missed it (a cross-post from google/flax/discussions).

T5 (Text-To-Text Transfer Transformer)
As demoed by the JAX team (@skye @jekbradbury) at NeurIPS 2020 this week, T5 (Text-To-Text Transfer Transformer) JAX code is available: https://github.com/google-research/google-research/tree/master/flax_models/t5x - built with Flax.
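None of the linked repos' code is reproduced here, but as a rough illustration of what these models compute, below is a minimal sketch of scaled dot-product attention, the core operation shared by T5, BERT, and RoBERTa, written in plain JAX. The function name and toy shapes are my own choices, not from the t5x or Flax codebases:

```python
import jax
import jax.numpy as jnp

def scaled_dot_product_attention(q, k, v):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    d_k = q.shape[-1]
    scores = q @ k.swapaxes(-2, -1) / jnp.sqrt(d_k)
    weights = jax.nn.softmax(scores, axis=-1)  # rows sum to 1
    return weights @ v, weights

# Toy example: 4 query positions attending over 6 key/value positions, dim 8.
key = jax.random.PRNGKey(0)
q = jax.random.normal(key, (4, 8))
k = jax.random.normal(key, (6, 8))
v = jax.random.normal(key, (6, 8))
out, w = scaled_dot_product_attention(q, k, v)
print(out.shape)  # (4, 8)
```

The real models wrap this in multi-head projections and stack it with feed-forward layers, all expressed as Flax modules so they can be jit-compiled and sharded by JAX.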
HuggingFace Transformers
And, in case anyone has missed the November 2020 announcement by @avital:
https://twitter.com/avitaloliver/status/1326986383983063058
https://twitter.com/MorganFunto/status/1326988188469039104