FastConformer-Longformer #9991
Unanswered
GrigorKirakosyan
asked this question in Q&A
Replies: 2 comments
-
We released the 1B model, but not the large model, here: https://huggingface.co/spaces/nvidia/parakeet-tdt_ctc-1.1b
-
Thanks, I have seen this. I was interested to know about the Large FastConformer (~120M) model trained with this config:
-
Hi NeMo team,
Do you plan to release an English (En) Large FastConformer-Long-CTC-BPE model trained with local attention and a global token?
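For context, "local attention with a global token" (Longformer-style) means each frame attends only to a fixed window of neighbors, while one designated token attends to, and is attended by, every position. A minimal NumPy sketch of such an attention mask, purely illustrative and not NeMo's actual implementation (`longformer_mask`, `window`, and `global_idx` are hypothetical names):

```python
import numpy as np

def longformer_mask(seq_len: int, window: int, global_idx: int = 0) -> np.ndarray:
    """Boolean attention mask: True where attention is allowed.

    Each position attends to neighbors within `window` steps on either side;
    the token at `global_idx` attends to, and is attended by, all positions.
    Illustrative sketch only, not NeMo's implementation.
    """
    i = np.arange(seq_len)
    # Banded local-attention pattern: |query - key| <= window.
    mask = np.abs(i[:, None] - i[None, :]) <= window
    mask[global_idx, :] = True  # global token attends everywhere
    mask[:, global_idx] = True  # every position attends to the global token
    return mask

m = longformer_mask(6, 1)
```

With `seq_len=6` and `window=1`, position 2 can see positions 1–3 plus the global token at index 0, but not position 5.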