PyTorch implementation of a sequence-to-sequence neural machine translation model with an attention mechanism. Includes data preprocessing, a custom dataset class, an encoder-decoder with a bidirectional LSTM, attention, a training loop, and inference code for translating English sentences into Russian.
ArevikKH/Machine-Translate-Seq2Seq-Attention
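The bidirectional LSTM encoder mentioned in the description can be sketched roughly as follows. This is an illustrative reconstruction, not the repository's actual code: the class name, dimensions, and the projection of the two directional final states are assumptions.

```python
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Bidirectional LSTM encoder (illustrative sketch): embeds source
    tokens and returns per-timestep annotations plus a summary state
    that a decoder could start from."""
    def __init__(self, vocab_size, emb_dim, hid_dim):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hid_dim, bidirectional=True, batch_first=True)
        # Project the concatenated forward/backward final states down to
        # a single decoder-sized hidden state.
        self.fc = nn.Linear(2 * hid_dim, hid_dim)

    def forward(self, src):
        # src: (batch, src_len) token ids
        embedded = self.embedding(src)         # (batch, src_len, emb_dim)
        outputs, (h, c) = self.lstm(embedded)  # outputs: (batch, src_len, 2*hid_dim)
        # h: (2, batch, hid_dim); h[-2] is the forward final state,
        # h[-1] the backward one. Concatenate and project.
        hidden = torch.tanh(self.fc(torch.cat((h[-2], h[-1]), dim=1)))
        return outputs, hidden

enc = Encoder(vocab_size=1000, emb_dim=32, hid_dim=64)
outs, hid = enc(torch.randint(0, 1000, (2, 7)))
print(outs.shape, hid.shape)  # torch.Size([2, 7, 128]) torch.Size([2, 64])
```

The per-timestep `outputs` are what an attention mechanism would score against at each decoding step.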
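The attention mechanism could take a form like the additive (Bahdanau-style) scorer below; whether the repository uses additive or dot-product attention is not stated, so treat the class and its layer names as hypothetical.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Attention(nn.Module):
    """Additive attention over encoder annotations (illustrative sketch)."""
    def __init__(self, enc_dim, dec_dim, attn_dim):
        super().__init__()
        self.attn = nn.Linear(enc_dim + dec_dim, attn_dim)
        self.v = nn.Linear(attn_dim, 1, bias=False)

    def forward(self, dec_hidden, enc_outputs):
        # dec_hidden: (batch, dec_dim); enc_outputs: (batch, src_len, enc_dim)
        src_len = enc_outputs.size(1)
        h = dec_hidden.unsqueeze(1).expand(-1, src_len, -1)
        # Score each source position against the current decoder state.
        energy = torch.tanh(self.attn(torch.cat((enc_outputs, h), dim=2)))
        scores = self.v(energy).squeeze(2)       # (batch, src_len)
        weights = F.softmax(scores, dim=1)       # normalized over source positions
        # Context vector: attention-weighted sum of encoder annotations.
        context = torch.bmm(weights.unsqueeze(1), enc_outputs).squeeze(1)
        return context, weights

attn = Attention(enc_dim=128, dec_dim=64, attn_dim=32)
ctx, w = attn(torch.randn(2, 64), torch.randn(2, 7, 128))
```

Each decoding step would concatenate `ctx` with the decoder input before predicting the next target token.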
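The training loop described above is presumably teacher-forced: at each step the decoder receives the gold token and is scored on predicting the next one. A minimal self-contained sketch, with toy stand-ins (an `nn.LSTMCell` decoder without attention) whose names and sizes are all assumptions:

```python
import torch
import torch.nn as nn

# Toy decoder pieces standing in for the repository's model (hypothetical).
vocab_size, emb_dim, hid_dim, pad_id = 100, 16, 32, 0
embedding = nn.Embedding(vocab_size, emb_dim, padding_idx=pad_id)
cell = nn.LSTMCell(emb_dim, hid_dim)
out_proj = nn.Linear(hid_dim, vocab_size)
criterion = nn.CrossEntropyLoss(ignore_index=pad_id)  # skip padding when scoring
optimizer = torch.optim.Adam(
    [*embedding.parameters(), *cell.parameters(), *out_proj.parameters()]
)

def train_step(tgt):
    """One teacher-forced update: feed gold token t, predict token t+1."""
    batch = tgt.size(0)
    h = torch.zeros(batch, hid_dim)
    c = torch.zeros(batch, hid_dim)
    loss = 0.0
    for t in range(tgt.size(1) - 1):
        h, c = cell(embedding(tgt[:, t]), (h, c))
        logits = out_proj(h)                    # (batch, vocab_size)
        loss = loss + criterion(logits, tgt[:, t + 1])
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item() / (tgt.size(1) - 1)      # mean per-step loss

tgt = torch.randint(1, vocab_size, (4, 6))      # fake target batch
loss_value = train_step(tgt)
```

At inference time the gold token would be replaced by the model's own previous prediction (e.g. greedy argmax) until an end-of-sentence token is produced.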