
Machine-Translate-Seq2Seq-Attention

A PyTorch implementation of a sequence-to-sequence neural machine translation model with an attention mechanism. Includes data preprocessing, a custom dataset, an encoder-decoder with a bidirectional LSTM, attention, a training loop, and inference for translating English sentences into Russian.
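To illustrate the attention step named above: at each decoding step, the decoder state is scored against every encoder state, the scores are softmaxed into weights, and the weighted sum of encoder states forms a context vector. The sketch below is a plain-Python illustration of that idea (dot-product scoring), not the repository's actual code, which works on PyTorch tensors:

```python
import math

def attention_weights(decoder_state, encoder_states):
    """Dot-product attention: score each encoder state against the
    decoder state, then softmax the scores into weights summing to 1.
    (Illustrative sketch; the repo itself operates on PyTorch tensors.)"""
    scores = [sum(d * e for d, e in zip(decoder_state, s)) for s in encoder_states]
    m = max(scores)                            # subtract max for numerical stability
    exps = [math.exp(x - m) for x in scores]
    total = sum(exps)
    return [x / total for x in exps]

def context_vector(weights, encoder_states):
    """Weighted sum of encoder states -> context vector fed to the decoder."""
    dim = len(encoder_states[0])
    return [sum(w * s[i] for w, s in zip(weights, encoder_states))
            for i in range(dim)]
```

For example, with three encoder states, `attention_weights` returns three non-negative weights that sum to 1, and `context_vector` returns a vector of the same dimensionality as a single encoder state.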
