This repository contains various attention mechanisms such as Bahdanau, soft, additive, and hierarchical attention, implemented in PyTorch, TensorFlow, and Keras.
Chapter 9: Attention and Memory Augmented Networks
Configurable Encoder-Decoder Sequence-to-Sequence model. Built with TensorFlow.
TensorFlow 2.0 tutorials for RNN-based architectures for textual problems.
Image captioning is the process of generating a textual description of an image. It uses both natural language processing and computer vision to generate the captions.
A multi-layer bidirectional seq2seq chatbot with Bahdanau attention.
Seq2seq model implemented with PyTorch, using Bahdanau attention and Luong attention.
Generate captions from images
Master Project on Image Captioning using Supervised Deep Learning Methods
Implementation of GRU-based Encoder-Decoder Architecture with Bahdanau Attention Mechanism for Machine Translation from German to English.
Bangla Conversational Chatbot using Bidirectional LSTM with Attention Mechanism
Implementation of the paper "Neural Machine Translation by Jointly Learning to Align and Translate".
A language translator based on a very simple NLP Transformer model, with an encoder, a decoder, and a Bahdanau attention layer in between, implemented in TensorFlow.
s-atmech is an independent open-source deep learning Python library that implements the attention mechanism as an RNN (recurrent neural network) layer in an encoder-decoder system (currently supports only Bahdanau attention).
Implemented an Encoder-Decoder model in TensorFlow, where ResNet-50 extracts features from the VizWiz-Captions image dataset and a GRU with Bahdanau attention generates captions.
This repository contains an implementation of a neural text simplification model that combines sequence-to-sequence learning with reinforcement learning and lexical-semantic loss. The model aims to simplify complex text while maintaining meaning and grammatical correctness.
Sequence-to-sequence with attention mechanisms in TensorFlow v2.
Solution for the Quora Insincere Questions Classification Kaggle competition.
Caption Images with Machine Learning
Neural Machine Translation (NMT) with pivot and triangulation approaches
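The Bahdanau (additive) attention that most of these projects build on scores each encoder state against the current decoder state with a small feed-forward network, then takes a softmax-weighted sum. A minimal NumPy sketch of that scoring step is below; all names, dimensions, and weight initializations are illustrative, not taken from any of the listed repositories.

```python
import numpy as np

rng = np.random.default_rng(0)
T, d_k, d_q, d_a = 5, 8, 6, 4   # seq length, key dim, query dim, attention dim

# Illustrative random encoder states (keys), decoder state (query), and weights.
keys  = rng.normal(size=(T, d_k))
query = rng.normal(size=(d_q,))
W_q   = rng.normal(size=(d_q, d_a))
W_k   = rng.normal(size=(d_k, d_a))
v     = rng.normal(size=(d_a,))

def bahdanau_attention(query, keys, W_q, W_k, v):
    """Additive (Bahdanau) attention: score_t = v . tanh(W_q q + W_k k_t)."""
    scores = np.tanh(query @ W_q + keys @ W_k) @ v    # (T,) one score per step
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                          # softmax over time steps
    context = weights @ keys                          # (d_k,) weighted sum
    return context, weights

context, weights = bahdanau_attention(query, keys, W_q, W_k, v)
```

In a full encoder-decoder model, `context` is concatenated with the decoder input at each step; frameworks ship this built in (e.g. `tf.keras.layers.AdditiveAttention` in TensorFlow).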