Transformer Architecture Implementation

A simple TensorFlow/Keras implementation of the Transformer architecture from the paper "Attention Is All You Need".

Features

  • Fully functional Encoder-Decoder architecture
  • Masked self-attention and encoder-decoder attention
  • Feed-forward networks, embeddings, and positional encodings
  • Unit tests for verifying layer outputs

Work in Progress – this implementation is still under development.
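The masked self-attention listed above follows the scaled dot-product attention from the paper. As a rough illustration (not the repository's actual API — function names and shapes here are made up for the sketch), the core computation with a causal mask can be written in plain NumPy:

```python
import numpy as np

def scaled_dot_product_attention(q, k, v, mask=None):
    """Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V, as in the paper."""
    d_k = q.shape[-1]
    scores = q @ k.swapaxes(-1, -2) / np.sqrt(d_k)
    if mask is not None:
        # Masked-out positions get a large negative score so softmax ~ 0.
        scores = np.where(mask, scores, -1e9)
    # Numerically stable softmax over the last axis.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return weights @ v, weights

def causal_mask(seq_len):
    """Lower-triangular mask: position i may attend only to positions <= i."""
    return np.tril(np.ones((seq_len, seq_len), dtype=bool))

# Toy example: sequence of 4 positions, model dimension 8.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
out, w = scaled_dot_product_attention(x, x, x, mask=causal_mask(4))
```

In the decoder, this mask prevents each position from attending to later positions during training; the encoder-decoder attention uses the same computation with queries from the decoder and keys/values from the encoder output.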

References

  • Vaswani, A., et al. "Attention Is All You Need." NeurIPS 2017.
