Releases · usamahz/transformer
Transformer-v1.0
Transformer from Scratch
The Transformer architecture, introduced in the paper "Attention Is All You Need," has become a cornerstone of many natural language processing tasks. This project implements a Transformer model from scratch using PyTorch.
Model Architecture
The Transformer model consists of the following components (two of them are illustrated in the sketch after this list):
- Encoder
- Decoder
- Multi-Head Attention
- Position-wise Feed-Forward Networks
- Positional Encoding
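To show how two of these components can be wired up in PyTorch, here is a minimal sketch of sinusoidal positional encoding and multi-head attention. The class and parameter names (`PositionalEncoding`, `MultiHeadAttention`, `d_model`, `num_heads`, `max_len`) are illustrative assumptions and may not match the names used in this repository.

```python
# Minimal sketch of two Transformer components in PyTorch.
# Names and hyperparameters below are assumptions for illustration,
# not necessarily those used in usamahz/transformer.
import math
import torch
import torch.nn as nn


class PositionalEncoding(nn.Module):
    """Adds fixed sinusoidal positional encodings to token embeddings."""

    def __init__(self, d_model: int, max_len: int = 5000):
        super().__init__()
        position = torch.arange(max_len).unsqueeze(1)  # (max_len, 1)
        div_term = torch.exp(torch.arange(0, d_model, 2) * (-math.log(10000.0) / d_model))
        pe = torch.zeros(max_len, d_model)
        pe[:, 0::2] = torch.sin(position * div_term)   # even dimensions
        pe[:, 1::2] = torch.cos(position * div_term)   # odd dimensions
        self.register_buffer("pe", pe)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model)
        return x + self.pe[: x.size(1)]


class MultiHeadAttention(nn.Module):
    """Scaled dot-product attention computed across several heads in parallel."""

    def __init__(self, d_model: int, num_heads: int):
        super().__init__()
        assert d_model % num_heads == 0
        self.d_head = d_model // num_heads
        self.num_heads = num_heads
        self.q_proj = nn.Linear(d_model, d_model)
        self.k_proj = nn.Linear(d_model, d_model)
        self.v_proj = nn.Linear(d_model, d_model)
        self.out_proj = nn.Linear(d_model, d_model)

    def forward(self, query, key, value, mask=None):
        batch, q_len, _ = query.shape
        # Project, then split the model dimension into (heads, d_head).
        q = self.q_proj(query).view(batch, q_len, self.num_heads, self.d_head).transpose(1, 2)
        k = self.k_proj(key).view(batch, key.size(1), self.num_heads, self.d_head).transpose(1, 2)
        v = self.v_proj(value).view(batch, value.size(1), self.num_heads, self.d_head).transpose(1, 2)

        # Scaled dot-product attention per head.
        scores = q @ k.transpose(-2, -1) / math.sqrt(self.d_head)
        if mask is not None:
            scores = scores.masked_fill(mask == 0, float("-inf"))
        attn = scores.softmax(dim=-1)
        out = (attn @ v).transpose(1, 2).reshape(batch, q_len, -1)  # merge heads
        return self.out_proj(out)


if __name__ == "__main__":
    x = torch.randn(2, 10, 64)                      # (batch, seq_len, d_model)
    x = PositionalEncoding(d_model=64)(x)
    y = MultiHeadAttention(d_model=64, num_heads=8)(x, x, x)
    print(y.shape)                                  # torch.Size([2, 10, 64])
```

In the full model, blocks like these are stacked inside the encoder and decoder, with the position-wise feed-forward network, residual connections, and layer normalization around each sub-layer, as described in the original paper.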
Full Changelog: https://github.com/usamahz/transformer/commits/v1.0