Releases: usamahz/transformer

Transformer-v1.0

22 Sep 23:11

Transformer from Scratch

The Transformer architecture, introduced in the paper "Attention Is All You Need," has become a cornerstone of many natural language processing tasks. This project implements a Transformer model from scratch using PyTorch.

Model Architecture

The Transformer model consists of the following components:

  1. Encoder
  2. Decoder
  3. Multi-Head Attention
  4. Position-wise Feed-Forward Networks
  5. Positional Encoding
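As a rough illustration of how these components fit together, here is a minimal sketch in PyTorch. The sinusoidal positional encoding follows the formula from "Attention Is All You Need"; for brevity the sketch wires the encoder, decoder, multi-head attention, and feed-forward layers together via PyTorch's built-in `nn.Transformer`, whereas this repository implements them from scratch, so the actual module names and APIs in the code will differ.

```python
import math
import torch
import torch.nn as nn

class PositionalEncoding(nn.Module):
    """Sinusoidal positional encoding from 'Attention Is All You Need'."""
    def __init__(self, d_model: int, max_len: int = 5000):
        super().__init__()
        position = torch.arange(max_len).unsqueeze(1)
        div_term = torch.exp(
            torch.arange(0, d_model, 2) * (-math.log(10000.0) / d_model)
        )
        pe = torch.zeros(max_len, d_model)
        pe[:, 0::2] = torch.sin(position * div_term)  # even dims: sine
        pe[:, 1::2] = torch.cos(position * div_term)  # odd dims: cosine
        self.register_buffer("pe", pe)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model); add encodings for the first seq_len positions
        return x + self.pe[: x.size(1)]

# Hypothetical toy sizes, chosen only for the demo.
d_model, nhead, vocab = 64, 4, 100
embed = nn.Embedding(vocab, d_model)
pos_enc = PositionalEncoding(d_model)

# nn.Transformer bundles the encoder/decoder stacks, multi-head attention,
# and position-wise feed-forward networks listed above.
model = nn.Transformer(
    d_model=d_model, nhead=nhead,
    num_encoder_layers=2, num_decoder_layers=2,
    dim_feedforward=128, batch_first=True,
)

src = torch.randint(0, vocab, (2, 10))  # (batch, src_len) token ids
tgt = torch.randint(0, vocab, (2, 7))   # (batch, tgt_len) token ids
out = model(pos_enc(embed(src)), pos_enc(embed(tgt)))
print(out.shape)  # (batch, tgt_len, d_model)
```

The output has one `d_model`-sized vector per target position, which a final linear projection would map back to vocabulary logits.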

Full Changelog: https://github.com/usamahz/transformer/commits/v1.0