# gpt

Here are 1,972 public repositories matching this topic...

RWKV (pronounced RwaKuv) is an RNN with strong LLM performance that can also be trained directly like a GPT transformer (parallelizable). The current version is RWKV-7 "Goose". It combines the best of RNNs and transformers: strong performance, linear time, constant space (no KV cache, as sketched below), fast training, infinite ctx_len, and free sentence embedding.

  • Updated Apr 7, 2025
  • Python
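
The constant-space claim comes from replacing attention's growing KV cache with a fixed-size recurrent state that each token is folded into. Below is a minimal NumPy sketch of that general linear-recurrence idea, not RWKV's actual kernel; the names (`d_model`, `decay`, `num`, `den`, `step`) and the specific update rule are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch (not RWKV's actual kernel): a per-channel decaying
# weighted average of past values. State size is fixed at O(d_model),
# so memory stays constant no matter how long the context grows.

d_model = 8
rng = np.random.default_rng(0)
decay = np.exp(-rng.uniform(0.1, 1.0, d_model))  # hypothetical per-channel decay

num = np.zeros(d_model)  # running weighted sum of values
den = np.zeros(d_model)  # running sum of key weights

def step(k, v):
    """Consume one token's key/value vectors; return the mixed output."""
    global num, den
    w = np.exp(k)                     # positive weight from the key
    num = decay * num + w * v         # fold new value into fixed-size state
    den = decay * den + w
    return num / (den + 1e-8)         # decay-weighted average of past values

for t in range(1000):                 # context length grows, memory does not
    k, v = rng.standard_normal(d_model), rng.standard_normal(d_model)
    out = step(k, v)
print(out.shape)  # (8,) — state size is independent of sequence length
```

Contrast with a transformer at inference time, where the KV cache holds one key and one value per past token, so memory grows linearly with context; here each step touches only the fixed-size `num`/`den` state, which is what makes linear time and constant space possible.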
