# generative-pretrained-transformers

Here are 2 public repositories matching this topic...


ToyGPT, inspired by Andrej Karpathy's "GPT from scratch", builds a toy generative pre-trained transformer at its most basic level: a simple bigram language model with attention, intended to teach the fundamentals of creating an LLM from scratch (a minimal sketch of such a bigram model follows this listing).

  • Updated Nov 28, 2024
  • Jupyter Notebook
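
For readers unfamiliar with the approach the description names, here is a minimal bigram language model in PyTorch, in the style of Karpathy's tutorial. This is a hedged sketch under assumed names and shapes, not code from the ToyGPT repository:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BigramLanguageModel(nn.Module):
    def __init__(self, vocab_size):
        super().__init__()
        # Each token directly reads off the logits for the next token
        # from a lookup table of shape (vocab_size, vocab_size).
        self.token_embedding_table = nn.Embedding(vocab_size, vocab_size)

    def forward(self, idx, targets=None):
        logits = self.token_embedding_table(idx)  # (B, T, vocab_size)
        if targets is None:
            return logits, None
        B, T, C = logits.shape
        loss = F.cross_entropy(logits.view(B * T, C), targets.view(B * T))
        return logits, loss

    @torch.no_grad()
    def generate(self, idx, max_new_tokens):
        # Autoregressively sample one token at a time from the last position.
        for _ in range(max_new_tokens):
            logits, _ = self(idx)
            probs = F.softmax(logits[:, -1, :], dim=-1)
            idx_next = torch.multinomial(probs, num_samples=1)
            idx = torch.cat((idx, idx_next), dim=1)
        return idx
```

Because the bigram table conditions only on the current token, adding even a single attention head, as the description suggests, is what lets predictions start to depend on earlier context.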

echoGPT is a minimal GPT implementation for character-level language modeling, with 25.4M parameters. Built with PyTorch, it includes multi-head self-attention, feed-forward layers, and position embeddings, and is trained on text such as tiny_shakespeare.txt to predict the next character (a rough sketch of these components follows this listing).

  • Updated Jan 11, 2025
  • Python
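
As a rough sketch of the components echoGPT's description lists (multi-head self-attention, feed-forward layers, learned position embeddings), here is a compact character-level transformer in PyTorch. The hyperparameters and class names are illustrative assumptions, not the repository's actual 25.4M-parameter configuration:

```python
import torch
import torch.nn as nn

class CausalSelfAttention(nn.Module):
    def __init__(self, n_embd, n_head, block_size):
        super().__init__()
        self.attn = nn.MultiheadAttention(n_embd, n_head, batch_first=True)
        # Causal mask: True entries are positions a character may NOT attend to,
        # so each position sees only itself and earlier characters.
        mask = torch.triu(torch.ones(block_size, block_size, dtype=torch.bool), diagonal=1)
        self.register_buffer("mask", mask)

    def forward(self, x):
        T = x.size(1)
        out, _ = self.attn(x, x, x, attn_mask=self.mask[:T, :T])
        return out

class Block(nn.Module):
    def __init__(self, n_embd, n_head, block_size):
        super().__init__()
        self.ln1 = nn.LayerNorm(n_embd)
        self.sa = CausalSelfAttention(n_embd, n_head, block_size)
        self.ln2 = nn.LayerNorm(n_embd)
        self.ff = nn.Sequential(  # position-wise feed-forward layer
            nn.Linear(n_embd, 4 * n_embd), nn.GELU(), nn.Linear(4 * n_embd, n_embd)
        )

    def forward(self, x):
        x = x + self.sa(self.ln1(x))  # residual around attention
        x = x + self.ff(self.ln2(x))  # residual around feed-forward
        return x

class CharGPT(nn.Module):
    # Illustrative sizes; not echoGPT's real configuration.
    def __init__(self, vocab_size, n_embd=128, n_head=4, n_layer=4, block_size=256):
        super().__init__()
        self.tok_emb = nn.Embedding(vocab_size, n_embd)
        self.pos_emb = nn.Embedding(block_size, n_embd)  # learned position embeddings
        self.blocks = nn.Sequential(*[Block(n_embd, n_head, block_size) for _ in range(n_layer)])
        self.ln_f = nn.LayerNorm(n_embd)
        self.head = nn.Linear(n_embd, vocab_size)  # logits over the next character

    def forward(self, idx):
        B, T = idx.shape
        pos = torch.arange(T, device=idx.device)
        x = self.tok_emb(idx) + self.pos_emb(pos)
        return self.head(self.ln_f(self.blocks(x)))
```

Scaling the embedding width, head count, and number of stacked blocks is what takes a model of this shape from a toy to the quoted tens of millions of parameters.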
