> # microgpt
>
> This is a brief guide to my new art project microgpt, a single file of 200 lines of pure Python with no dependencies that trains and runs inference with a GPT. This file contains the full algorithmic content of what is needed: a dataset of documents, a tokenizer, an autograd engine, a GPT-2-like neural network architecture, the Adam optimizer, a training loop, and an inference loop. Everything else is just efficiency. I cannot simplify this any further.
by Andrej Karpathy
https://karpathy.github.io/2026/02/12/microgpt/ ^[HN](https://news.ycombinator.com/item?id=47202708)^
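Of the components the quote lists, the autograd engine is the most self-contained. A minimal sketch of the idea in pure Python is below — hypothetical illustration, not microgpt's actual code; the `Value` class name and its methods are assumptions:

```python
# Minimal scalar autograd sketch (hypothetical; not microgpt's code).
# Each Value records its parents in the computation graph plus the
# local derivative d(self)/d(parent), so backward() can apply the
# chain rule in reverse topological order.

class Value:
    def __init__(self, data, children=(), local_grads=()):
        self.data = data
        self.grad = 0.0
        self._children = children        # parent Values in the graph
        self._local_grads = local_grads  # d(self)/d(child) per child

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        return Value(self.data + other.data, (self, other), (1.0, 1.0))

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        return Value(self.data * other.data, (self, other),
                     (other.data, self.data))

    def backward(self):
        # Build a topological ordering of the graph rooted at self.
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for c in v._children:
                    build(c)
                topo.append(v)
        build(self)
        # Seed the output gradient and propagate via the chain rule.
        self.grad = 1.0
        for v in reversed(topo):
            for child, lg in zip(v._children, v._local_grads):
                child.grad += lg * v.grad

# usage: for z = x*y + x, dz/dx = y + 1 and dz/dy = x
x, y = Value(2.0), Value(3.0)
z = x * y + x
z.backward()
print(x.grad, y.grad)  # 4.0 2.0
```

The same reverse-mode mechanism, extended with the handful of ops a transformer needs, is what lets a training loop compute gradients for Adam without any external library.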