Blog #0190: What If Numbers Could Remember?

First of a 4-part series building a GPT from scratch in Elixir -- pure functional, no external dependencies. Part 1 tackles automatic differentiation: numbers track their own computational history in a graph, so gradients come from backpropagation rather than hand-derived formulas. Nine modules, organized by dependency, with side effects confined to the top-level module. It's Karpathy's 200-line Python MicroGPT translated into idiomatic Elixir, optimized for teaching rather than brevity. See also: Part 2 - From Letters to Logits (https://matthewsinclair.com/blog/0191-from-letters-to-logits), Part 3 - How Tokens Talk to Each Other (https://matthewsinclair.com/blog/0192-how-tokens-talk-to-each-other), Part 4 - Learning and Dreaming (https://matthewsinclair.com/blog/0193-learning-and-dreaming).
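The core idea can be sketched in a few lines of purely functional Elixir. This is not the series' actual implementation, just a minimal illustration under assumed names (`Micrograd`, `value/1`, `add/2`, `mul/2`, `backward/1`): each number carries a unique id and a list of parents with local derivatives, and backpropagation walks that graph, multiplying and accumulating gradients, with no mutation anywhere.

```elixir
defmodule Micrograd do
  # A "number that remembers": its value, a unique id, and the parents
  # it was computed from, each paired with the local derivative.
  def value(x), do: %{id: make_ref(), data: x, parents: []}

  # d(a+b)/da = 1, d(a+b)/db = 1
  def add(a, b),
    do: %{id: make_ref(), data: a.data + b.data, parents: [{a, 1.0}, {b, 1.0}]}

  # d(a*b)/da = b, d(a*b)/db = a
  def mul(a, b),
    do: %{id: make_ref(), data: a.data * b.data, parents: [{a, b.data}, {b, a.data}]}

  # Reverse-mode autodiff: recurse from the output, multiplying the
  # upstream gradient by each local derivative and summing contributions
  # from every path into an accumulator map keyed by node id.
  def backward(node), do: do_backward(node, 1.0, %{})

  defp do_backward(node, upstream, grads) do
    grads = Map.update(grads, node.id, upstream, &(&1 + upstream))

    Enum.reduce(node.parents, grads, fn {parent, local}, acc ->
      do_backward(parent, upstream * local, acc)
    end)
  end

  def grad(grads, node), do: Map.get(grads, node.id, 0.0)
end

# y = a*b + a, with a = 2, b = 3:
a = Micrograd.value(2.0)
b = Micrograd.value(3.0)
y = Micrograd.add(Micrograd.mul(a, b), a)
grads = Micrograd.backward(y)
# y.data is 8.0; dy/da = b + 1 = 4.0; dy/db = a = 2.0
```

Because `a` feeds into `y` along two paths (through the multiply and directly through the add), its gradient is the sum of both contributions, which is exactly what hand-derived formulas make easy to get wrong and backpropagation handles automatically.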
