Five years of tinygrad
Key Insight: The path to competing with NVIDIA isn't building better chips—it's building a fully sovereign, minimal software stack, because software is the actual moat.
George Hotz reflects on five years of building tinygrad, a minimal deep learning framework that now stands at 18,935 lines of code maintained by a team of six. He argues that the path to competing with NVIDIA runs through software sovereignty, not hardware: AMD, Amazon, Tesla, and Groq have all made good chips but failed at training because they lack the software stack. tinygrad is working toward zero dependencies, including removing LLVM, so that AMD GPUs can be driven with pure Python. Hotz critiques mainstream software development as 98% workarounds for other code, invoking Elon Musk's principle that 'the best part is no part.' The tiny corp operates as a 'deconstructed company': public Discord, public GitHub, AMD contracts negotiated on Twitter, and a mission to commoditize the petaflop.
"Only a fool begins by taping out a chip; it's expensive and not the hard part."
"Once you have a fully sovereign software stack capable of training SOTA models, the chip is so easy."
"I think this is so bad that 98% of lines of software are basically this in some way, shape, or form."
Tags: tinygrad, Hardware