The Unstoppable Rise of AI: Exponential Growth Redefines Progress

We evolved for a linear world, but AI and its core exponential trends are redefining our understanding of progress. From 2010 to now, the compute used to train frontier AI models has grown a staggering trillionfold, from roughly 10¹⁴ FLOPs to over 10²⁶ FLOPs.
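Those two figures pin down the scale of the trend. A quick back-of-envelope check (using only the numbers above; the doubling count is derived, not from the article):

```python
import math

# Frontier-model training compute, per the figures above
start_flops = 1e14   # roughly the 2010 level
end_flops = 1e26     # today's frontier runs

growth = end_flops / start_flops
doublings = math.log2(growth)

print(f"growth factor: {growth:.0e}")          # a trillionfold (1e12)
print(f"equivalent doublings: {doublings:.1f}")  # ~40 doublings
```

Forty doublings in about a decade and a half works out to a doubling every few months, which is exactly the kind of pace linear intuition fails to track.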

This explosion is driven by three key advances: faster accelerators, such as Nvidia’s GPUs and Microsoft’s Maia 200; high-bandwidth memory (HBM) that feeds data to processors at incredible speeds; and interconnect technologies like NVLink and InfiniBand that link hundreds of thousands of GPUs into warehouse-size supercomputers.

These gains have delivered dramatically more compute power: training time for a benchmark language model has fallen from 167 minutes to under four minutes on equivalent modern hardware. This roughly 50x improvement far outpaces the 5x that Moore’s Law alone would predict, suggesting that AI development is unlikely to hit a wall anytime soon.
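The cited numbers are easy to reconcile: four minutes flat would be about a 42x speedup, so "under four minutes" is consistent with the roughly 50x figure. A minimal sketch of that arithmetic, using only values from the article:

```python
# Sanity-check the speedup claim: 167 minutes down to "under four minutes",
# compared with the ~5x that Moore's Law alone would have delivered.
old_minutes = 167

speedup_at_4min = old_minutes / 4    # lower bound: ~42x at exactly 4 minutes
minutes_for_50x = old_minutes / 50   # ~3.3 min, i.e. "under four minutes"
moores_law_prediction = 5            # baseline figure from the article

print(f"speedup at 4 min: {speedup_at_4min:.1f}x")
print(f"runtime implied by 50x: {minutes_for_50x:.2f} min")
print(f"gain beyond Moore's Law: {speedup_at_4min / moores_law_prediction:.1f}x")
```

Even at the conservative end, the measured speedup is roughly an order of magnitude beyond what transistor scaling alone accounts for; the rest comes from the memory and interconnect advances described above.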

Photo by Tima Miroshnichenko on Pexels