
It's Called the Exponential Function - and Humans Are Bad at It


By Hira Fernando | May 2024 | Career Strategy


This is why your brain can't process what's happening with AI

Let me take you back to January 2020. A respiratory illness was spreading in Wuhan. A handful of people were paying attention. The rest of the world was... not. By mid-March, the entire planet had shut down. In a matter of weeks, everything changed.

The problem wasn't that people lacked information. It was that the information was growing exponentially, and human brains are simply not wired to track exponential growth in real time. We're wired for linear. We understand "it got a bit worse today." We do not instinctively understand "it doubled yesterday, and it's going to double again tomorrow."

A quick lesson from COVID

In the early days, the case count looked reassuringly small. 100 cases. Fine. 200 cases. Still manageable. 400. 800. 1,600... By the time it hit numbers that felt alarming, it had already been doubling for weeks. The curve wasn't a gradual slope. It was flat, flat, flat — then nearly vertical. Almost overnight.
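The doubling sequence above can be sketched in a few lines of Python. The numbers are illustrative, chosen to match the example, not real case data:

```python
def linear_growth(start, increment, steps):
    """Add a fixed amount each step: 'it got a bit worse today.'"""
    return start + increment * steps

def exponential_growth(start, steps):
    """Double each step: 'it doubled yesterday, and again tomorrow.'"""
    return start * 2 ** steps

# Starting from 100 cases: linear (+100 per period) vs. doubling.
for steps in (1, 4, 10, 15):
    print(steps, linear_growth(100, 100, steps), exponential_growth(100, steps))
```

After 4 doubling periods the counts still look comparable (500 vs. 1,600), but after 15 periods the linear track has reached 1,600 while the doubling track has passed 3 million. That is the "flat, flat, flat - then nearly vertical" shape.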

Epidemiologists were sounding the alarm in February. The public reacted in March. And the difference between those two moments — just a few weeks — meant the difference between containment and a global pandemic. We weren't caught off guard because we lacked the data. We were caught off guard because we didn't understand what the data meant.

AI is on the same curve

The progression of AI capability looks a lot like that COVID graph. For years, the improvements were real but felt incremental. Something helpful here, a slightly better output there. Easy to dismiss. Easy to deprioritise. And then, the curve bends.

Each new model isn't just "a bit better" than the last. The pace of improvement is itself accelerating. We're not moving up a gentle slope. We're approaching that near-vertical part of the curve, and most people's internal models for "how fast things are changing" are still calibrated for linear progress.

This is why the warning signs feel overblown until they suddenly don't. This is why "this seems like hype" is such a dangerous frame right now. COVID also "seemed like hype" in January 2020.


What the exponential function actually means for you


It means that the gap between today's AI capabilities and tomorrow's is not what you think it is. The professionals who are actively using AI tools right now are not just slightly ahead; they are compounding their advantage every few months as each new model release hands them additional capability.

And the professionals who are waiting — waiting to see if this is real, waiting until it becomes necessary, waiting until their employer mandates it — are falling further behind with every passing month. Not linearly. Exponentially.

I'm not saying this to frighten you (okay, maybe a little). I'm saying it because understanding the shape of the curve is the first step to doing something useful with that knowledge. The second step? We'll get to that. But it starts with trying the tools now — not later.

 
 
 
