The AI Industry’s Scaling Obsession Is Headed for a Cliff

Will Knight

created: Oct. 15, 2025, 6 p.m. | updated: Oct. 19, 2025, 11:48 a.m.

A new study from MIT suggests that the biggest, most computationally intensive AI models may soon offer diminishing returns compared with smaller ones. By mapping scaling laws against continued improvements in model efficiency, the researchers found that it could become harder to wring leaps in performance out of giant models, while efficiency gains could make models running on more modest hardware increasingly capable over the next decade. Thompson, one of the researchers, says the results show the value of honing an algorithm as well as scaling up compute. The study is particularly timely given today’s AI infrastructure boom (or should we say “bubble”?): OpenAI and other US tech firms have signed hundred-billion-dollar deals to build AI infrastructure in the United States.

Source: WIRED