Four reasons to be optimistic about AI’s energy usage

Will Douglas Heaven

created: May 20, 2025, 9 a.m. | updated: May 22, 2025, 9:31 a.m.

Parallel computing underpins much of today’s software, especially large language models, since GPUs are parallel by design. There is also a lot of talk about small models: versions of large language models that have been distilled into pocket-size packages. In many cases, these more efficient models perform as well as larger ones, especially for specific use cases. As businesses figure out whether and how large language models fit their needs, this trend toward more efficient, bespoke models is taking off. “There’s going to be a really, really large number of specialized models, not one God-given model that solves everything,” says Farhadi.
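The distillation mentioned above is, at its core, training a small model to match a large model’s output distributions. A minimal sketch of the standard soft-label objective (a temperature-scaled KL divergence, as in Hinton et al.’s formulation; the logits and temperature here are illustrative, not from the article):

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature > 1 softens the distribution, exposing the teacher's
    # relative preferences among classes, not just its top pick.
    scaled = [z / temperature for z in logits]
    m = max(scaled)
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    # KL divergence between the softened teacher and student distributions:
    # the term a student minimizes to absorb the teacher's behavior.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

teacher = [3.0, 1.0, 0.2]
# A student whose logits match the teacher's incurs zero loss;
# a mismatched student incurs a positive penalty.
print(distillation_loss(teacher, teacher))
print(distillation_loss(teacher, [0.2, 1.0, 3.0]))
```

In practice this term is combined with an ordinary cross-entropy loss on the true labels, and the student is far smaller than the teacher, which is what makes the resulting model cheap to run.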

Source: MIT Technology Review