NPUs are officially useless – for now

I’m sorry to say it, but I’d be lying if I said otherwise: NPUs are useless. Maybe that seems harsh, but it doesn’t come from a place of cynicism. After testing them myself, I’m convinced there is no compelling use for the NPU right now.

When you go to buy your next laptop, there’s a good chance it will have a Neural Processing Unit (NPU) inside, and AMD and Intel have been making a lot of noise about how these NPUs will drive the future of AI computing. I have no doubt we’ll get there eventually, but for today’s use cases, the NPU really isn’t as efficient as you’d think.

When talking about Meteor Lake at CES 2024, Intel explained that it spreads AI work across the chip’s different processors. Not everything falls on the NPU. In fact, the most demanding AI workloads are offloaded onto the integrated Arc GPU. The NPU is mainly there as a low-power option for sustained, low-intensity AI workloads like background blurring.
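
To make that split concrete, here’s a minimal sketch of how an application could pin a light, sustained model to the NPU and a heavier one to the integrated GPU using Intel’s OpenVINO runtime. This is my own illustration, not Intel’s demo code, and the model files are hypothetical placeholders.

```python
# Minimal sketch (my own, not Intel's demo code): steering AI models to
# different Meteor Lake engines with OpenVINO. The .xml model paths are
# hypothetical placeholders for OpenVINO IR files.
import numpy as np
import openvino as ov

core = ov.Core()
print(core.available_devices)  # e.g. ['CPU', 'GPU', 'NPU'] on a Meteor Lake laptop

# Light, sustained workload (background segmentation for blur) -> low-power NPU.
blur_model = core.read_model("background_segmentation.xml")
blur_npu = core.compile_model(blur_model, device_name="NPU")

# Heavy, bursty workload (audio source separation) -> integrated Arc GPU.
separation_model = core.read_model("source_separation.xml")
separation_gpu = core.compile_model(separation_model, device_name="GPU")

# Inference then runs on whichever engine the model was compiled for.
frame = np.zeros(list(blur_npu.input(0).shape), dtype=np.float32)
mask = blur_npu(frame)
```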

Your GPU kicks in for anything more intense. For example, Intel showed a demo in Audacity where AI can separate the audio tracks from a single stereo file and even transcribe the lyrics. It also showed AI at work in Unreal Engine’s Metahuman, transforming video footage into an animation of a realistic game character. Both ran on the GPU.

These are some great AI use cases. They just don’t need an NPU. A dedicated AI processor is not as powerful as a GPU for those heavy AI workloads.

The main use case, according to Intel, is efficiency: running those sustained, low-intensity AI workloads, like background blur, on the NPU rather than the GPU should help your battery life. I tested this on an MSI Studio 16 running one of Intel’s new Meteor Lake processors, and it’s not as big of a power saver as you’d expect.

Over the course of 30 minutes, the GPU averaged 18.9W, while the NPU averaged 17.6W. What was interesting was that the GPU initially ramped up, reaching somewhere around 38W of total system-on-chip power, but slowly crawled back down. The NPU, on the other hand, started low and slowly rose over time, eventually settling between 16W and 17W.

I should note that this is total package power — in other words, the power of the entire chip, not just the GPU or NPU. The NPU is more efficient, but I don’t know how much that efficiency matters in these low-intensity AI workloads. It probably doesn’t save you much battery life in actual use.
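
For reference, averaging those figures from a power log is simple enough; here’s a small sketch that assumes the samples were exported to a CSV with a package-power column (the file layout and column name are my assumptions, since the logging tool isn’t specified).

```python
# Sketch: average total package power from a CSV log of samples (e.g. one row
# per second). The file name and "package_power_w" column are assumptions;
# the article doesn't say which logging tool produced the data.
import csv

def average_package_power(path: str, column: str = "package_power_w") -> float:
    """Return the mean of the power column across all logged samples."""
    with open(path, newline="") as f:
        samples = [float(row[column]) for row in csv.DictReader(f)]
    return sum(samples) / len(samples)

# Compare a GPU run against an NPU run of the same 30-minute workload.
# print(average_package_power("gpu_run.csv"), average_package_power("npu_run.csv"))
```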

I’ll have to test that battery life properly once I’ve had more time with one of these Meteor Lake laptops, but I suspect it won’t make a huge difference. Even the best Windows laptops have pretty mediocre battery life compared to the competition from Apple, and I doubt the NPU is enough to change that dynamic.

This shouldn’t distract from the exciting AI applications we’re starting to see on PCs. From Audacity to Adobe’s suite to Stable Diffusion in GIMP, there are now many ways you can use AI. However, most of these applications are handled either in the cloud or by your GPU, and the dedicated NPU doesn’t do much beyond blurring the background.

As NPUs become more powerful and these AI applications more efficient, I have no doubt that NPUs will find their place. And it makes sense that AMD and Intel are laying the groundwork for that to happen. As it stands, however, AI performance largely comes down to your GPU rather than the efficient AI processor driving your video calls.
