Laptops are compromising for AI, with nothing to show for it

Laptop brands of every stripe are all-in on AI. Compared with a year ago, the best laptops look like they’ve been overrun with NPUs and a new generation of processors, all eager to weave AI into every aspect of our day-to-day. Yet more than two years into the artificial intelligence revolution, there’s not much to show for it.

Qualcomm’s long-awaited Snapdragon X Elite chips have made their way into Copilot+ laptops. Now AMD is here with its Ryzen AI 300 chips, and eventually Intel’s Lunar Lake CPUs will follow. More and more, though, it’s apparent that these processors are built for an AI “future” rather than for the requirements of today, and you typically can’t have both.

It comes down to space

An AMD Ryzen CPU socketed in a motherboard.
Jacob Roach / Pro Well Tech

There’s an enormously consequential aspect of chip design that rarely gets mentioned, one that electronics enthusiasts on hardware forums have long grasped intuitively: space matters.


But for everyone else, it isn’t something you think about. Companies such as AMD and Intel could build chips with far more raw power, and with it far more power draw and heat, but they don’t. Much of the art of chip design is working within a budget: if you’re allowed to burn 50 watts in a given space, you figure out how to wring the most out of those 50 watts.

But remember: adding hardware to a die isn’t free; it takes space away from something else. Take a look at the annotated die shot of a Ryzen AI 300 CPU below. See the XDNA NPU in the upper right? It’s the smallest of the chip’s three major blocks, with speculation putting it at around 14mm², but AMD could still use that space for more cores or, more likely, additional Infinity Cache for the GPU.

Annotated! 😁

Funny how the RDNA WGP scalar units keep getting moved around, now back to RDNA1 layout.

The SRAM around the IMC is also present on PHX2 (and presumably PHX1), but seemingly not the desktop IOD or older CPUs/APUs.

Overall… neat, without any major surprises. https://t.co/cf6MZVMgT2 pic.twitter.com/MfPqQRDGcY

— Felt (@GPUsAreMagic) July 29, 2024
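To put that NPU area in rough perspective, here’s a back-of-the-envelope sketch in Python. The ~14mm² figure is the speculative estimate from the die shot above; the total die size and the cache density are assumptions of mine, commonly cited ballpark values rather than official AMD numbers.

```python
# Back-of-the-envelope: what does a ~14 mm^2 NPU cost in die-area terms?
# ASSUMPTIONS (not from AMD): a total die size of ~230 mm^2 and roughly
# 1 MB of L3-style SRAM per mm^2 on a modern node. Both are rough,
# illustrative figures only.
npu_area_mm2 = 14.0      # speculated NPU area from the die-shot annotation
die_area_mm2 = 230.0     # assumed total die area (ballpark)
sram_mb_per_mm2 = 1.0    # assumed cache density (very rough)

npu_share = npu_area_mm2 / die_area_mm2
extra_cache_mb = npu_area_mm2 * sram_mb_per_mm2

print(f"NPU share of the die: {npu_share:.1%}")            # ~6% of the die
print(f"Cache that area could hold: ~{extra_cache_mb:.0f} MB")
```

Even under these loose assumptions, roughly 6% of the die is not nothing, especially when the alternative is cache the integrated GPU could use today.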

This isn’t to single out AMD, or to say that Ryzen AI 300 CPUs simply suck; they don’t, as you can read in our Asus Zenbook S 16 review. AMD, Intel, and Qualcomm all make continual trade-offs to get everything to fit on a chip, and it’s never simple: pull one lever and thousands of other variables shift, and everything has to be brought back into balance.

But it does illustrate that adding an NPU to a die isn’t something chip designers can do without giving something up elsewhere. And for the time being, these NPUs are largely wasted silicon. Even apps that are AI-accelerated would often rather use the compute power of the integrated GPU, and if you have a dedicated GPU, it will likely blow the NPU away. A few applications make a genuine case for an NPU, but for the vast majority of people, it really just handles things like background blur on video calls.
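To make that concrete: many inference frameworks let an application pick its execution backend, and the GPU often wins by default. Below is a minimal, hypothetical sketch using ONNX Runtime’s provider-selection API. The provider names are ONNX Runtime’s documented ones (DirectML for GPUs, QNN for Qualcomm’s NPU, CPU as the fallback); the model file name is a placeholder.

```python
# Minimal sketch: how an app might choose where to run an AI model.
# Assumes the onnxruntime package; "model.onnx" is a hypothetical model file.
import onnxruntime as ort

available = ort.get_available_providers()
print("Available execution providers:", available)

# Preference order many apps effectively use today: GPU first,
# NPU second, CPU as the always-available fallback.
preferred_order = [
    "DmlExecutionProvider",   # GPU via DirectML
    "QNNExecutionProvider",   # NPU (Qualcomm Hexagon); vendor NPU providers vary
    "CPUExecutionProvider",   # CPU fallback
]
providers = [p for p in preferred_order if p in available]

session = ort.InferenceSession("model.onnx", providers=providers)
print("Session is using:", session.get_providers())
```

If the GPU provider is present, the NPU never even gets asked, which is roughly the situation on most laptops today.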

Ryzen AI 300 is the only real-world example we have at the moment, but Intel’s forthcoming Lunar Lake chips will be similarly ensnared. AMD and Intel are jockeying for Microsoft’s Copilot+ stamp of approval, so both are including NPUs that can just clear Microsoft’s seemingly arbitrary criteria. Both companies had already started putting AI co-processors on their chips with little discernible benefit, and that earlier silicon is now irrelevant with the bar set even higher.
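For context on that bar: Copilot+ requires an NPU of roughly 40 TOPS. The comparison below uses approximate, publicly reported TOPS figures from memory rather than verified measurements, purely to illustrate why the first-wave NPUs no longer qualify.

```python
# Rough comparison of reported NPU throughput against the Copilot+ bar.
# TOPS figures are approximate public numbers, not verified measurements.
COPILOT_PLUS_MIN_TOPS = 40  # Microsoft's stated NPU requirement

npus = {
    "Intel Meteor Lake (NPU)": 11,      # approx.
    "AMD Ryzen 7040 (XDNA)": 10,        # approx.
    "Qualcomm Snapdragon X Elite": 45,  # approx.
    "AMD Ryzen AI 300 (XDNA 2)": 50,    # approx.
    "Intel Lunar Lake (NPU)": 48,       # approx., announced figure
}

for name, tops in npus.items():
    status = "meets" if tops >= COPILOT_PLUS_MIN_TOPS else "misses"
    print(f"{name}: ~{tops} TOPS -> {status} the Copilot+ bar")
```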

It’s an open question whether AMD and Intel would have designed their processors this way without the Copilot+ push. For now, we’re left with a piece of silicon that doesn’t have much of a role to play, first in Ryzen AI 300 and soon in Lunar Lake. Intel’s earlier NPU push with Meteor Lake is a reminder of how this goes: it’s already set to become obsolete in the face of Copilot+ requirements.

Promised AI features

Microsoft introducing the Recall feature in Windows 11.
Luke Larsen / Pro Well Tech

AMD and Intel have both made the same promise: their chips will come under the Copilot+ umbrella eventually. So far, Microsoft has only enabled Copilot+ on Qualcomm’s Snapdragon X Elite series; the software doesn’t yet support chips from other makers. AMD, at least, promises that its chips will gain access to Copilot+ in time. The bigger issue is that there are hardly any Copilot+ features to access.

Recall was the marquee feature of the Copilot+ launch event, yet almost no one outside the press has been able to use it: Microsoft delayed it, restricted it to Windows Insiders, and by the time Copilot+ PCs actually went on sale, pushed it back indefinitely. AMD and Intel might both get Copilot+ support before the end of the year, but so what? There’s little else in the way of local AI features to go with it.

And we are witnessing the legacy of Microsoft’s grip on PC silicon: a portfolio of new chips from Qualcomm and AMD (with Intel not far behind), each carrying a chunk of silicon that does almost nothing despite the hype. This feels like another Bing Chat scenario, except this time it matters. I’m not even sure Microsoft is truly committed to the platform, never mind that the feature actually selling Copilot+ PCs isn’t AI at all; it’s battery life.

By next year, half a billion AI-capable laptops are forecast to ship; by 2027, they will likely account for more than half of all PC shipments. It’s obvious why Microsoft and the broader PC ecosystem are charging headlong into the world of AI. It’s far less obvious that the products they’re creating today are as essential as Microsoft, Intel, AMD, and Qualcomm would like us to believe.

Laying the groundwork

The AMD logo on the Asus Zenbook S 16.
Jacob Roach / Pro Well Tech

It’s just as vital to ask why, though. The AI PC still has a classic chicken-and-egg problem, and even with Copilot+ and the Recall delay, that hasn’t changed. But it hasn’t stopped Intel, AMD, and Qualcomm from laying the groundwork: preparing the PC for AI applications that don’t exist yet, in the hope that these features one day become so seamlessly integrated into how we use a PC that we forget there’s something inside called an NPU. It’s not such a crazy notion; after all, Apple has been doing exactly this for years, and Apple Intelligence feels like the logical next step.

I think we’ll get there; too much money is being poured into AI for it not to become a normal part of the PC. But I’m still waiting for it to become as necessary as we’ve been led to believe.
