Top notebook brands are now vigorously embracing artificial intelligence. In the space of a year, flagship laptops have gained processors with next-generation neural processing units (NPUs), hardware designed to weave AI seamlessly into everyday use. The problem is that the AI revolution has been underway for several years, and the practical payoff so far is underwhelming.
Now the PC industry is welcoming Qualcomm’s Snapdragon X Elite, the chip inside the first wave of Copilot+ laptops. At the same time, AMD has launched its Ryzen AI 300 series, and Intel’s Lunar Lake processors will soon join the fray.
But these chips seem designed less for the future of AI on laptops than to satisfy today’s requirements, and they often fail to deliver the best of either.
Are AI processors useful?
In chip design, a key factor that is often overlooked is space. Spend any time on hardware forums or enthusiast sites and you will see just how much of chip design revolves around it.
For the average user, though, it isn’t something you ever have to think about. Companies like AMD and Intel are capable of building high-end chips, but that is not what they are doing this time around. The central challenge of chip design is fitting as much functionality as possible into a limited area.
This matters because nothing you add to a chip is free; every block takes space away from something else. I’m not saying AMD’s Ryzen AI 300 CPUs are bad, because they aren’t. AMD, Intel, and Qualcomm all have to make trade-offs during the design process to fit everything they need onto the die. You can’t simply bolt on more cache: every change ripples through many other parameters, and designers have to balance them all.
Which is to say that adding an NPU is not something designers can do without making compromises elsewhere. And right now, those NPUs are close to useless. Even applications that support AI acceleration tend to lean on the integrated GPU, and a discrete GPU, if present, far outperforms the NPU. NPUs have a few specific uses, but for most users they mostly amount to better background blur in video calls.
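Part of the reason the silicon sits idle is that an application has to opt in to the NPU explicitly; if it doesn’t, inference quietly lands on the GPU or CPU. As a rough illustration only (not anything AMD, Intel, or Microsoft ships), here is a minimal Python sketch using ONNX Runtime, assuming a build that includes the Qualcomm QNN (NPU) and DirectML (GPU) execution providers; the model file name is a placeholder.

```python
# Minimal sketch: how an app might opt in to NPU acceleration with ONNX Runtime.
# Assumes an onnxruntime build that ships the QNN (Qualcomm NPU) and
# DirectML (GPU) execution providers; "model.onnx" is a placeholder path.
import onnxruntime as ort

available = ort.get_available_providers()
print("Available providers:", available)

# Preference order: NPU first, then GPU via DirectML, then plain CPU.
preferred = ["QNNExecutionProvider", "DmlExecutionProvider", "CPUExecutionProvider"]
providers = [p for p in preferred if p in available]

session = ort.InferenceSession("model.onnx", providers=providers)
print("Actually running on:", session.get_providers())
```

If the NPU provider isn’t present, the filter simply falls back to the GPU or CPU, which mirrors what most AI-enabled apps end up doing today.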
Ryzen AI 300 is one example, and Intel’s Lunar Lake chips will face similar issues. Both AMD and Intel are chasing certification for Microsoft’s Copilot+ PCs, which means they must include NPUs that meet Microsoft’s specific performance requirements (at least 40 TOPS). AMD and Intel had integrated AI co-processors into their chips before Copilot+, but against that new, higher bar those earlier co-processors aren’t really that useful.
Without the impetus of Copilot+, it’s hard to say whether AMD and Intel would have designed their processors differently. But as things stand, Ryzen AI 300 and the upcoming Lunar Lake look set to ship with a slab of silicon that barely gets used, much like Intel’s push with Meteor Lake, which is already all but obsolete.
AI features that stop at promises
AMD and Intel will eventually join Copilot+, as they have promised. For now, Microsoft has approved only Qualcomm’s Snapdragon X Elite chips, though AMD says it will enable Copilot+ support on its chips before the end of the year. The problem is that there aren’t really any Copilot+ features available yet.
Recall has been the most talked-about feature since Microsoft announced Copilot+, yet almost no one outside the press has actually used it. Microsoft first pulled it back to Windows Insider builds, and by the time Copilot+ PCs were ready to ship, the feature had been delayed indefinitely. AMD and Intel may join the Copilot+ bandwagon before the year ends, but with the headline features delayed, their participation means that much less.
Microsoft’s influence on the PC industry is becoming more evident. New chips from Qualcomm and AMD are already on shelves, with Intel to follow soon. They bring NPUs to laptops but don’t deliver much yet. The rushed rollout feels a lot like Bing Chat’s, and it makes me wonder whether Microsoft is as committed to this platform as it claims.
That said, the main draw of Copilot+ PCs so far isn’t the AI features but the longer battery life.
Experts predict that consumers will buy 500 million AI-enabled laptops in the coming years. By 2027, such computers will account for more than half of all PC shipments. This is why Microsoft and the PC industry are heavily promoting AI.
Which came first, the chicken or the egg?
Even so, it’s important to understand where things stand today. The AI PC market has run into a classic chicken-and-egg problem, and Microsoft delaying Copilot+ and Recall doesn’t make it go away. Intel, AMD, and Qualcomm are laying the groundwork for future AI applications that, they hope, will blend so seamlessly into how people use PCs that users won’t even realize an NPU is involved. Apple has been doing this for years, and Apple Intelligence feels like a natural evolution of that approach.
The PC industry isn’t there yet. If you plan to invest in an AI PC, be prepared to be an early adopter. Most apps don’t make real use of your NPU, and even those with native AI features often run them on the GPU. On top of that, we’ve seen the targets shift with Copilot+ and the first wave of NPUs from AMD and Intel.
Over time, the PC industry will get there. Enormous investment is flowing into AI, and it will become an important part of personal computing. Whether it turns out the way anyone expects is another question.
If you’re looking for more tech insights or deals on the latest gadgets, check out TinyDeals Blog for updates.