What will a next-gen AI PC look like? Intel confirms necessary power for the NPU – it’ll be fast enough to run Copilot locally


Microsoft’s Copilot will be able to run locally on your AI PC in the future, it has been confirmed, as have the hardware requirements in terms of how powerful an NPU (Neural Processing Unit) these AI-focused devices will need.

Tom’s Hardware reports that at Team Blue’s AI Summit in Taipei, Todd Lewellen, VP of Intel’s Client Computing Group, confirmed in a Q&A session with our sister site that the NPU of a next-gen AI PC will need to reach 40 TOPS (trillions of operations per second, a measure of AI processing power).

Note that this is the NPU of the next generation of AI PCs – an army of devices on the horizon, apparently – not the current-gen devices, which operate with much lower TOPS than that (we’ll come back to that momentarily).

As mentioned, Intel further confirmed that in the future Microsoft’s Copilot AI will be able to run locally on the AI PC – which is what this additional NPU processing power will enable, rather than the assistant needing to be online and tapping the cloud for its responses. Or at least some, perhaps a great deal, of Copilot functionality will be handled locally.

Lewellen clarified: “And as we go to that next gen [AI PC], it’s just going to enable us to run more things locally, just like they will run Copilot with more elements of Copilot running locally on the client. That may not mean that everything in Copilot is running local, but you’ll get a lot of key capabilities that will show up running on the NPU.”


Analysis: Fairly local

Running Copilot locally (for at least some, or indeed many, of the AI’s functions) means faster responses, as carrying out the workload on the laptop itself avoids the round trip to a remote server and keeps the AI nice and snappy.

While the cloud is great for heavy lifting, of course, it entails having to wait for things to happen remotely (and is dependent on the whims of your internet connection, as ever, which might be spotty particularly when on the move with a notebook).

With much more powerful NPUs in the cards – the incoming Snapdragon X Elite chip promises to deliver 45 TOPS of processing performance in devices like the Surface Pro 10, due in the middle of the year – some big strides forward are about to be taken. To put that in perspective, Intel’s current Meteor Lake laptop silicon has an NPU offering around 10 TOPS (and Qualcomm has already made it very clear how fast its Snapdragon silicon is for AI workloads).
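For a rough sense of what “running on the NPU” means in practice for software, here’s a minimal, hypothetical sketch using ONNX Runtime in Python: it checks which hardware execution providers are available on the machine and prefers an NPU-capable one before falling back to the GPU or CPU. The model file name and the provider preference order are illustrative assumptions, not anything Intel or Microsoft has confirmed about how Copilot will actually be implemented.

```python
# Minimal sketch: pick an NPU-capable ONNX Runtime execution provider if one
# is present, otherwise fall back to DirectML (GPU) or the CPU.
# "copilot-style-model.onnx" is a placeholder name, not a real Microsoft model.
import onnxruntime as ort

# Preference order is an assumption: QNN targets Qualcomm NPUs, OpenVINO can
# target Intel silicon, DirectML uses the GPU, CPU is the universal fallback.
PREFERRED_PROVIDERS = [
    "QNNExecutionProvider",
    "OpenVINOExecutionProvider",
    "DmlExecutionProvider",
    "CPUExecutionProvider",
]

available = ort.get_available_providers()
chosen = [p for p in PREFERRED_PROVIDERS if p in available]
print(f"Available providers: {available}")
print(f"Using: {chosen[0]}")

# Create an inference session pinned to the best available provider;
# the heavy lifting then happens locally rather than in the cloud.
session = ort.InferenceSession("copilot-style-model.onnx", providers=chosen)
```

The idea is simply that the same model can run on whatever accelerator the machine offers, which is why the raw TOPS figure of the NPU matters so much for how snappy local AI features will feel.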

Working locally with Copilot, and not piping data into the cloud, is more secure for obvious reasons – it’s always better to avoid sending your data online if you can, particularly if it’s sensitive in nature. And, of course, local processing is better for privacy, too (though if it’s tight privacy you want, feeding data into and interacting with an AI in any fashion is going to limit that goal, shall we say).

It’s also worth noting that elsewhere, Intel spilled some further beans on AI PCs: they won’t just be required to have an NPU and Copilot installed, but also the dedicated Copilot key on the keyboard that Microsoft revealed back at the start of 2024. (Well, technically this is a slight gray area – but it’s certainly controversial, and we chew over the broader concerns at length here.)

A further rumor is that AI PCs will require 16GB of system RAM, but that nugget hasn’t been confirmed yet. Again, this might be tied up in the aim of running Copilot, or at least much of it, locally on the laptop.
