This year, product announcements are almost invariably accompanied by new AI features. Virtually every new laptop ships with AI in some form, although it manifests in a wide variety of ways. The demand for AI appears to be driving widespread change in hardware.
AI as it is popularly known mostly runs in the cloud; well-known applications such as ChatGPT, Stable Diffusion and Windows Copilot are prime examples. However, many parties want to move to what Intel calls client AI, where the end user carries the required hardware with them. Ideally, the user should not have to depend on the cloud to run an AI-accelerated operating system. In the long run, there is simply no room for billions of consumers tapping data centers to run AI.
The solution lies in efficient processors specifically suited to AI tasks, which can already be found in all kinds of smartphones, desktops and laptops. This autumn, the trend toward AI-focused hardware is evident in new laptops, with or without a dedicated coprocessor. Locally run AI should provide lower latency, better user privacy and lower power consumption.
Limited AI: webcam cleanup and image editing
An example of such AI features can be found in HP's new laptop line. This year's HP ENVY ships with an Intel Movidius VPU AI accelerator. HP's marketing focuses in particular on improving the video conferencing experience: keeping a face in the frame, better background filtering and dynamic audio cleanup. Google likewise keeps the bar modest when announcing the AI features of the Chromebook Plus: Magic Eraser cleverly removes people or objects from photos, and File Sync keeps track of which files a user wants to have quickly available. The Surface Laptop Studio 2 is also equipped with a dedicated AI accelerator to run such features more smoothly than before.
In other words, at launch the newly accelerated AI abilities of these laptops do not sound groundbreaking. After all, a Google feature like Magic Eraser was introduced in Pixel smartphones more than a year ago. More meaningful features, such as those found in Windows Copilot, have yet to move away from the cloud as the platform they run on. Will that change?
Next up: Meteor Lake, VPUs and local inferencing
VP & GM Client AI at Intel John Rayfield believes there’s plenty of development ahead. In December, the first laptops will arrive equipped with Meteor Lake processors, the latest Intel chips and the first generation to include VPUs in the design. Unlike the Movidius chip, this is an integrated component within the processor that should be much more capable. This “inference accelerator” is designed to accelerate inferencing, i.e. the running of a generative AI model to generate a new output. What’s crucial here is that such a process is efficient, so that even a laptop can deploy it while running on battery power.
This is a step beyond what is already possible today: offloading AI tasks to the GPU or simply running them on the CPU. Rayfield explains that 75 percent of AI is a software problem; the remaining quarter is about optimization and acceleration. Hence there are standards like ONNX for AI models, and APIs like DirectML and, for web applications, WebNN. These allow developers to dynamically use whatever hardware an end user has on hand: a new laptop with AI acceleration will automatically assign the appropriate hardware resources to a given task. He does note that particularly heavy AI tasks require 10 to 50 times the computing power of previous applications.
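To illustrate how that dynamic hardware selection can look in practice, below is a minimal sketch using ONNX Runtime in Python: it prefers an available accelerator backend (such as DirectML on Windows) and falls back to the CPU otherwise. This is an illustrative example, not Intel's own implementation; the model file name and input shape are hypothetical placeholders.

```python
# Minimal sketch: picking a hardware backend with ONNX Runtime.
# "model.onnx" and the input shape below are hypothetical placeholders.
import numpy as np
import onnxruntime as ort

# Prefer an accelerator (e.g. DirectML on Windows, CUDA elsewhere) when the
# runtime exposes one; otherwise fall back to the plain CPU execution provider.
available = ort.get_available_providers()
preferred = [p for p in ("DmlExecutionProvider", "CUDAExecutionProvider") if p in available]
providers = preferred + ["CPUExecutionProvider"]

session = ort.InferenceSession("model.onnx", providers=providers)

# Run a single inference pass; the input name and shape depend on the model.
input_name = session.get_inputs()[0].name
dummy_input = np.random.rand(1, 3, 224, 224).astype(np.float32)
outputs = session.run(None, {input_name: dummy_input})

print("Ran on:", session.get_providers()[0])
```

The same idea applies on the web, where WebNN lets a browser route the workload to whatever NPU, GPU or CPU the device offers without the developer hard-coding a target.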
That requirement explains why the AI capabilities of laptops in 2023 do not immediately capture the imagination. Running an LLM locally on a thin-and-light is not going to be practical anytime soon. However, the building blocks have been laid for that to become possible once the hardware makes the necessary strides. In the meantime, laptops, PCs and smartphones will gradually be able to take over more and more AI features from the cloud.