From GPUs to NPUs: AI Evolution of Hardware in Smartphones
Last Updated: 11/27/2023 01:30:44

The integration of Artificial Intelligence (AI) into our daily lives, especially through smartphones, reflects a remarkable evolution in technological hardware. A glance at any Google Pixel 8 review reveals the strides AI has made, permeating a wide range of applications and functionalities on smartphones today.


GPUs as AI processors

Before AI went mainstream, GPUs were designed to rapidly manipulate and alter memory to display images and animate graphics on a smartphone screen. Then, as AI applications grew more common, GPUs found a new purpose – they could apply their simultaneous processing ability to run the complex mathematical models needed for AI.

Whether it was facial recognition, object detection in photography, or speech recognition in voice assistants, GPUs powered many of the latest AI capabilities in smartphones. They didn’t fully replace the CPU in these functions, but working together the two chips reduced processing times significantly.
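The connection between graphics hardware and neural networks comes down to bulk arithmetic: a neural-network layer is essentially a matrix multiply, the same kind of operation GPUs were built to do for pixels. The sketch below (illustrative only, with arbitrary sizes and NumPy standing in for real GPU kernels) contrasts computing each output neuron one at a time with computing all of them in a single bulk operation – the pattern that parallel hardware accelerates.

```python
import numpy as np

rng = np.random.default_rng(0)
inputs = rng.standard_normal((1, 128))    # one input vector
weights = rng.standard_normal((128, 64))  # weights for a 64-neuron layer
bias = rng.standard_normal(64)

# Sequential, CPU-style: compute each output neuron on its own.
slow = np.array([inputs[0] @ weights[:, j] + bias[j] for j in range(64)])

# Parallel-style: one bulk matrix multiply covers all 64 neurons at
# once -- this is the shape of work a GPU is designed to speed up.
fast = inputs @ weights + bias

print(np.allclose(slow, fast[0]))  # both routes give the same answer
```

Both computations produce identical results; the difference is only in how the work is laid out, which is exactly why GPUs could be repurposed for AI without any change to the underlying maths.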

The need for specialised AI hardware

However, GPUs had their limits when it came to AI performance. As neural networks and other machine learning models grew even larger and more complex, GPUs couldn’t keep up. Sure, they performed well running mathematical calculations in parallel, but they still weren’t designed exclusively with AI in mind.

And as networks ballooned in size, naive GPU parallelism hurt performance. Add the constraints of power consumption in smartphones, and it was clear specialised hardware would be needed.


Enter the NPU

Dedicated AI hardware was necessary to advance beyond the GPU. The solution that emerged was the Neural Processing Unit or NPU – a microprocessor developed specifically for neural network machine learning. In a nutshell, an NPU’s speciality is crunching numbers for machine learning models.

In terms of design, the NPU is developed with a “data-driven parallel computing” architecture, which is particularly good at processing massive multimedia data such as video and images. It simulates human neurons and synapses at the circuit level and directly processes them with a deep-learning instruction set. This allows the NPU to complete the processing of a group of neurons with one instruction, making it highly efficient for AI computations.

While the main processor could technically run these models too, it would consume a lot of power and tie up resources needed by other apps. The NPU, meanwhile, breezes through these complex calculations using ultra-efficient circuitry assisted by dedicated memory. It generates lightning-fast results for AI apps, freeing the main chip to hum along smoothly, and by offloading the AI-related heavy lifting it also saves battery life. Because of these obvious advantages, NPUs are growing more common, particularly in smartphones. Prominent examples already in use include Google’s Tensor Processing Unit (TPU) and the Neural Engine developed by Apple.
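Part of how NPUs achieve this efficiency is by running neural-network arithmetic at low precision: weights and activations are mapped onto small integers, which are far cheaper to multiply in hardware than full floating-point values. The sketch below is a simplified illustration of that idea in NumPy, not a real NPU interface – it quantises values to 8-bit integers, does the multiply-accumulate in integer arithmetic, and rescales at the end.

```python
import numpy as np

def quantize_int8(x):
    """Map float values onto int8; return the scale needed to undo it."""
    scale = np.abs(x).max() / 127.0
    return np.round(x / scale).astype(np.int8), scale

rng = np.random.default_rng(1)
weights = rng.standard_normal((64, 32)).astype(np.float32)
inputs = rng.standard_normal(64).astype(np.float32)

qw, w_scale = quantize_int8(weights)
qi, i_scale = quantize_int8(inputs)

# Integer multiply-accumulate over a whole group of 32 output neurons
# at once, then one rescale -- the core pattern NPU circuitry is
# specialised for.
int_result = qi.astype(np.int32) @ qw.astype(np.int32)
approx = int_result * (w_scale * i_scale)

exact = inputs @ weights
print(np.max(np.abs(approx - exact)))  # small quantisation error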

How exactly is an NPU different from a GPU and a CPU?

There are significant distinctions between the three, but overall, the NPU is closer to the GPU in that both are designed for parallel processing. It’s just that NPUs are more specialised, focusing solely on neural network tasks, whereas GPUs are more versatile and can handle a wider range of parallel computing tasks.

In terms of core count, the exact number can vary greatly depending on the specific model and manufacturer. However, the general trend is that CPUs have fewer, more powerful cores, GPUs have many cores optimised for parallel tasks (often in thousands), and NPUs have cores specialised for AI computations.

NPU adoption in smartphones

Since its debut a few years ago, NPU silicon has rapidly improved and been integrated into smartphones by nearly all major vendors.

Apple led the charge by introducing the Neural Engine NPU in its A11 mobile chipset and higher-end iPhone models back in 2017. Huawei similarly unveiled an NPU with its Kirin 970 system-on-chip in 2017. Qualcomm, the dominant Android mobile platform vendor, also rolled out its AI Engine, integrated into its premium 800-series chipsets. More recently, Qualcomm has focused on on-device generative AI, powered by the NPU in the Snapdragon 8 Gen 3. MediaTek and Samsung have also followed suit by baking NPUs into their latest offerings.

© Indian Express.

Note: This news is only for students, for the purpose of enhancing their knowledge. This news is collected from several companies; the copyrights of this news also belong to those companies, such as BBC, CNN, Times of India, Reuters, The Verge, Indian Express, TechCrunch, News18, Mint, Hindustan Times, Business Today, Techgig, etc.