
Google takes the fight to NVIDIA, launches new ARM-based processor to power data centers

Currently, NVIDIA, Intel, and AMD dominate the data center processor space. NVIDIA, in particular, has a stranglehold on data centers running AI models and ML workloads. However, Google wants a piece of that action.

At the ongoing Cloud Next, Google announced its new ARM-based processor, Axion. Axion is Google’s first ARM-based CPU designed specifically for data centers, and it is built on ARM’s Neoverse V2 CPU cores.

If Google’s own figures are anything to go by, Axion performs about 30 percent better than the fastest general-purpose ARM-based instances available in the cloud today, and about 50 percent better than the most recent comparable x86-based virtual machines.

Google also claims Axion is up to 60 percent more energy efficient than those same x86-based VMs.

Google says it already uses Axion in services such as Bigtable and Google Earth Engine, and it plans to expand its use.

The release of Axion brings Google into direct competition with Amazon, which has led the field of ARM-based CPUs for data centers, along with NVIDIA and Ampere. In the wider x86 processor market, Google is also going up against Intel and AMD.

Amazon’s cloud business, Amazon Web Services (AWS), released its Graviton processor in 2018 and followed up with second and third iterations over the next two years. Meanwhile, NVIDIA announced its first ARM-based data center CPU, Grace, in 2021.

Google started developing its own ARM-based processors years ago, but it has mainly focused on personal devices like smartphones. The first devices to ship with Google’s ARM-based processor were the Pixel 6 and 6 Pro smartphones back in 2021, which used a then-new SoC called Tensor.

Since then, every Pixel device has been powered by an updated iteration of the Tensor SoC. The Tensor SoC takes its name from Google’s Tensor Processing Units, or TPUs, which the company developed for use in its data centers. Google began using TPUs internally in 2015 and made them available to other parties from 2018 onwards.

ARM-based processors often cost less than conventional x86 processors and are usually more energy-efficient. That matters because AI models such as ChatGPT are resource-hungry, consuming enormous amounts of electricity and water. Experts estimate that, by the end of the decade, AI data centers could consume as much as 20 to 25 percent of the US’s current power requirements, which isn’t a sustainable trajectory for AI’s development.
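For a rough sense of scale, here is a back-of-the-envelope sketch of what that 20 to 25 percent range would mean. It assumes current US annual electricity consumption of roughly 4,000 TWh (a figure not given in the article) and reads “power requirements” as annual electricity consumption rather than peak grid capacity:

```python
# Back-of-the-envelope illustration of the 20-25 percent estimate above.
# Assumption (not from the article): current US annual electricity
# consumption is roughly 4,000 TWh.

US_ANNUAL_CONSUMPTION_TWH = 4_000  # assumed rough figure

for share in (0.20, 0.25):
    ai_twh = US_ANNUAL_CONSUMPTION_TWH * share
    print(f"{share:.0%} of ~{US_ANNUAL_CONSUMPTION_TWH} TWh ≈ {ai_twh:,.0f} TWh per year")

# Prints:
# 20% of ~4000 TWh ≈ 800 TWh per year
# 25% of ~4000 TWh ≈ 1,000 TWh per year
```

In other words, under that assumption the estimate implies AI data centers drawing on the order of 800 to 1,000 TWh per year, which helps explain the push toward more efficient ARM-based chips like Axion.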
