The race for more powerful and efficient AI hardware surged ahead this week with Intel and Google announcing new chips to help them become less reliant on NVIDIA tech.
It seems like every week new AI models are being released. Behind each release are weeks of training in cloud computing data centers, most of which are powered by NVIDIA GPUs.
Intel and Google both announced new in-house AI chips that can train and deploy big AI models faster while using less power.
Intel’s Gaudi 3 AI accelerator chip
Intel is probably better known for the chips that power your PC, but on Tuesday the company announced its new AI chip, called Gaudi 3.
NVIDIA’s H100 GPUs have formed the bulk of AI data centers, but Intel says Gaudi 3 delivers “50% on average better inference and 40% on average better power efficiency than Nvidia H100 – at a fraction of the cost.”
A big contributor to the power efficiency of Gaudi 3 is that Intel used Taiwan Semiconductor Manufacturing Co.’s 5nm process to make the chips.
Intel didn’t give any pricing information, but when asked how Gaudi 3 compares with NVIDIA’s products, Das Kamhout, VP of Xeon software at Intel, said, “We do expect it to be highly competitive.”
Dell Technologies, Hewlett Packard Enterprise, Lenovo, and Supermicro will be the first to deploy Gaudi 3 in their AI data centers.
Intel CEO Pat Gelsinger summarized the company’s AI ambitions, saying, “Intel is bringing AI everywhere across the enterprise, from the PC to the data center to the edge.”
The #IntelGaudi 3 #AI accelerator offers a highly competitive alternative to NVIDIA’s H100 with higher performance, increased scalability, and PyTorch integration. Explore more key product benefits. https://t.co/sXdQKjYFw0
— Intel AI (@IntelAI) April 9, 2024
Google’s Arm and TPU upgrades
On Tuesday Google announced its first custom Arm-based CPU, which it plans to use to power its data centers. The new chip, dubbed Axion, is a direct competitor to Intel’s and AMD’s CPUs.
Google claims Axion delivers “30% better performance than the fastest general-purpose Arm-based instances available in the cloud today, up to 50% better performance and up to 60% better energy-efficiency than comparable current-generation x86-based instances.”
Google’s new Arm-based Axion CPU. Source: Google
Google has been moving several of its services, like YouTube and Google Earth, to its current-generation Arm-based servers, which will soon be upgraded with Axion chips.
Having a powerful Arm-based option makes it easier for customers to migrate their CPU-based AI training, inferencing, and other applications to Google’s cloud platform without having to redesign them.
For large-scale model training, Google has largely relied on its TPU chips as an alternative to NVIDIA’s hardware. These are also being upgraded, with a single new TPU v5p pod containing more than double the number of chips of a current TPU v4 pod.
TPU v5p, our most powerful and scalable TPU, is now generally available! #GoogleCloudNext
— Google Cloud Tech (@GoogleCloudTech) April 9, 2024
Google isn’t looking to sell either its new Arm chips or its TPUs. Instead, the company aims to drive its cloud computing services rather than become a direct hardware competitor to NVIDIA.
The upgraded TPUs will provide a boost to Google’s AI Hypercomputer, which enables large-scale AI model training. The AI Hypercomputer also uses NVIDIA H100 GPUs, which Google says will soon be replaced with NVIDIA’s new Blackwell GPUs.
The demand for AI chips isn’t likely to slow down anytime soon, and the market is looking less like a one-horse NVIDIA race than it did before.