Nvidia CEO Says New Rubin Chips Are On Track, Helping Speed AI

Huang’s remarks signal that Nvidia is maintaining its edge as the leading maker of artificial intelligence accelerators, the chips used by data center operators to develop and run AI models.

Jensen Huang was the keynote speaker at CES 2026. (Photo: Bloomberg)
Nvidia Corp.’s highly anticipated new Rubin data center products are nearing release this year, and customers will soon be able to try out the technology, helping speed AI development.

All six of the new Rubin chips are back from manufacturing partners, and they have already passed milestone tests showing they are on track for deployment by customers, Nvidia said. Chief Executive Officer Jensen Huang touted the products during a keynote presentation at the CES trade show in Las Vegas on Monday.

“The race is on for AI,” he said. “Everybody’s trying to get to the next level.”

Huang’s remarks signal that Nvidia is maintaining its edge as the leading maker of artificial intelligence accelerators, the chips used by data center operators to develop and run AI models.

Rubin is Nvidia’s latest accelerator and is 3.5 times better at training and five times better at running AI software than its predecessor, Blackwell, the company said. A new central processing unit has 88 cores — the key data-crunching elements — and provides twice the performance of the component that it’s replacing.

The company is giving details of its new products earlier in the year than it typically does — part of a push to keep the industry hooked on its hardware, which has underpinned an explosion in AI use. Nvidia usually dives into product details at its spring GTC event in San Jose, California.

For Huang, CES is yet another stop on his marathon run of appearances at events, where he’s announced products, tie-ups and investments all aimed at adding momentum to the deployment of AI systems. His counterpart at Nvidia’s closest rival, Advanced Micro Devices Inc.’s Lisa Su, will give a keynote presentation at the show later Monday.

Some on Wall Street have expressed concern that competition is mounting for Nvidia — and that AI spending can’t continue at its current pace. Data center operators are also developing their own AI accelerators. But Nvidia has maintained bullish long-term forecasts that point to a total market in the trillions of dollars.

The new hardware, which also includes networking and connectivity components, will be part of Nvidia’s DGX SuperPod supercomputer while also being available as individual products for customers to use in a more modular way. The step-up in performance is needed because AI has shifted to more specialised networks of models that not only sift through massive amounts of input data but also solve particular problems through multistage processes.

The company emphasised that Rubin-based systems will be cheaper to run than Blackwell versions because they’ll return the same results using fewer components. Microsoft Corp. and other large providers of remote computing will be among the first to deploy the new hardware in the second half of the year, Nvidia said.

For now, the majority of spending on Nvidia-based computers is coming from the capital expenditure budgets of a handful of customers, including Microsoft, Alphabet Inc.’s Google Cloud and Amazon.com Inc.’s AWS. Nvidia is pushing software and hardware aimed at broadening the adoption of AI across the economy, including robotics, health care and heavy industry.

As part of that effort, Nvidia announced a group of tools designed to accelerate development of autonomous vehicles and robots.
