Qualcomm Inc. kicked off its biggest stock rally since 2019 after unveiling chips and computers for the lucrative AI data center market, aiming to challenge Nvidia Corp. in the fastest-growing part of the industry.
The company’s new AI200 lineup will start shipping next year, Qualcomm said on Monday. The first customer will be Saudi Arabia’s AI startup Humain, which plans to deploy 200 megawatts’ worth of computing systems based on the chips starting in 2026.
Qualcomm is trying to break into the market for AI accelerators, which are used to create and run artificial intelligence models. It’s an area that’s already transformed the semiconductor industry, with hundreds of billions of dollars being spent on data centers to power AI software and services. The torrid growth has turned market leader Nvidia into the world’s most valuable company.
Qualcomm, the largest maker of smartphone processors, is looking to gain a foothold in this market with a different approach. It argues that new memory-related capabilities and the power efficiency of its designs, which trace their roots to mobile device technology, will attract customers despite its relatively late entry.
Shares of Qualcomm climbed as much as 22% to $205.95. Arm Holdings Plc, which develops some of the underlying technology used by Qualcomm, gained as well. It was up 4.9% to $179.01 as of 11:54 a.m. in New York.
The Humain deal suggests there’s “early traction” for Qualcomm’s new AI accelerators, Bloomberg Intelligence analysts Kunjan Sobhani and Oscar Hernandez Tejada said in a note.
“Though it’s too soon to call this a serious challenge to Nvidia’s dominance, even modest share gains in the over $500 billion AI accelerator market could translate into billions in incremental revenue,” they said.
Qualcomm’s AI200 product will be offered in a range of forms: as a standalone component, as cards that can be added to existing machines, or as part of a full rack of servers supplied by Qualcomm. Those debut products will be followed by the AI250 in 2027, the San Diego-based company said.
If supplied only as a chip, the component could work inside gear that’s based on processors from Nvidia or other rivals. As a full server, it will compete with offerings from those chipmakers.
The new offerings are built around a neural processing unit, a type of chip that debuted in smartphones and is designed to speed up AI-related workloads without killing battery life. That capability has been developed further through Qualcomm’s move into laptop chips and has now been scaled up for use in the most powerful computers.
Under Chief Executive Officer Cristiano Amon, Qualcomm is trying to diversify away from its dependence on smartphones, which are no longer increasing sales as quickly as they once did. The company has branched out into chips for cars and PCs, but is only now offering a product in what’s become the biggest single market for processors.
Qualcomm has been “quiet in this space, taking its time and building its strength,” according to Durga Malladi, a company senior vice president. Qualcomm is in talks with all of the biggest buyers of such chips on deploying server racks based on its hardware, he said.
Winning orders from companies such as Microsoft Corp., Amazon.com Inc. and Meta Platforms Inc. would offer a significant new revenue source for Qualcomm. The company has posted solid, profitable growth over the last two years, but investors have favored other tech stocks. Qualcomm shares had gained 10% in 2025 through the end of last week, lagging behind a 40% surge by the Philadelphia Stock Exchange Semiconductor Index.
Nvidia, which remains atop the AI computing world, is on target to generate more than $180 billion in revenue from its data center unit this year, more than any other chipmaker – including Qualcomm – will get in total, according to estimates.
Qualcomm’s new chips will offer an unprecedented amount of memory, according to Malladi. They’ll have as much as 768 gigabytes of low-power dynamic random access memory.
Memory access speed and capacity are crucial to how quickly the chips can make sense of the mountains of data handled in AI work. The new chips and computers are aimed at inference, the computing required to run AI services once the large language models that underpin them have been trained.
AI growth also could help Qualcomm offset a loss of business from Apple Inc., according to Bloomberg Intelligence. After years of generating about 20% of Qualcomm’s revenue, the iPhone maker is switching to in-house chips.