Nvidia to Enter the $30B Custom Chip Market
Nvidia plans to enter the $30 billion custom chip market with a new business unit dedicated to designing bespoke chips, including advanced AI processors, for cloud computing firms and other clients, several sources familiar with the plans told Reuters.
Nvidia wants to enter the lucrative custom AI chip market to protect itself from the growing number of clients seeking cheaper alternatives to its products. Major clients typically use Nvidia’s all-purpose H100 and A100 chips, but those chips aren’t always the most cost-effective option, with prices ranging from $16,000 to $100,000 apiece.
“If you’re really trying to optimize on things like power, or optimize on cost for your application, you can’t afford to go drop an H100 or A100 in there,” Greg Reichow, general partner at venture capital firm Eclipse Ventures, told Reuters. “You want to have the exact right mixture of compute and just the kind of compute that you need.”
Nvidia’s biggest clients, which include Amazon, Google, Meta, and Microsoft, are already developing or deploying their own AI chips alongside Nvidia’s graphics processing units (GPUs) in their cloud infrastructure.
Nvidia currently dominates the AI chip market with an 80% share. Its market value is up 40% this year to $1.73 trillion, but both figures could erode if clients shift to in-house production or turn to competitors like Broadcom and Marvell Technology. Those two companies already dominate custom silicon design for data centers, a segment that accounts for 5% of global annual chip sales.
Nvidia intends to tap this market by letting clients leverage its in-house intellectual property and technology to accelerate and optimize their custom chip designs, while leaving fabrication to Taiwan Semiconductor Manufacturing Co. (TSMC) or another chipmaker.
Nvidia officials have already met with representatives from Amazon, Google, Microsoft, Meta, and OpenAI to pitch making custom chips for them, Reuters confirmed from two sources familiar with the meetings.
OpenAI CEO Sam Altman, who has long complained about the scarcity of GPUs required to train large language models like the one that powers ChatGPT, might decide to make his own chips. He is reportedly seeking between $5 trillion and $7 trillion to fund a sweeping AI chip-building effort, which could put Nvidia at a disadvantage.
Beyond data center chips, Nvidia is also in talks with potential clients in the automotive, telecom, and video game industries.