The Artificial Intelligence (AI) Inference Market Could Reach $255 Billion by 2030. These Stocks Are Best Positioned to Win.
The early innings of the artificial intelligence (AI) infrastructure buildout have been dominated by training, as companies rush to create the best AI models. However, according to reports, the AI inference market could climb from around $106 billion to nearly $255 billion by 2030.
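For context, that forecast implies a steep compound growth rate. The base year for the roughly $106 billion figure isn't stated in the reports, so the quick sanity check below assumes it refers to 2025 (five years of growth to 2030):

```python
# Sanity-check the implied growth rate of the inference-market forecast.
# Assumption: the ~$106B starting figure refers to 2025, so the market
# would compound over five years to reach ~$255B by 2030.

def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate."""
    return (end / start) ** (1 / years) - 1

rate = cagr(106, 255, 5)
print(f"Implied CAGR: {rate:.1%}")  # roughly 19% per year
```

If the $106 billion figure instead refers to an earlier year, the implied annual growth rate would be somewhat lower, but still well into double digits.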
Let’s look at three stocks set to benefit from this upward trend.
Nvidia
While Nvidia (NVDA +1.44%) is known for its dominance in large language model (LLM) training, the company is also the leader in AI inference. Through Nvidia NIM (Nvidia Inference Microservices), the company offers prebuilt, optimized inference microservices. Meanwhile, its Blackwell GB300 Ultra graphics processing units (GPUs) have been optimized for inference and agentic AI, while its upcoming Vera Rubin platform is expected to continue to improve upon its inference performance.
Key data points (NASDAQ: NVDA): price $195.62 (+1.44%, +$2.77); market cap $4.7T; day's range $193.80-$197.62; 52-week range $86.62-$212.19; volume 6.6M (avg. 170M); gross margin 70.05%; dividend yield 0.02%.
However, it is the company's acquisition of Groq's employees and licensing of its technology that could really set it up to be an AI inference winner. Groq, an AI chip start-up (not to be confused with xAI's Grok chatbot), developed a new type of chip called the language processing unit (LPU), designed specifically for AI inference. Nvidia plans to integrate this technology with its CUDA software platform and networking infrastructure to improve its inference offering. As such, I wouldn't overlook Nvidia in the inference market, where it should continue to be a winner.
Advanced Micro Devices
Since Nvidia's CUDA moat isn't as wide in inference as it is in training, this opens the door for Advanced Micro Devices (AMD -1.37%) to take some share. The company has already done a nice job carving out a niche in the inference market, so the overall growth of the market should benefit it, especially given its much smaller revenue base than Nvidia's.
Key data points (NASDAQ: AMD): price $210.92 (-1.37%, -$2.92); market cap $349B; day's range $210.33-$216.67; 52-week range $76.48-$267.08; volume 1.7M (avg. 36M); gross margin 45.99%.
Meanwhile, AMD is set to benefit from an investment by OpenAI and a commitment from the start-up to use 6 gigawatts worth of its GPUs. With 1 gigawatt of chips worth roughly $35 billion (based on Nvidia GPU pricing), that commitment is a major upcoming growth driver for the company. OpenAI will use the chips specifically for inference, which could open the door to inference deals with other customers as well.
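A back-of-the-envelope sketch of that opportunity, assuming the roughly $35 billion-per-gigawatt figure (an extrapolation from Nvidia pricing; AMD's actual pricing may differ) applies across the full commitment:

```python
# Rough sizing of the OpenAI-AMD GPU commitment, using the article's figures.
# Assumption: ~$35B of chip revenue per gigawatt of deployed capacity
# (extrapolated from Nvidia GPU pricing; AMD's actual pricing may differ).

GIGAWATTS_COMMITTED = 6
REVENUE_PER_GW_BILLIONS = 35

total_opportunity_billions = GIGAWATTS_COMMITTED * REVENUE_PER_GW_BILLIONS
print(f"Implied revenue opportunity: ~${total_opportunity_billions}B")  # ~$210B
```

Even if the per-gigawatt figure for AMD hardware comes in meaningfully lower, the deal would still dwarf the company's current data center revenue run rate.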
Also not to be overlooked in the AMD story is the importance of central processing units (CPUs) when it comes to agentic AI. CPUs handle the general-purpose logic that coordinates GPU workloads, and AI agents lean heavily on that kind of processing, making CPUs an increasingly important part of the AI infrastructure story. Between rising AI inference demand and growing data center CPU demand, AMD looks well positioned for the future.
Broadcom
As companies look to reduce AI infrastructure compute costs, they have been increasingly turning to AI ASICs (application-specific integrated circuits). ASICs are custom chips hardwired for specific tasks; as such, they tend to perform those tasks very well while also being more energy-efficient. This matters most for inference, since inference is an ongoing cost: power is consumed every time a model answers a query or completes a task.
As a leader in ASIC technology, Broadcom (AVGO +2.05%) is one of the best ways to play this trend. The company provides the intellectual property and design services that help customers turn their chip designs into physical silicon. It also maintains important relationships with memory makers and foundries, securing components and manufacturing capacity so that these chips can be produced at scale.
Key data points (NASDAQ: AVGO): price $332.16 (+2.05%, +$6.67); market cap $1.5T; day's range $329.30-$335.89; 52-week range $138.10-$414.61; volume 759K (avg. 31M); gross margin 64.71%; dividend yield 0.74%.
Broadcom helped Alphabet design its highly regarded tensor processing units (TPUs), and this alone is a big opportunity, especially as Alphabet is now letting customers deploy TPUs through Google Cloud. Anthropic has already placed a $21 billion TPU order with Broadcom for this year, and a nice chunk of Alphabet's approximately $180 billion in capital expenditures this year will likely go to TPUs as well. Meanwhile, the company is bringing in new ASIC customers, including OpenAI, which has committed to 10 gigawatts worth of chips.
With the inference market set to surge, Broadcom looks poised to be one of the biggest winners in the chip space.