Inference becomes a driver
Most AI-related investment has focused on supporting the hyperscalers’ internal frontier models. These models let the hyperscalers vertically integrate AI capabilities, reduce dependence on third-party labs, optimize their massive compute infrastructure, and secure a competitive edge in AI services.
Dell’Oro noted that while most AI investment is currently dedicated to training workloads, inference will likely become a larger capex driver, especially as token-hungry reasoning models become more widespread.
“This heightened level of investment raises the potential for overcapacity in AI infrastructure, although hyperscalers are taking proactive measures to mitigate risks and optimize costs,” Fung said.
Dell and Supermicro remain dominant
From a vendor perspective, Dell led all OEMs in AI-optimized server revenue in 2025, followed by Supermicro, driven by strong shipments of NVIDIA Blackwell-based systems.
Meanwhile, white-box vendors captured the majority of server shipments, supported by hyperscale AI deployments of Blackwell and custom systems, as well as a surge in demand for general-purpose servers for compute and storage workloads.
Dell’Oro forecasts that average selling prices for general-purpose servers will rise by high double-digit percentages in 2026, with escalating DRAM and storage prices as the primary driver.