22V Research · Jordi Visser

The Inference Inflection — Thematic Basket

A 22-company equal-weight basket spanning chips, memory, networking, data centers, cloud, and edge/embodied-AI enablers — Visser's bet that AI workloads are pivoting from training to real-time inference at scale.

Source report: "The Inference Inflection: Where Real-Time AI Meets Real-World Opportunity"
Published: May 15, 2025
Constituents: 22 (equal-weight)
Archived from ai.22vresearch.com

AI Compute & Inference Chips

GPUs, accelerators, and edge inference silicon.

  • NVDA
    NVIDIA
    Market leader in training & inference GPUs; inference now driving data-center growth.
  • AMD
    Advanced Micro Devices
Gaining momentum with MI300 accelerators and distributed inference deployments.
  • INTC
    Intel
    Gaudi AI chips gaining traction; Ethernet & inference-optimized silicon on roadmap.
  • QCOM
    Qualcomm
    Dominant at the edge — phones, automotive, IoT — with custom low-power inference chips.
  • LSCC
    Lattice Semiconductor
    FPGA-based low-power inference plays at the edge.

Memory & Storage

HBM, DRAM, and edge memory beneficiaries.

  • MU
    Micron Technology
    Leader in HBM3E and low-latency DRAM used for inference workloads.
  • WDC
    Western Digital
Supplies storage for inference-related data-center demand.
  • STX
    Seagate Technology
    Supports growing edge/cloud storage requirements tied to inference.

Networking & Optics

Ultra-low-latency, high-bandwidth connectivity for inference.

  • CIEN
    Ciena Corporation
    Optical backbone provider; strong AI data-center build exposure.
  • ANET
    Arista Networks
    Top high-speed networking player for hyperscalers deploying inference.
  • MRVL
    Marvell Technology
    Custom silicon and optical interconnects powering inference networks.
  • AVGO
    Broadcom
    Dominant in networking chips, PCIe, and switching — core enabler of inference scale-out.

Data Centers & Infrastructure

Capacity, power, and cooling for inference-heavy facilities.

  • EQIX
    Equinix
Hyperscaler growth and latency-sensitive inference workloads driving record colocation demand.
  • DLR
    Digital Realty
    Real-time AI workloads shifting inference into new facilities.
  • VRT
    Vertiv Holdings
    Power and cooling infrastructure for inference-heavy data centers.

AI-Optimized Cloud Providers

Large-scale inference customers and usage-based cloud beneficiaries.

  • MSFT
    Microsoft
Azure's 100-trillion-token quarter is the proof point for inference demand.
  • GOOG
    Alphabet
    Google Cloud inference usage expanding fast.
  • AMZN
    Amazon
    AWS seeing a shift toward inference-driven demand.

Edge & Embodied AI Enablers

Robotics, smart devices, full self-driving (FSD), and the next leg of inference.

  • AMBA
    Ambarella
    Edge AI chips used in automotive and vision systems.
  • ON
    ON Semiconductor
    Inference-capable image sensing and power semis for AI-native systems.
  • NXPI
    NXP Semiconductors
Automotive AI and edge-inference control silicon.
  • SYNA
    Synaptics
    AI-powered interfaces and inference at the edge — phones, consumer, IoT.

Why this basket

Visser's thesis: training-era AI narratives have run their course. The next leg of value creation is inference at scale — perpetual, usage-driven workloads that grow with every user interaction, every agent call, every embodied-AI decision. The basket is intentionally semiconductor-heavy: an equal-weight chart of these names closely tracks the SOX index.
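For readers unfamiliar with the construction, equal weighting simply means each of the 22 names gets a 1/N share, so the basket's return is the arithmetic average of constituent returns. A minimal sketch, using hypothetical per-ticker returns purely for illustration (these are not figures from the report):

```python
# Illustrative only: the returns below are hypothetical, not from the report.
# Mapping of ticker -> total return over some period (0.40 = +40%).
hypothetical_returns = {
    "NVDA": 0.40, "AMD": 0.10, "MU": 0.25, "ANET": 0.15,
}

def equal_weight_return(returns: dict[str, float]) -> float:
    """Each constituent gets weight 1/N, so the basket return
    is the simple average of the constituent returns."""
    return sum(returns.values()) / len(returns)

print(f"{equal_weight_return(hypothetical_returns):.2%}")  # prints 22.50%
```

The same averaging, applied to all 22 constituents over the measurement window, is what produces the basket-vs-SPX comparison cited below.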

Each name covers a distinct slice of the inference stack so the basket captures the full buildout: silicon → memory → fabric → real estate → cloud → edge.

Companies: 22
Sub-themes: 6
Basket return: ~26% vs ~13% SPX*
Publish date: May 15, 2025

*Equal-weight performance vs. SPX cited by Visser in the "Oracle, Inference, and the Persistence of Disbelief" follow-up report.