Watch this deep-dive expert call hosted by AlphaSense Partner Rihard Jarc, featuring a former Director of Strategic Alliances at NVIDIA (AI Enterprise, AI ISVs & vGPU). The conversation offers rare insight into Nvidia's competitive positioning in the rapidly expanding AI inference market.
While Nvidia dominates AI training, inference workloads are growing even faster — and they come with different demands. In this session, the expert and Rihard Jarc explore:
- The Inference Opportunity: How big is the AI inference market, and how are workload patterns evolving?
- Inference vs Training Architectures: How test-time compute changes chip requirements, and why HBM capacity and latency matter.
- Nvidia vs ASICs: Who’s winning in inference — and why? Which hyperscalers are best positioned with custom ASICs?
- ASIC Maturity: Comparing key offerings across hyperscalers. What's shipping today, and what's still on the roadmap?
- AMD vs Nvidia: A full-stack comparison — hardware, software, and ecosystem. Is ROCm catching up to CUDA?
- Networking & Interconnects: How Nvidia’s networking advantage could sustain its moat. Can anyone match NVLink and NVSwitch?
- Risks & Bottlenecks: What could slow down AI infrastructure growth — supply chain, power, or software maturity?
If you're serious about understanding Nvidia’s future — and how AMD, Broadcom, and hyperscaler ASICs from Google, Amazon, Microsoft, and Meta fit into the picture — this is a call you don’t want to miss.
Expect sharp analysis, fresh perspectives, and actionable insights.