Hi Newsdesk, Akamai, the cybersecurity and cloud computing company, has announced the first global implementation of NVIDIA's AI Grid<https://www.akamai.com/newsroom/press-release/akamai-launches-ai-grid-intelligent-orchestration-for-distributed-inference-across-4400-edge-locations>. The announcement marks a major milestone in the evolution of AI, moving beyond isolated AI factories toward a unified, distributed grid for AI inference.
At NVIDIA GTC 2026 in San Jose, Akamai Technologies announced Akamai Inference Cloud, the first global-scale implementation of NVIDIA AI Grid <https://www.nvidia.com/en-us/glossary/ai-grid/>, a reference architecture designed to distribute AI inference workloads across centralized data centers and edge infrastructure for lower latency and better cost efficiency.
Akamai's platform integrates thousands of GPUs and uses intelligent orchestration to route AI workloads across its network of more than 4,400 edge locations, regional clouds, and core data centers, enabling low-latency, real-time AI applications such as in-game NPC interactions, fraud detection, and live media processing.
The launch builds on a series of recent Akamai announcements:
Expanded GPU capabilities to power a globally distributed AI compute grid for next-generation inference applications (link<https://www.akamai.com/newsroom/press-release/akamai-to-deploy-thousands-of-nvidia-blackwell-gpus-to-create-one-of-the-worlds-most-widely-distributed-ai-platforms>)
A newly secured four-year, $200M service agreement with a major US technology company to deploy NVIDIA Blackwell GPU clusters (link<https://www.akamai.com/newsroom/press-release/akamai-discloses-technical-details-on-ai-cluster-deal>)
Launch of the Inference Cloud platform powered by NVIDIA Blackwell AI architecture (link<https://www.akamai.com/newsroom/press-release/akamai-inference-cloud-transforms-ai-from-core-to-edge-with-nvidia>)