AMD introduces new AI networking solutions with the Pollara 400 AI NIC and Salina DPU

At its Advancing AI 2025 event, AMD unveiled new networking technologies designed for data centers running AI workloads: the Pollara 400 AI NIC and the Salina 400 DPU.
The Pollara 400 AI NIC is designed to accelerate GPU-to-GPU communication in distributed clusters. It supports the RCCL collective communication library and Ultra Ethernet Consortium (UEC) congestion control.
According to AMD's internal testing, Pollara delivers up to 25% higher performance than comparable NICs and reduces infrastructure costs by 16% compared with InfiniBand-based networks.
Additionally, its programmable architecture allows network operations to run as part of AI workloads without relying on proprietary interconnect infrastructure.
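As a rough illustration of the collective traffic such a NIC is built to carry, the sketch below runs an all-reduce through PyTorch's distributed API, which dispatches to RCCL on ROCm builds. It is a generic data-parallel example under assumed launch settings, not AMD- or Pollara-specific code; the tensor size and script name are hypothetical.

```python
# Minimal sketch of the collective pattern (all-reduce) that libraries like
# RCCL implement and that AI NICs carry between nodes. Generic PyTorch
# example; nothing here is Pollara-specific.
import os

import torch
import torch.distributed as dist


def main():
    # torchrun supplies RANK, WORLD_SIZE, MASTER_ADDR, etc.; the "nccl"
    # backend name maps to RCCL on ROCm builds of PyTorch.
    dist.init_process_group(backend="nccl")
    rank = dist.get_rank()
    local_rank = int(os.environ.get("LOCAL_RANK", "0"))
    torch.cuda.set_device(local_rank)  # torch.cuda maps to HIP devices on ROCm

    # Each rank contributes a gradient-like tensor; all_reduce sums it across
    # every GPU in the job, the core collective in data-parallel training.
    grads = torch.ones(1024, device="cuda") * (rank + 1)
    dist.all_reduce(grads, op=dist.ReduceOp.SUM)

    print(f"rank {rank}: reduced element = {grads[0].item()}")
    dist.destroy_process_group()


if __name__ == "__main__":
    main()
```

Launched with, for example, torchrun --nproc_per_node=8 allreduce_sketch.py (the file name is assumed), every rank ends up with the same summed tensor; at multi-node scale, keeping this exchange fast and uncongested is the kind of job Pollara's RCCL support and UEC congestion control are aimed at.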
The Salina 400 DPU delivers up to 400 Gbps of bandwidth, enabling network, security, and storage functions to be accelerated up to 40 times compared to CPU-based solutions. It also incorporates error recovery and redundancy mechanisms that improve system availability by up to 10%.
These solutions are already operational in Oracle Cloud Infrastructure and Microsoft Azure data centers, validating their use in production environments.