| Specification | Value |
| --- | --- |
| Protocol | 100 Gigabit Ethernet, 100 Gigabit InfiniBand |
| Speed | 100 Gbps |
| Interface | PCI Express 3.0 x16 |
- Industry-leading throughput with low latency, low CPU utilization, and a high message rate
- Innovative rack design for storage and ML based on Host Chaining technology
- Smart interconnect for x86, Power, ARM, and GPU-based compute and storage
- Advanced storage capabilities including NVMe over Fabric offloads
- Cutting-edge performance in virtualized networks including NFV
- Enabler for efficient service chaining capabilities
- Enhanced vSwitch offloads
- Adaptive routing on reliable transport
- NVMe over Fabric (NVMf) target offloads
- Hardware offloads for NVGRE- and VXLAN-encapsulated traffic (see the verification sketch after this list)
- End-to-end QoS and congestion control
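On Linux, tunnel offloads such as the VXLAN/NVGRE support above are exposed as standard netdev features that can be inspected with `ethtool -k`. As a minimal sketch, assuming a ConnectX-5 port named `enp1s0f0` and a kernel that reports the usual `tx-udp_tnl-*` feature strings (both the interface name and the exact feature names may vary on a given system), the following Python snippet parses the `ethtool -k` output to check whether tunnel offloads are active:

```python
# Hypothetical sketch: check VXLAN/NVGRE tunnel-offload state via `ethtool -k`.
# The interface name and feature strings below are assumptions, not guaranteed
# to match every kernel or driver version.
import subprocess


def offload_features(interface: str) -> dict[str, bool]:
    """Parse `ethtool -k <interface>` into a {feature_name: enabled} map."""
    out = subprocess.run(
        ["ethtool", "-k", interface],
        capture_output=True, text=True, check=True,
    ).stdout
    features = {}
    for line in out.splitlines()[1:]:  # skip the "Features for ..." header
        if ":" in line:
            name, state = line.split(":", 1)
            # State reads "on", "off", or e.g. "on [fixed]".
            features[name.strip()] = state.strip().startswith("on")
    return features


if __name__ == "__main__":
    feats = offload_features("enp1s0f0")  # assumed ConnectX-5 port name
    # UDP tunnel segmentation offload covers VXLAN/NVGRE-encapsulated traffic.
    for feature in ("tx-udp_tnl-segmentation", "tx-udp_tnl-csum-segmentation"):
        print(feature, "->", "on" if feats.get(feature) else "off/unsupported")
```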
NVIDIA Mellanox ConnectX-5 adapters provide advanced hardware offloads that reduce CPU resource consumption and deliver extremely high packet rates and throughput. This increases the efficiency of data-center infrastructure and provides the highest-performing, most flexible solution for Web 2.0, cloud, data-analytics, and storage platforms.