
Cisco NVIDIA H100 NVL 94GB x16 PCIe 350W Passive DW FH GPU | UCSX-GPU-H100-NVL


Quick Specs

  • Model: H100 NVL
  • GPU Memory: 94GB
  • Memory Bandwidth: 3.9TB/s
  • Max Power Consumption: 350-400W (configurable)
  • Graphics Bus / System Interface: PCIe Gen5 x16; NVIDIA NVLink: 600GB/s
  • Interconnect Bandwidth: 128 GB/sec (PCIe 5.0)
  • Slot Width: Dual-width (DW)
  • NVIDIA AI Enterprise: Included

Supercharge Large Language Model Inference With H100 NVL

For LLMs of up to 70 billion parameters (e.g., Llama 2 70B), the PCIe-based NVIDIA H100 NVL with NVLink bridge uses the Transformer Engine, NVLink, and 188GB of combined HBM3 memory (94GB per GPU across the bridged pair) to deliver optimal performance and easy scaling across any data center, bringing LLMs to the mainstream. Servers equipped with H100 NVL GPUs increase Llama 2 70B performance by up to 5X over NVIDIA A100 systems while maintaining low latency in power-constrained data center environments.
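
To put the 188GB figure in context, here is a minimal back-of-the-envelope sketch (our own illustrative arithmetic, not a vendor calculation) of how much of a bridged pair's memory a 70-billion-parameter model's weights consume at common inference precisions, with the remainder left for KV cache and activations:

    # Illustrative sizing sketch (assumed figures, not vendor data): estimate the
    # memory needed for a 70B-parameter model's weights and compare it with the
    # 188GB available across an NVLink-bridged pair of H100 NVL GPUs (94GB each).

    PAIR_MEMORY_GB = 2 * 94  # two bridged H100 NVL GPUs

    def weight_memory_gb(params_billions: float, bytes_per_param: float) -> float:
        # Weights only; KV cache and activations need additional headroom.
        return params_billions * bytes_per_param  # 1e9 params * bytes, divided by 1e9 bytes/GB

    for precision, bytes_per_param in [("FP16", 2.0), ("FP8", 1.0)]:
        weights = weight_memory_gb(70, bytes_per_param)
        headroom = PAIR_MEMORY_GB - weights
        print(f"{precision}: ~{weights:.0f}GB of weights, ~{headroom:.0f}GB left for KV cache")

Under these assumptions, FP8 weights (~70GB) fit on a single 94GB card, while FP16 weights (~140GB) need the bridged pair, which is consistent with positioning the NVL pair for 70B-class models.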

Compatible With:
  • Cisco UCS X440p PCIe Node




Welcome to the Request Express Quoting Form!

This streamlined form is designed to simplify the quoting process for you. Whether you're uncertain about specific parts or seeking a hassle-free experience, you're in the right place.
Provide us with the necessary details, and we'll generate a customized quote for your server. Even if you're unsure about the exact components you need, we've got you covered. Our team will tailor a solution based on the information you provide.

Let's get started!
Simply fill out the form, and we'll take care of the rest.


  • Name *
  • Email address *
  • Message *
  • Preferred brand
  • Form factor
  • Rack height (in U)
  • Total memory
  • Total storage
  • Storage type
  • RAID configuration
  • Networking connections needed
  • Rack space restrictions
  • Planned operating system