Quick Specs
- Product Name: NVIDIA A100 40GB PCIe GPU
- Model: A100
- GPU Memory: 40 GB HBM2e
- Memory ECC: Yes
- Memory Bandwidth: 1935 GB/sec
- Max Power Consumption: 300W
- Interface: PCIe Gen4 x16 / NVLink bridge capable
- Interconnect Bandwidth: 64 GB/sec (PCIe 4.0)
- Slot Width: Dual Width (DW)
- Card Dimensions: Full Height, Full Length (FHFL)
- Auxiliary Power: 8-pin CPU-style connector
- Workloads: HPC, AI, Database Analytics
Model Identifier:
- A100 40GB PCIe
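
The Quick Specs above can be cross-checked against what the driver actually reports. The snippet below is a minimal sketch, assuming the CUDA Toolkit is installed and the A100 is device 0 (both assumptions, not part of the spec); it calls `cudaGetDeviceProperties` and prints the fields that correspond to the memory, ECC, and PCIe entries listed here. Build with `nvcc`, e.g. `nvcc check_props.cu -o check_props`.

```cpp
// Minimal sketch: query the device properties reported by the CUDA runtime
// and compare them against the Quick Specs table above.
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    cudaDeviceProp prop;
    // Device index 0 is an assumption; adjust for multi-GPU systems.
    cudaError_t err = cudaGetDeviceProperties(&prop, 0);
    if (err != cudaSuccess) {
        std::fprintf(stderr, "cudaGetDeviceProperties failed: %s\n",
                     cudaGetErrorString(err));
        return 1;
    }
    std::printf("Name:           %s\n", prop.name);
    std::printf("Global memory:  %.1f GB\n", prop.totalGlobalMem / 1e9);
    std::printf("ECC enabled:    %s\n", prop.ECCEnabled ? "yes" : "no");
    std::printf("Memory bus:     %d-bit\n", prop.memoryBusWidth);
    std::printf("PCI bus:device: %d:%d\n", prop.pciBusID, prop.pciDeviceID);
    return 0;
}
```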
The NVIDIA A100 40GB PCIe GPU delivers transformative acceleration for AI, data analytics, and high-performance computing (HPC) workloads. Built on the NVIDIA Ampere architecture, it features 40 GB of high-bandwidth HBM2e memory and can be partitioned into up to seven fully isolated GPU instances using Multi-Instance GPU (MIG) technology.
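
Before a workload targets a MIG instance, it can be useful to confirm that MIG mode is actually enabled on the card. The following is a minimal sketch, assuming the NVML library that ships with the NVIDIA driver and GPU index 0 (both assumptions); it only reads the current and pending MIG mode and does not change any configuration. Link against NVML, e.g. `nvcc mig_check.cpp -lnvidia-ml`.

```cpp
// Minimal sketch: report whether MIG mode is enabled on GPU 0 via NVML.
#include <cstdio>
#include <nvml.h>

int main() {
    nvmlReturn_t ret = nvmlInit();
    if (ret != NVML_SUCCESS) {
        std::fprintf(stderr, "nvmlInit failed: %s\n", nvmlErrorString(ret));
        return 1;
    }
    nvmlDevice_t dev;
    unsigned int current = 0, pending = 0;
    // GPU index 0 is an assumption; adjust for multi-GPU systems.
    ret = nvmlDeviceGetHandleByIndex(0, &dev);
    if (ret == NVML_SUCCESS)
        ret = nvmlDeviceGetMigMode(dev, &current, &pending);
    if (ret == NVML_SUCCESS)
        std::printf("MIG mode: current=%s, pending=%s\n",
                    current == NVML_DEVICE_MIG_ENABLE ? "enabled" : "disabled",
                    pending == NVML_DEVICE_MIG_ENABLE ? "enabled" : "disabled");
    else
        std::fprintf(stderr, "MIG query failed: %s\n", nvmlErrorString(ret));
    nvmlShutdown();
    return 0;
}
```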
With memory bandwidth of up to 1935 GB/sec and a maximum power consumption of 300W, the A100 PCIe model provides elastic performance for demanding data center environments. Its dual-slot, full-height, full-length form factor and PCIe Gen4 x16 interface ensure broad compatibility in enterprise-scale deployments.
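
Note that the 1935 GB/sec figure is on-card memory bandwidth; host-to-device traffic is bounded by the PCIe Gen4 x16 link (64 GB/sec interconnect bandwidth), so the rate a given server achieves is worth measuring. The sketch below is a minimal example under assumed conditions (CUDA Toolkit installed, device 0, a single 1 GiB pinned-memory transfer) that times one host-to-device copy; a production benchmark would average many transfers of varying sizes.

```cpp
// Minimal sketch: time a pinned host-to-device copy to estimate
// effective PCIe throughput on this system. Build with nvcc.
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    const size_t bytes = 1ull << 30;  // 1 GiB transfer (illustrative size)
    void *host = nullptr, *dev = nullptr;
    // Pinned host memory is required to approach full PCIe throughput.
    if (cudaMallocHost(&host, bytes) != cudaSuccess ||
        cudaMalloc(&dev, bytes) != cudaSuccess) {
        std::fprintf(stderr, "allocation failed\n");
        return 1;
    }

    cudaEvent_t start, stop;
    cudaEventCreate(&start);
    cudaEventCreate(&stop);

    cudaMemcpy(dev, host, bytes, cudaMemcpyHostToDevice);  // warm-up copy
    cudaEventRecord(start);
    cudaMemcpy(dev, host, bytes, cudaMemcpyHostToDevice);  // timed copy
    cudaEventRecord(stop);
    cudaEventSynchronize(stop);

    float ms = 0.0f;
    cudaEventElapsedTime(&ms, start, stop);
    std::printf("Host-to-device: %.1f GB/s\n", (bytes / 1e9) / (ms / 1e3));

    cudaEventDestroy(start);
    cudaEventDestroy(stop);
    cudaFree(dev);
    cudaFreeHost(host);
    return 0;
}
```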