
GB10181N

Unrivaled Air-Cooled HGX H100 8-GPU AI Server for Ultimate AI Performance
Supports NVIDIA HGX H100 8-GPU
Deploys the Latest NVIDIA NVLink® & NVSwitch™ Technology
Two-layer GPU & IO Sleds for Easy Scalability and Better Thermal Efficiency
Supports 20 x PCIe 5.0 Slots & 32 x NVMe Drive Bays
Air Cooling Solution
Powered by NVIDIA

Application

Generative AI

Large Language Model (LLM)

Hyperscale Data Center

High Performance Computing

AI and Machine Learning


Unrivaled HGX H100 8-GPU AI Server

Purpose-built to accelerate generative AI innovations, the AI server leverages the NVIDIA HGX H100 baseboard with eight Tensor Core GPUs, each connected through extremely fast fourth-generation NVIDIA NVLink, to deliver 32 petaflops of performance for large-scale AI models, complex simulations, and massive datasets.
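As a quick sanity check, the sketch below (an illustrative example, not vendor software) assumes a Linux host with the NVIDIA driver and PyTorch installed. It confirms that all eight GPUs are visible and that every GPU pair reports peer access, as expected on an NVLink/NVSwitch-connected HGX baseboard.

```python
# Minimal sketch: verify that eight GPUs are visible and that all GPU
# pairs report peer access (expected on an NVSwitch-connected HGX board).
# Uses only the public torch.cuda API; device names depend on the driver.
import torch

def check_hgx_topology(expected_gpus: int = 8) -> None:
    count = torch.cuda.device_count()
    print(f"Visible GPUs: {count} (expected {expected_gpus})")
    for i in range(count):
        print(f"  GPU {i}: {torch.cuda.get_device_name(i)}")
    for i in range(count):
        for j in range(count):
            if i != j and not torch.cuda.can_device_access_peer(i, j):
                print(f"  Warning: GPU {i} cannot peer-access GPU {j}")

if __name__ == "__main__":
    check_hgx_topology()
```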
 
 

Innovative Disaggregated Approach

The AI server can be paired with a single CPU head node server, creating a unified AI solution. This disaggregated GPU and CPU approach enhances flexibility, enabling you to select the CPU head node server that best suits your needs.
(Ingrasys head node solution: SV2121A/SV2121I)
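In this class of disaggregated design the GPU sled typically attaches to the head node's PCIe domain, so the GPUs enumerate as local devices on whichever head node is paired with it. A minimal sketch, assuming a Linux head node with the NVIDIA driver installed (the query fields are standard nvidia-smi options), lists each GPU with its PCI bus ID:

```python
# Minimal sketch: from the head node, list the GPUs of the attached GPU
# sled with their PCI bus IDs to confirm they enumerate as local devices.
import subprocess

def list_gpu_bus_ids() -> None:
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=index,name,pci.bus_id",
         "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    )
    for line in out.stdout.strip().splitlines():
        print(line)

if __name__ == "__main__":
    list_gpu_bus_ids()
```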
 

 


High Expansion Capability

Adopting a one-GPU-to-one-NIC topology, the server offers high expansion capability by accommodating 8 NICs, enabling fast GPU-to-GPU interconnection within a GPU cluster. In addition, its two-layer design scales easily up to 32 NVMe drives and enhances thermal efficiency, allowing the server to deliver unprecedented performance and agility; a multi-node sketch follows below.
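
To take advantage of the one-GPU-to-one-NIC layout in a multi-node training job, each GPU is usually driven by its own process, with NCCL selecting the NIC closest to each GPU from the PCIe topology. A minimal sketch, assuming PyTorch launched with torchrun (which sets LOCAL_RANK) and InfiniBand/RoCE HCAs using the common "mlx5" name prefix (an assumption; check the actual device names on the system):

```python
# Minimal sketch: one worker process per GPU on an 8-GPU node with a
# one-GPU-to-one-NIC layout. The "mlx5" HCA prefix is an assumption;
# inspect `ibdev2netdev` or `nvidia-smi topo -m` for the real mapping.
import os
import torch
import torch.distributed as dist

def init_worker() -> int:
    local_rank = int(os.environ["LOCAL_RANK"])  # set by torchrun
    torch.cuda.set_device(local_rank)           # pin this process to its GPU

    # Restrict NCCL to the InfiniBand HCAs; with one NIC per GPU, NCCL
    # picks the HCA nearest each GPU based on detected PCIe topology.
    os.environ.setdefault("NCCL_IB_HCA", "mlx5")

    dist.init_process_group(backend="nccl")     # rendezvous via torchrun env vars
    return local_rank
```

Each node would then launch eight such workers, e.g. with `torchrun --nproc_per_node=8 train.py`.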
 

Certification

FCC
RCM
BSMI
CB
CE
UL
Supported GPU: NVIDIA HGX H100 8-GPU
Expansion Slots: 20 x PCIe 5.0 x16 Slots
Storage: 32 x Hot-swap U.2 NVMe Drive Bays
Front Panel: 1 x Power LED, 1 x UID LED, 1 x Attention LED; 1 x 1GbE/RJ45 Management Port; 1 x RJ45 Console Port
Form Factor: 10U Rackmount (4U IOB Sled with 4U GPU Sled)
Chassis Dimensions (H x W x D): 17.3" x 20.0" x 37.3" / 440.5 mm x 508.0 mm x 948.8 mm
Management: 1 x ASPEED AST2600
Power Supply: 5+5 Redundant 3000W Platinum Power Supplies
Fans: 24 x 80 x 80 mm for N+1 Cooling Redundancy
Certification: CE / FCC / RCM / BSMI / UL / IECEE CB
Operating Temperature: 10°C to 35°C (50°F to 95°F)
Non-operating Temperature: -40°C to 60°C (-40°F to 140°F)
Operating Relative Humidity: 8% to 85% RH
Non-operating Relative Humidity: 5% to 95% RH

Related Products

GB10181A • AMD
Category: AI Accelerators

AI-Optimized GPU Server Powered by AMD Instinct™ MI300X Platform

8 x MI300X GPU OAM Modules
20 x PCIe Gen 5.0 x16 Slots
32 x U.2 NVMe Drive Bays
Air Cooling Solution
10U

GB10281N • NVIDIA
Category: AI Accelerators

HGX H200 8-GPU AI Supercomputing Server to Bring Next-Level GPU Performance

8 x NVIDIA HGX H200 GPUs
20 x PCIe Gen5 x16 Slots
32 x U.2 NVMe Drive Bays
Air Cooling Solution
10U

