HPE unveils Mod Pod AI 'data center-in-a-box' at Nvidia GTC
Briefly

The article covers Hewlett Packard Enterprise's (HPE) introduction of Mod Pod, a modular data center optimized for AI and high-performance computing (HPC) workloads. Unveiled at Nvidia's GTC conference, Mod Pod is pitched as easy to deploy, requiring minimal changes to existing data center infrastructure. It features liquid cooling technology, which HPE says significantly improves power usage effectiveness (PUE) and reduces total cost of ownership. HPE also announced new capabilities for Private Cloud AI, the product of its partnership with Nvidia, signaling continued investment in AI offerings for businesses.
"A lot of data center space that does exist, does not have the capabilities for liquid cooling, which means you don't have the density in your racks, and you also don't have the PUE (power usage effectiveness)," said HPE CTO Fidelma Russo. "So [Mod Pod] gives you a lower total cost of ownership."
"We have examples of our customers, siting these in parking lots where they used to have employees, but with the work from home from COVID, they have the space," she added. "So again it's easy, [you've] just got to level some space and you can have a data center in your backyard up and running in months."
Mod Pod comes in 6 m and 12 m configurations, and supports up to 1.5 MW per unit with a PUE of under 1.1. While HPE is keen to highlight its liquid-cooling credentials, the unit's Adaptive Cascade Cooling technology can run on either air or liquid cooling, depending on customer need and preference.
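For context, PUE is the ratio of total facility power to IT equipment power, so a PUE under 1.1 means less than 10% of power goes to cooling and other overhead. The sketch below illustrates that arithmetic; it assumes the 1.5 MW per-unit figure refers to IT load, which the article does not specify.

```python
# Illustrative PUE arithmetic only; assumes the quoted 1.5 MW per Mod Pod
# unit is IT load rather than total facility power (not stated in the article).
def total_facility_power(it_load_mw: float, pue: float) -> float:
    """PUE = total facility power / IT equipment power,
    so total facility power = IT load * PUE."""
    return it_load_mw * pue

it_load_mw = 1.5   # assumed IT load per Mod Pod unit, in MW
pue = 1.1          # HPE's stated upper bound for Mod Pod

total_mw = total_facility_power(it_load_mw, pue)
overhead_mw = total_mw - it_load_mw
print(f"Total facility power: {total_mw:.2f} MW")   # ~1.65 MW
print(f"Cooling and other overhead: {overhead_mw:.2f} MW")  # ~0.15 MW
```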
In addition to Mod Pod, HPE also announced several new features for Private Cloud AI, the flagship (and, thus far, only) product of Nvidia AI Computing by HPE, its partnership with the chipmaker.
Read at Cloud Pro