Inspur Announces AI Server Support for NVIDIA PCIe Tensor Core GPUs
HAMBURG, Germany, June 1, 2022 — Inspur Information, a leading IT infrastructure solutions provider, announced at ISC 2022 that it will be among the first to support the recently announced liquid-cooled NVIDIA A100 and H100 PCIe Tensor Core GPUs. As digital transformation and environmental sustainability become increasingly intertwined, Inspur is pioneering greener and more powerful liquid-cooled computing systems for enterprises and industries.
Inspur AI servers, including the NF5468M6 and NF5468A5, support up to eight liquid-cooled NVIDIA PCIe GPUs. Inspur was the first server vendor to offer eight liquid-cooled 500W HGX A100 GPUs, in its NF5488LA5 and NF5688LA5 servers, and is now also among the first to support liquid-cooled PCIe GPUs across its product portfolio.
Inspur's liquid-cooled AI servers combine strong AI and general-purpose computing capabilities and can be flexibly configured to user needs. They deliver high performance for AI and HPC workloads including image recognition, speech recognition, natural language processing, scientific research, and engineering simulation. Cold plates cover high-power components such as the GPUs and CPUs, and the servers adopt warm-water cooling, allowing the power usage effectiveness (PUE) of Inspur servers to reach as low as 1.1. This results in a substantial reduction in operating costs for cooling equipment.
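For context, PUE is the ratio of total facility power to IT equipment power:

    PUE = Total Facility Power / IT Equipment Power

so a PUE of 1.1 means that for every 1.0 kW consumed by the IT equipment itself, only about 0.1 kW goes to cooling and other facility overhead. The 1.1 figure is the vendor's stated value; actual results will vary with deployment conditions.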
“The rapid growth of data center computing power and power consumption highlights the importance of liquid cooling,” said Liu Jun, Vice President of Inspur Information and General Manager of AI and HPC. “Our liquid-cooled AI servers offer powerful and green solutions that make it easier to build next-generation intelligent data centers and hyperscale facilities with tremendous density and performance, while reducing energy costs and being more environmentally sustainable.”
“The growing demand for mainstream systems that can effectively run AI applications such as training and inference requires powerful GPUs,” said Paresh Kharya, Senior Director of Product Management for Accelerated Computing at NVIDIA. “Inspur’s systems powered by liquid-cooled NVIDIA A100 and H100 PCIe GPUs will enable customers to achieve higher performance on these workloads, while improving energy efficiency in the data center.”
Inspur Information is a leading AI server provider, with a rich portfolio of AI computing products, and works closely with its AI customers to help them achieve incredible performance improvements in AI applications including voice, semantics, image, video, and search processing.
About Inspur Information
Inspur Information is a leading provider of data center infrastructure, cloud computing, and AI solutions. It is the world’s second-largest server manufacturer. Through engineering and innovation, Inspur Information delivers cutting-edge computing hardware design and extensive product offerings to address important technology sectors such as open computing, cloud data centers, and AI. Performance-optimized and purpose-built, our world-class solutions empower customers to tackle specific workloads and real-world challenges. To learn more, visit https://www.inspursystems.com.
Source: Inspur Information