
CoreWeave Leads Cloud Market with Launch of NVIDIA H200 Tensor Core GPUs 

ROSELAND, N.J., Aug. 28, 2024 -- CoreWeave today announced that it is the first cloud provider to bring NVIDIA H200 Tensor Core GPUs to market. CoreWeave has a proven track record of being first to market with large-scale AI infrastructure, and was among the first to deliver a large-scale NVIDIA H100 Tensor Core GPU cluster interconnected with NVIDIA Quantum-2 InfiniBand networking, which broke MLPerf training records in June 2023. Today, CoreWeave's infrastructure services are used to train some of the largest and most ambitious models from customers including Cohere, Mistral, and NovelAI.

The NVIDIA H200 Tensor Core GPU is designed to push the boundaries of generative AI, providing 4.8 TB/s of memory bandwidth and 141 GB of GPU memory capacity that together help deliver up to 1.9X higher inference performance than H100 GPUs. CoreWeave's H200 instances combine NVIDIA H200 GPUs with fifth-generation Intel Xeon CPUs (Emerald Rapids) and 3,200 Gbps of NVIDIA Quantum-2 InfiniBand networking, and are deployed in clusters of up to 42,000 GPUs with accelerated storage solutions, delivering powerful performance and enabling customers to dramatically lower the time and cost of training their GenAI models.
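To put those figures in rough context, the sketch below estimates memory-bandwidth-bound decode throughput for a large language model. The 70 GB FP8 weight footprint and the H100 comparison bandwidth (about 3.35 TB/s for the SXM part) are illustrative assumptions, not figures from the release: bandwidth alone accounts for roughly a 1.4X gain, while the larger 141 GB capacity allows bigger batch sizes, which is where much of the up-to-1.9X inference improvement comes from.

    # Rough, bandwidth-bound decode estimate (an illustrative sketch, not a benchmark).
    # Assumes each generated token requires streaming the full weight set from HBM,
    # and ignores KV-cache traffic, batching, and interconnect effects.

    def tokens_per_second(bandwidth_tb_s: float, weight_gb: float) -> float:
        """Upper bound on single-stream decode rate when limited by HBM bandwidth."""
        return (bandwidth_tb_s * 1000.0) / weight_gb  # GB/s read rate / GB per token pass

    WEIGHTS_GB = 70.0  # assumed: ~70B parameters stored in FP8 (1 byte per parameter)

    h100 = tokens_per_second(3.35, WEIGHTS_GB)  # assumed H100 SXM HBM3 bandwidth
    h200 = tokens_per_second(4.80, WEIGHTS_GB)  # H200 HBM3e bandwidth cited in the release

    print(f"H100 ~{h100:.0f} tok/s, H200 ~{h200:.0f} tok/s, ratio ~{h200 / h100:.2f}x")
    # Prints a ~1.43x bandwidth-only ratio; larger batches enabled by the extra
    # memory capacity close the gap toward the quoted "up to 1.9X" figure.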

"CoreWeave is dedicated to pushing the boundaries of AI development and, through our long-standing collaboration with NVIDIA, is now first to market with high-performance, scalable, and resilient infrastructure with NVIDIA H200 GPUs," said Michael Intrator, CEO and co-founder of CoreWeave. "The combination of H200 GPUs with our technology empowers customers to tackle the most complex AI models with unprecedented efficiency, and to achieve new levels of performance."

CoreWeave's Mission Control platform offers customers unmatched reliability and resiliency by managing the complexities of AI infrastructure deployment and uptime with software automation. The platform helps customers train models faster and more efficiently by using advanced system validation processes, proactive fleet health-checking, and extensive monitoring capabilities. CoreWeave's rich suite of observability tools and services provides transparency across all the critical components of the system, empowering teams to maintain uninterrupted AI development pipelines. This translates to reduced system downtime, faster time to solution and lower total cost of ownership.

"CoreWeave has a proven track record of deploying NVIDIA technology rapidly and efficiently, ensuring that customers have the latest cutting-edge technology to train and run large language models for generative AI," said Ian Buck, vice president of Hyperscale and HPC at NVIDIA. "With NVLink and NVSwitch, as well as its increased memory capabilities, the H200 is designed to accelerate the most demanding AI tasks. When paired with the CoreWeave platform powered by Mission Control, the H200 provides customers with advanced AI infrastructure that will be the backbone of innovation across the industry."

In addition to bringing the latest NVIDIA GPUs to market and advancing its portfolio of cloud services, CoreWeave is rapidly scaling its data center operations to keep up with demand for its industry-leading infrastructure services. The company has completed nine new data center builds since the beginning of 2024 and has 11 more in progress. It expects to end the year with 28 data centers globally and plans an additional 10 in 2025.

About CoreWeave

CoreWeave, the AI Hyperscaler, delivers a cloud platform of cutting-edge software powering the next wave of AI. The company's technology provides enterprises and leading AI labs with the most performant and efficient cloud solutions for accelerated computing. Since 2017, CoreWeave has operated a growing footprint of data centers across every region of the US and Europe. CoreWeave was named one of the TIME100 most influential companies and featured on the Forbes Cloud 100 ranking in 2024. Learn more at www.coreweave.com.


Source: CoreWeave

AIwire