
Adaptive Computing to Integrate Intel HPC Distribution for Apache Hadoop 

Adaptive Computing today announced it is integrating its Moab/TORQUE workload management software with the Intel HPC Distribution for Apache Hadoop software, which combines the Intel Distribution for Apache Hadoop software with the Intel Enterprise Edition of Lustre software.

The integration marks a milestone in the big data ecosystem by enabling Hadoop and HPC workloads to run together on the same infrastructure, faster and more easily than when they are operated in isolation. The combined technologies from Intel and Adaptive improve both job launch and Hadoop processing speeds, increasing the efficiency and performance of big data workloads in HPC environments.

"The solution allows customers to leverage both their HPC and big data investments in a single platform, as opposed to operating them in siloed environments," said Michael Jackson, president and co-founder of Adaptive Computing. "The convergence between big data and HPC environments will only grow stronger as organizations demand data processing models capable of extracting the results required to make data-driven decisions."

"In our efforts to expand the usage of Hadoop for a variety of applications, we focused on the need for an efficient infrastructure that can run Hadoop workloads on HPC systems," said Girish Juneja, General Manager, Big Data Software at Intel. "Adaptive's capabilities around Moab/TORQUE naturally align with our efforts to expand the ecosystem."

The Intel HPC Distribution for Apache Hadoop software is an open source software platform for big data processing and storage, built from the hardware up to deliver industry-leading performance, multi-layered security, and enterprise-grade manageability. The solution addresses the growing adoption of big data analytics and HPC systems in enterprises as well as research institutions.

Enterprises as well as research labs are looking for a flexible software platform that allows big data analytics applications that are based on Apache Hadoop software to access data that is located on HPC storage systems. Just as importantly, organizations expect the same performance and manageability from Hadoop workloads that they get from orchestrating HPC workloads today.
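Neither company details the integration mechanics in this announcement, but the general pattern it describes, treating a Hadoop job as just another scheduler-managed workload, can be sketched as follows. The Python snippet below writes a TORQUE/PBS batch script around a stock Hadoop example job and submits it with qsub, so that Moab/TORQUE, rather than a standalone Hadoop cluster, decides when and where it runs. The resource requests, the "module load hadoop" step, and the /lustre paths are illustrative assumptions, not part of the announced product.

    import subprocess
    import textwrap

    # Batch script wrapping a standard Hadoop example job in TORQUE/PBS directives.
    job_script = textwrap.dedent("""\
        #!/bin/bash
        #PBS -N hadoop-wordcount
        #PBS -l nodes=4:ppn=16
        #PBS -l walltime=01:00:00
        #PBS -j oe

        # Resource values above are illustrative. This assumes the site provisions a
        # Hadoop runtime on the allocated nodes and that the /lustre paths below sit
        # on the Lustre-backed storage described in the article.
        module load hadoop   # site-specific; hypothetical module name
        hadoop jar "$HADOOP_HOME"/share/hadoop/mapreduce/hadoop-mapreduce-examples-*.jar \\
            wordcount /lustre/project/input /lustre/project/output
        """)

    with open("hadoop_job.pbs", "w") as f:
        f.write(job_script)

    # qsub is TORQUE's submission command; Moab then schedules this job alongside
    # conventional HPC workloads on the same cluster.
    subprocess.run(["qsub", "hadoop_job.pbs"], check=True)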
