
NCSA Industry Conference Recap – Part 2 


Industry Program Director Brendan McGinty welcomed guests to the annual National Center for Supercomputing Applications (NCSA) Industry Conference, held Oct. 9-11 in Urbana, Illinois. One hundred eighty attendees from 40 organizations registered for the invitation-only, two-day event.

Day two opened with a keynote address by NCSA Director William Gropp, who explained how advanced computation has changed over the years and where it’s heading. He acknowledged that specialization has always shaped how computers are designed, even though that influence was largely ignored for years.

“Many were preoccupied with Moore’s Law, which was always more of an imperative than a law,” he said. He noted that today’s Top500 list illustrates how specialization rules: the six most powerful systems on the list employ specialized processors that drive their architecture, software and algorithms.

NCSA supports democratized instrumentation and data initiatives that serve federated and inter-federated research communities, according to Gropp. Through these associations, its portfolio has expanded “in new and exciting ways.” For example, NCSA will serve as the global central hub for the Large Synoptic Survey Telescope (LSST) and will be responsible for processing, archiving and serving the 15 terabytes of raw image data collected each night of the 10-year survey. “LSST is expected to spawn a burgeoning telescope industry that will naturally look to NCSA to shape its future,” he said. NCSA also supports the Midwest Big Data Hub, NSF’s first investment in data sharing, as well as NSF’s third-generation compute-sharing project, the Extreme Science and Engineering Discovery Environment (XSEDE), led by Executive Director John Towns (UIUC Deputy CIO for Research IT).

William Gropp, Director, NCSA

With quantum capability expected within five years, NCSA is always experimenting with new platforms and can advise partners on when accelerators, such as GPUs or FPGAs, are recommended. “Cloud-enabled resources have changed the high-performance computing (HPC) landscape,” said Gropp.

He emphasized the importance of conducting a feasibility assessment when deciding whether to host cloud systems in-house. “Unless you have a lot of work that’s perfect for a dedicated cloud resource and can keep it busy, it makes sense to find a provider,” he said.

Following Gropp’s presentation on Thursday, Christopher Alix, President of Prairie City Computing, led a session titled, “Emerging Technologies in Artificial Intelligence (AI).”

UIUC Assistant Professor Chenhui Shao (Dept. of Mechanical Science and Engineering) described how improvements in communications, connectedness, computing, sensors, and AI have revolutionized manufacturing. “Smart manufacturing is addressing the challenges of data fidelity, real-time learning and dynamic decision-making across multiple levels,” Shao said. He added, “While supercomputing is essential for big data analytics, smaller facilities do not have access to sufficient computing capability; this is another way that NCSA can help.”

NCSA’s Eliu Huerta is a UIUC Computational Science and Engineering Fellow and leads the NCSA Gravity Group. His presentation was titled, “Frontiers at the Interface of Deep Learning (DL) and HPC.” Since earning a PhD in theoretical astrophysics from the University of Cambridge, Huerta has gained experience with large-scale computing, deep learning and machine learning, theory, and modeling.

Huerta presented a scientific visualization of the collision of two neutron stars that was first observed in gravitational waves by the Laser Interferometer Gravitational-Wave Observatory (LIGO) and Virgo detectors on August 17, 2017. “The DL algorithms NCSA has pioneered enable the observation of these spectacular events, both in gravitational waves and light, faster than real time. As NCSA continues to drive innovation at the interface of DL and HPC, we are laying the foundation to enable discovery at scale and in real-time when LSST and gravitational wave detectors perform simultaneous observations of these multi-messenger events,” he said.
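
For readers curious what such a classifier might look like in practice, the minimal sketch below is a generic 1D convolutional network for flagging a candidate signal in a noisy time-series segment. It is not NCSA’s actual model; the layer sizes, the 1,024-sample segment length and the two-class labeling are illustrative assumptions only.

    import torch
    import torch.nn as nn

    # Illustrative 1D convolutional classifier for noisy detector time series.
    # NOT NCSA's model; layer sizes and segment length are assumptions chosen
    # only to keep the example self-contained and runnable.
    class WaveformClassifier(nn.Module):
        def __init__(self):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv1d(1, 16, kernel_size=16), nn.ReLU(), nn.MaxPool1d(4),
                nn.Conv1d(16, 32, kernel_size=8), nn.ReLU(), nn.MaxPool1d(4),
                nn.AdaptiveAvgPool1d(1),  # collapse the time axis to one value per filter
            )
            self.classify = nn.Linear(32, 2)  # two classes: signal vs. noise

        def forward(self, x):
            # x has shape (batch, 1 channel, samples)
            h = self.features(x).squeeze(-1)
            return self.classify(h)

    model = WaveformClassifier()
    segments = torch.randn(8, 1, 1024)  # eight simulated noisy segments (synthetic data)
    logits = model(segments)            # shape (8, 2); larger "signal" logit = candidate event
    print(logits.shape)

Once trained on simulated waveforms, a small network like this labels a segment in a single forward pass, which is what makes faster-than-real-time searches of streaming detector data plausible.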

Technical Assistant Director and Research Professor of Mechanical Engineering Seid Koric is Blue Waters’ most prolific researcher. He has published more than 60 research papers, 40 of which pertain to research conducted on Blue Waters with industrial and academic collaborators from around the world. Among the many awards he has received are several HPCwire Readers’ and Editors’ Choice Awards, including Best Use of an HPC Application in Manufacturing.

Koric chaired the session titled, “Industry Application Domain Updates.” He explained that NCSA domain specialists work at the industry partner’s pace and under strict nondisclosure agreements. “We do what we can to ensure the work is delivered on time and under budget,” he said. NCSA offers specialization in visualization; modeling and simulation; bioinformatics and genomics; data analytics and AI; cyberinfrastructure and cybersecurity; code profiling and optimization; and more. “With more than 200 FTEs, there is always a broad range of specialists available,” said Koric. And it’s a diverse community of practice; he mentioned that among 40 active collaborations, eight native languages are represented.

NCSA Technical Program Manager Liudmila Mainzer

Technical Program Manager Liudmila Mainzer’s presentation focused on how NCSA works with industry partners to optimize workflows. She also showcased some of the computational genomics software programs that are available, but cautioned that the field is not yet “fully baked.” She said, “Methodologies are developing at a rapid rate; while it’s a challenge to keep up, we stay abreast of the resources that our partners need.”

Senior Database Architect Dora Cai described how her team develops time-saving solutions that translate to increased profits. One project reduced runtime from 3.5 hours to nine minutes using a parallelized random forest algorithm; another went from 175 days to four hours through the use of a parallelized simulation program. Cai said, “DL is more accurate than random forest (95 percent vs. 49 percent), but the layers required with DL add time.”
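
As a point of reference only (the project’s actual data and code were not disclosed), a parallelized random forest of the kind Cai described can be sketched in a few lines of Python; the dataset here is synthetic and the sizes are hypothetical.

    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier

    # Synthetic stand-in for a partner's tabular dataset (sizes are hypothetical).
    X, y = make_classification(n_samples=100_000, n_features=40, random_state=0)

    # n_jobs=-1 trains the individual trees across all available cores, which is
    # where the wall-clock speedup over a single-threaded run comes from;
    # the fitted model itself is unchanged.
    clf = RandomForestClassifier(n_estimators=200, n_jobs=-1, random_state=0)
    clf.fit(X, y)
    print(clf.score(X, y))

Because each tree in the forest is trained independently, the work parallelizes naturally across cores or nodes, while a deep network’s layers must be computed in sequence; that is the time trade-off Cai alluded to.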

Technical Program Manager Ahmed Taha described the achievements of NCSA’s modeling and simulation team. The team supports the new Chicago-based Digital Manufacturing and Design Innovation Institute (DMDII), an applied research institute that will develop a range of products for consumers and the U.S. Dept. of Defense. Taha said, “DMDII, which partners with the UIUC College of Engineering and NCSA, is supported by a $70 million DoD investment, and more than $250 million from industry, academia, government and community partners.”

Gropp and several presenters mentioned NCSA’s cloud-enabled resource, Clowder, an open-source data-management tool that supports long-tail data and can be hosted in the cloud, on the desktop, or via a custom instance built with the assistance of NCSA experts. NCSA’s Brown Dog tool is a science-driven, web-based data-wrangling service that facilitates the extraction and representation of metadata from a variety of domains.

Partnership Showcase Panel

A special panel explored industry highlights from both technical and business perspectives. Panel Moderator Andrew Jones (Vice President of HPC, Numerical Algorithms Group) asked audience members if they had questions, concerns or additional success stories to share.

One guest expressed concern that in-house technical staff might worry their jobs would be in jeopardy if their company sought an alliance with NCSA, “and that’s who would normally steer such leadership decisions.” Another raised the unintended consequences of AI, which led to a deeper discussion about the importance of interdisciplinary teams with representation from the social sciences, something NCSA and its partners have an established track record for.

Mark Brandyberry (Illinois Rocstar) said, “NCSA was an easy choice for us. Compute capability and data science evolve quicker than our in-house techs can train. It’s likely someone at NCSA knows how to solve whatever problem we’re currently struggling with.” To Brandyberry’s comment, another attendee replied, “Not everybody needs a supercomputer, though; some just want Python to perform better on their laptop, and NCSA experts can help with that, too.”
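
That closing remark is easy to picture with a generic example (not an NCSA recipe): much of the “make Python faster on a laptop” work amounts to moving an interpreted loop into a vectorized call, as in the sketch below. The array size is arbitrary and chosen only to make the difference visible.

    import time
    import numpy as np

    values = np.random.rand(5_000_000)

    # Pure-Python loop: the interpreter touches every element individually.
    start = time.perf_counter()
    total = 0.0
    for v in values:
        total += v * v
    print("loop:      ", time.perf_counter() - start, "seconds")

    # Vectorized NumPy: the same sum of squares runs in compiled code.
    start = time.perf_counter()
    total = float(np.dot(values, values))
    print("vectorized:", time.perf_counter() - start, "seconds")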

NCSA Industry Program History and Video Highlights

Several NCSA Industry science and engineering highlights are described in the program’s latest video, including recent accomplishments by Caterpillar, Boeing and Eli Lilly. Additionally, the NCSA team assisted with the following:

  • Allstate redesigned its claim-handling processes using knowledge-discovery and data-mining techniques mastered at NCSA.
  • Motorola worked with NCSA to simulate the conditions surrounding dropped calls; the project led to best practices that were officially adopted as a telecommunications industry standard.
  • Rolls-Royce allied with NCSA to profile, parallelize and modernize legacy codes used by transcontinental shipping clients; the work delivered a 30 percent reduction in runtime, which translates to more efficient supply chains and better-informed decisions about maintenance and scheduling.
  • FMC Corporation realized a 450 percent cost reduction for its client, FedEx, through the use of NCSA-produced modeling simulations.

You’ll find part one of the NCSA Industry Conference recap in HPCwire, including a summary of the keynote by Steven Demuth (Mayo Clinic Chief Technology Officer) and an introduction to the NCSA Viz Team.
