
PNNL: Collaborations Are Key to Shaping the Future of AI 

Nov. 13, 2024 -- Over the past few years, public adoption of artificial intelligence (AI) has increased rapidly with the development of AI chatbots. Because they are built to understand natural language, these large language model (LLM)-based programs offer easy-to-use interfaces. However, they are not without their flaws.

“For AI models to improve, they need training,” said Neeraj Kumar, chief data scientist at Pacific Northwest National Laboratory (PNNL). “Training state-of-the-art AI models today, especially large language models and foundation models, is incredibly computationally demanding and energy intensive.”

Kumar recently highlighted the challenges and opportunities in LLM development—including energy efficiency—at the AI Hardware & Edge AI Summit 2024 in San Jose, California. Alongside other global leaders in the fields of AI and hardware, Kumar delivered a keynote speech at a pre-conference event called the Efficient Generative AI Summit. He also participated in a panel discussion called “Emerging Architectures for Applications Using LLMs – The Transition to LLM Agents” during the main conference.

PNNL hosted a variety of AI-focused events with industry partners, including AWS GameDay, pictured here.
(Photo by Graham Bourque | Pacific Northwest National Laboratory)

“LLMs have the transformative potential to unlock new frontiers in computational power and energy efficiency,” said Kumar. “During the panel discussion, participants highlighted the rapid evolution of LLM technologies, their growing impact across industries, and the importance of collaboration between industry, academia, and national laboratories to advance these technologies in a robust and trustworthy manner.”

In recent highlights from TechArena, Kumar emphasized that addressing the computational demands and energy consumption of training large AI models requires a concerted, collaborative effort. “We need to develop more efficient algorithms and hardware solutions to make AI more sustainable,” he added.

PNNL researchers recognize the importance of these cross-sector alliances. The Center for AI @PNNL, where Kumar serves on the advisory board, regularly hosts events with industry partners to foster collaborations with PNNL researchers.

“Since its inception in December 2023, the Center has organized multiple large language model days with companies such as AWS, Azure, and NVIDIA, and is planning similar events with other industry leaders,” said Courtney Corley, director of the Center for AI @PNNL. “These events have proved to be a successful way to facilitate conversations and cultivate collaborations between PNNL staff and industry partners, and these engagements have resulted in collaborative research and planned future projects.”

To hear more about innovations in AI, listen to the “AI Innovation, Energy Efficiency & Future Trends” TechArena podcast in which Kumar discusses AI with Allyson Klein.


Source: Sarah Wong, PNNL
