December 1, 2023

Joining Forces for Trustworthy Artificial Intelligence

PNNL named one of the founding partners of the Trillion Parameter Consortium


Dozens of international partners joined the Trillion Parameter Consortium.

(Composite image by Nathan Johnson | Pacific Northwest National Laboratory)

Pacific Northwest National Laboratory (PNNL) joined dozens of international partners from national laboratories, research institutes, academia, and industry to form the Trillion Parameter Consortium (TPC). The newly established TPC aims to advance artificial intelligence (AI) through trustworthy and reliable models for scientific discovery.

PNNL’s efforts will be led by Neeraj Kumar, chief data scientist in the Advanced Computing, Mathematics, and Data division.

“National laboratories within the Department of Energy bring impressive computing power and expertise to the TPC,” said Kumar. “We can leverage DOE’s exascale systems to train AI models on scientific data.”

Though many generative AI models exist today—from chatbots to art generators—creating trustworthy models for science remains a greater challenge.

“To train AI models for science, we need high-quality, trusted data to start with,” said Kumar. “Using our domain expertise in fields such as climate science, biology, materials science, and molecular chemistry, PNNL researchers can develop an extensive data corpus and advance models in these areas.”

Several other PNNL researchers—including Po-Lun Ma, Andrew McNaughton, Sameera Horawalavithana, Gihan Panapitiya, and Anurag Acharya—are also involved in TPC efforts. Each brings a different area of expertise to the TPC project, from molecular biology to Earth system modeling to applying machine learning techniques to catalysis.

As part of the TPC, researchers will advance progress on scientific and engineering problems by sharing methods, approaches, tools, insights, and workflows with others from across the globe. They will also curate and prepare data from various scientific disciplines and design and evaluate models. Their endeavor will ultimately result in large language models trained with a trillion parameters, a size that only the very largest commercial AI systems have achieved to date.

“Such a lofty goal requires the collaboration of many researchers and many hours of computation time,” said Kumar. “Our researchers are proud to contribute our expertise as founding partners in this effort.”