
Annual Conference on Neural Information Processing Systems (NeurIPS) 2024

Join Pacific Northwest National Laboratory (PNNL) at the NeurIPS Conference in Vancouver, BC!

December 10–15, 2024

Vancouver, BC

Data scientists and engineers from Pacific Northwest National Laboratory (PNNL) will be at the Thirty-Eighth Conference on Neural Information Processing Systems (NeurIPS) to present talks, lead workshops, and participate in competitions.

The annual NeurIPS conference brings together researchers from many fields, including machine learning (ML), neuroscience, the life sciences, and statistics. Remarkable advances in ML and artificial intelligence (AI) have opened a new era of applications that touch many aspects of daily life. From situational awareness and threat detection to interpreting online signals that keep systems reliable, researchers at PNNL are at the forefront of scientific exploration and national security, harnessing AI to tackle complex scientific problems.

Featured PNNL Presentations

Saturday, December 14

Machine Learning meets Algebraic Combinatorics: A Suite of Datasets to Accelerate AI for Mathematics Research

Session: 8:15 a.m.–5:30 p.m. PST | Location: West Meeting Room 118-120

PNNL Contributors: Helen Jenne, Davis Brown, Jesse He, Mark Raugas, and Henry Kvinge

Summary: Presented during MATH-AI: The 4th Workshop on Mathematical Reasoning and AI, this talk will discuss how ML's impact on mathematics has grown, yet existing tools often fail to meet mathematicians' needs. To address this, researchers from PNNL and the University of Washington introduce the Algebraic Combinatorics Dataset Repository (ACD Repo), featuring datasets tied to classic and open problems. These benchmarks offer complexity and depth, guiding ML methods tailored for advanced mathematical research. Read more.
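
For readers curious what a benchmark workflow of this kind might look like in outline, here is a minimal, hypothetical sketch: synthetic integer vectors stand in for encoded combinatorial objects, and a baseline classifier predicts a label. The ACD Repo's actual data formats, tasks, and baselines may differ.

```python
# Hypothetical sketch of an ML-for-mathematics benchmark workflow: train a baseline
# classifier on feature vectors encoding combinatorial objects and report accuracy.
# The data here is synthetic; it is NOT the ACD Repo's format.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X = rng.integers(0, 5, size=(2000, 16))   # stand-in encodings of combinatorial objects
y = X.sum(axis=1) % 2                     # stand-in label (a parity-like invariant)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print("baseline accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```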

RHAAPsody: RHEED Heuristic Adaptive Automation Platform Framework for Molecular Beam Epitaxy Synthesis

Session: 8:15 a.m.–5:30 p.m. PST | Location: West Meeting Room 211-214

PNNL Speaker: Sarah Akers

Summary: Presented during the AI4Mat-2024: NeurIPS 2024 Workshop on AI for Accelerated Materials Design, this talk will present research on how molecular beam epitaxy (MBE) enables precise thin-film synthesis, but analyzing reflection high-energy electron diffraction (RHEED) patterns is complex. Researchers at PNNL developed an AI pipeline for real-time monitoring of MBE via RHEED image analysis, detecting meaningful changes with advanced time-series and graph-based methods. This approach offers a foundation for automated feedback control during film deposition. Read more.
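
As a loose illustration of the monitoring idea (not PNNL's actual pipeline), the sketch below reduces each diffraction frame to a scalar feature and flags abrupt shifts in that time series with a simple rolling z-score; the real system works on full RHEED images with more advanced time-series and graph-based methods.

```python
# Minimal change-detection sketch on synthetic "RHEED-like" frames.
import numpy as np

rng = np.random.default_rng(1)
frames = rng.normal(100.0, 2.0, size=(300, 64, 64))   # synthetic stand-in for RHEED frames
frames[180:] += 8.0                                    # simulated growth-mode transition

signal = frames.mean(axis=(1, 2))                      # per-frame scalar feature
window = 20
for t in range(window, len(signal)):
    past = signal[t - window:t]
    z = (signal[t] - past.mean()) / (past.std() + 1e-9)
    if abs(z) > 6.0:                                   # flag a statistically abrupt change
        print(f"possible transition detected at frame {t} (z={z:.1f})")
        break
```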

Towards Autonomous Nanomaterials Synthesis via Reaction-Diffusion Coupling

Session: 8:15 a.m.–5:30 p.m. PST | Location: West Meeting Room 211-214

PNNL Speakers: Andrew Ritchhart, Elias Nakouzi, and Maxim Ziatdinov

Summary: Presented during the AI4Mat-2024: NeurIPS 2024 Workshop on AI for Accelerated Materials Design, this work, performed under the AT SCALE initiative, explains how reaction-diffusion coupling enables precise nanomaterial synthesis with controlled chemical and structural gradients. Using an automated lab, researchers at PNNL studied copper hydroxide precipitation patterns, including Liesegang bands, characterized for machine learning analysis. The active learning system correlates reaction conditions to patterns and dynamically refines parameters, aiming for autonomous, target-driven pattern optimization. Read more.
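
To make the active-learning loop concrete, here is a hedged sketch of one common pattern: a surrogate model maps reaction conditions to a pattern metric, and the next experiment is chosen where the model is most uncertain. The objective function and condition ranges below are invented for illustration and do not represent the PNNL system.

```python
# Uncertainty-driven active learning with a Gaussian-process surrogate (illustrative only).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def pattern_metric(conditions):
    """Stand-in for a measured Liesegang-band spacing; not a real chemistry model."""
    return np.sin(3 * conditions[:, 0]) + 0.5 * conditions[:, 1]

rng = np.random.default_rng(2)
candidates = rng.uniform(0, 1, size=(200, 2))   # candidate (concentration-like, pH-like) settings
X = candidates[:5]                              # initial experiments
y = pattern_metric(X)

for step in range(10):
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.2)).fit(X, y)
    _, std = gp.predict(candidates, return_std=True)
    nxt = candidates[np.argmax(std)]            # run the most uncertain condition next
    X = np.vstack([X, nxt])
    y = np.append(y, pattern_metric(nxt[None, :]))

print("conditions explored:", len(X))
```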


Sunday, December 15

Thinking Fast and Laterally: Multi-Agentic Approach for Reasoning about Uncertain Emerging Events

Session: 8:15 a.m.–5:30 p.m. PST | Location: East Ballroom B

PNNL Contributors: Stefan Dernbach, Alejandro Michel, Khushbu Agarwal, Sutanay Choudhury

Summary: Presented during the System-2 Reasoning at Scale workshop, this talk explores System-2 reasoning in AI through lateral thinking, emphasizing anticipatory and causal reasoning under uncertainty. It introduces SALT, a multi-agent framework for handling complex queries in streaming data. Based on initial evaluations, SALT's dynamic communication structure shows promise in outperforming single-agent systems on intricate reasoning tasks. Contributors also include Christopher Brissette and Geetika Gupta from NVIDIA, who developed an optimized distributed in-memory vector database and scheduling algorithms to maximize streaming throughput. Read more.
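
The sketch below is a very loose illustration of the multi-agent idea in the summary, not the SALT framework itself: specialist "agents" each react to streaming events while a coordinator fans events out and merges their findings. The agent names and event format are invented.

```python
# Toy coordinator/agent fan-out over a stream of events (illustrative only).
from dataclasses import dataclass

@dataclass
class Event:
    text: str

def anticipatory_agent(event: Event) -> str:
    # Hypothetical agent that speculates about follow-on developments.
    return f"anticipate: possible follow-on to '{event.text}'"

def causal_agent(event: Event) -> str:
    # Hypothetical agent that proposes candidate causes.
    return f"causal: candidate causes of '{event.text}'"

AGENTS = [anticipatory_agent, causal_agent]

def coordinator(stream):
    for event in stream:
        findings = [agent(event) for agent in AGENTS]   # fan out, then merge
        yield {"event": event.text, "findings": findings}

for report in coordinator([Event("port closure reported"), Event("grid frequency dip")]):
    print(report)
```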

Emulating the Global Change Analysis Model with Deep Learning

Session: 8:15 a.m.–5:30 p.m. PST | Location: East Ballroom C

PNNL Contributors: Matt Jensen, Claudia Tebaldi, Abigail Snyder, Brian Hutchinson

Summary: This talk, presented during the Tackling Climate Change with Machine Learning workshop, will discuss the Global Change Analysis Model (GCAM), which explores human-Earth system interactions but is computationally expensive for large-scale uncertainty analyses. Researchers from PNNL developed a deep learning emulator trained on GCAM ensembles, achieving high predictive accuracy (median R² = 0.998). This efficient tool enables broader scenario discovery, enhancing insights into energy, land, and water system pathways.
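
For illustration only, the sketch below mirrors the general emulation workflow described above: fit a small neural network to input/output pairs from an expensive simulator and score it with R². The toy function stands in for GCAM and is not the real model or architecture.

```python
# Minimal neural-network emulator sketch (illustrative; not the GCAM emulator).
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(3)
X = rng.uniform(-1, 1, size=(5000, 8))                    # scenario input parameters (stand-in)
y = X[:, 0] ** 2 + np.sin(2 * X[:, 1]) + 0.1 * X[:, 2]    # stand-in simulator output

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
emulator = MLPRegressor(hidden_layer_sizes=(128, 128), max_iter=500, random_state=0)
emulator.fit(X_tr, y_tr)
print("emulator R^2:", r2_score(y_te, emulator.predict(X_te)))
```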

Featured PNNL Workshop

Sunday, December 15

Scientific Methods for Understanding Neural Networks: Discovering, Validating, and Falsifying Theories of Deep Learning with Experiments


Session: 8:15 a.m. PST | Location: West Meeting Room 205-207

PNNL Organizer: Davis Brown

Summary: This workshop explores deep learning through the scientific method, emphasizing empirical analysis to validate or challenge existing theories. By focusing on controlled experiments, it aims to uncover principles behind deep learning successes and failures. A secondary goal is fostering a collaborative community to advance both theoretical and practical understanding of deep networks. Learn more.


Careers at PNNL

If you're interested in working at PNNL, take a look at our open positions!