Unifying Combinatorial and Graphical Methods in Artificial Intelligence (AI)
PIs: Roberto Gioiosa, Stephen Young, Sinan Aksoy, Helen Jenne, Bill Kay, Jenna Pope, Madelyn Shapiro
Objective
- Develop efficient methods for using subject-matter-expert (SME)-informed graph neural networks (GNNs) to approximate the potentials of atomistic configurations by developing:
  - a novel inner product Laplacian framework to integrate SME knowledge into the GNN message-passing framework
  - COMET compiler approaches to leverage post-Moore architectures and problem sparsity
Overview
This project aims to deepen the understanding of graph neural networks (GNNs) by applying recent graph-theoretic results to improve their efficiency and scalability. By incorporating domain expertise and strategic approaches into GNNs, we strive to develop mathematical theory and computational capabilities that accelerate and optimize the training and performance of these networks.

Aim 1: Examine Fourier and harmonic analysis for graphs and other combinatorial objects by studying the inner product Laplacian, generalizing spectral graph theory techniques to the setting of inner product spaces. Our aim is to develop, in a principled manner, methods to incorporate non-combinatorial data into combinatorial analysis and to unify several existing Laplacians.
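To illustrate the idea of folding non-combinatorial data into a Laplacian, the following is a minimal sketch (not the project's actual construction) in which edge weights are derived from an inner product of node feature vectors, so that domain data enters the spectral object directly:

```python
import numpy as np

def inner_product_laplacian(edges, X):
    """Hypothetical sketch: a weighted graph Laplacian whose edge
    weights come from inner products of node feature vectors.

    edges : list of (u, v) index pairs
    X     : (n, d) array of node features; edge weight w_uv = <x_u, x_v>
    """
    n = X.shape[0]
    L = np.zeros((n, n))
    for u, v in edges:
        w = float(X[u] @ X[v])  # inner-product edge weight
        L[u, u] += w            # degree contribution
        L[v, v] += w
        L[u, v] -= w            # off-diagonal adjacency contribution
        L[v, u] -= w
    return L

# Tiny example: a path on 3 vertices with nonnegative features,
# so all inner-product weights are nonnegative and L is PSD.
X = np.array([[1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
L = inner_product_laplacian([(0, 1), (1, 2)], X)
```

When the inner products are nonnegative this reduces to an ordinary weighted combinatorial Laplacian (symmetric, zero row sums, positive semidefinite); the general framework studied in the project goes beyond this special case.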
Aim 2: Explore implementations of Fourier methods for GNNs within frameworks like PyTorch, leveraging the advanced computing capabilities of Pacific Northwest National Laboratory. Additionally, we will conduct in-depth scaling studies of novel techniques that address future Advanced Scientific Computing Research topics, including novel and converged architectures for data-centric computing as well as AI inference and training at the edge.
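The graph Fourier transform underlying spectral GNN layers can be sketched as follows; this is a generic textbook construction (eigendecompose the Laplacian, filter in the spectral domain), not the project's specific implementation:

```python
import numpy as np

def spectral_filter(L, x, g):
    """Apply a spectral graph filter g to a signal x: y = U g(Lambda) U^T x.

    Minimal sketch of the graph Fourier transform behind spectral
    GNN layers; assumes L is a symmetric positive-semidefinite Laplacian.
    """
    lam, U = np.linalg.eigh(L)   # eigenvectors U form the graph Fourier basis
    x_hat = U.T @ x              # forward graph Fourier transform
    y_hat = g(lam) * x_hat       # filter each frequency component
    return U @ y_hat             # inverse transform back to the vertex domain

# Example: heat-kernel (low-pass) filtering on a path graph
L = np.array([[1.0, -1.0, 0.0], [-1.0, 2.0, -1.0], [0.0, -1.0, 1.0]])
x = np.array([1.0, 0.0, -1.0])
y = spectral_filter(L, x, lambda lam: np.exp(-lam))
```

In practice, spectral GNNs avoid the full eigendecomposition (e.g. via polynomial filters), which is where sparsity-aware compilation and post-Moore hardware become relevant.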
Impact
- Limitations of current state-of-the-art include:
- neural network potential methods that rely on long training processes to “learn” important features of atom types
- GNN implementations that fail to leverage sparsity or post-Moore architectures
- Incorporating extant SME information and leveraging sparsity and post-Moore architectures both have the potential to significantly improve the quality and efficiency of training, enabling the use of GNNs for larger systems.
- A first step towards physics-informed GNNs, which synthesize scientific knowledge and machine learning frameworks.
Publications and Presentations
- Bilbrey J.A., S.J. Young, S.G. Aksoy, W.W. Kay, M.R. Shapiro, H. Lee, and R. Gioiosa, et al. 01/06/2024. "Enhanced Molecular Graph Embeddings With Inner Product Laplacians." Presented by J.A. Bilbrey at 2024 Joint Mathematics Meetings (JMM 2024), San Francisco, California.
- Young S.J., S.G. Aksoy, A.S. Bittner, B. Fang, R. Gioiosa, H. Jenne, and W.W. Kay, et al. 09/27/2023. "Reimagining Spectral Graph Theory." Presented by S.J. Young at ACO Alumni Speaker Series, Atlanta, Georgia.
- Young S.J., S.G. Aksoy, A.S. Bittner, B. Fang, R. Gioiosa, H. Jenne, and W.W. Kay, et al. 11/13/2023. "Reimagining Spectral Graph Theory." Presented by S.J. Young at Texas A&M Institute of Data Science Seminar, College Station, Texas.