September 18, 2024
Article

A Unified Mathematical Theory for Barren Plateaus

A formula explains challenges in quantum machine learning


Researchers have proposed a mathematical theory to address the challenge of barren plateaus in quantum machine learning. 

(Image by Melanie Hess | Pacific Northwest National Laboratory)

The term “barren plateaus” suggests an empty, deserted, and flat landscape. In quantum computing, it describes a phenomenon in which a class of tunable quantum models presents a vast, flat optimization landscape, offering almost no signal to guide training toward a good model for a given problem. The barren plateau phenomenon is one of the main challenges in the field of quantum machine learning, and it places a tremendous limitation on the scalability of quantum models. In a recent publication in Nature Communications, researchers from Pacific Northwest National Laboratory (PNNL), North Carolina State University, and Los Alamos National Laboratory present a mathematical theory to address the challenge. 

“Our goal was to mathematically formalize some of the common problems we see in quantum machine learning. Specifically, we looked at barren plateaus, which are commonly observed when you’re optimizing or training quantum models,” said Carlos Ortiz Marrero, data scientist at PNNL.  

Variational quantum computing schemes have received considerable attention due to their high versatility and ability to be executed on noisy intermediate-scale quantum devices. However, despite their promise, these algorithms can exhibit barren plateaus. In a pivotal shift, Ortiz Marrero and team, including former PNNL intern Michael Ragone, stumbled upon a unifying equation that explains the emergence of barren plateaus.

In their paper, “A Lie Algebraic Theory of Barren Plateaus for Deep Parameterized Quantum Circuits,” they used the theory of Lie algebras to derive an exact expression for the variance of a quantum model’s loss function. From this expression, a user can explain how the variance decays exponentially as a quantum model scales—the signature of a barren plateau—and attribute that decay to noise, entanglement, and complex model architecture.
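The exponential decay of the loss variance can be illustrated numerically without the paper’s Lie-algebraic machinery. The sketch below makes a standard simplifying assumption from barren-plateau analyses: a sufficiently deep, expressive circuit is modeled as producing Haar-random states. Under that assumption, the variance of a simple loss—the expectation of Pauli-Z on one qubit—shrinks like 1/(2^n + 1) as the number of qubits n grows, flattening the training landscape. (This is an illustrative stand-in, not the exact formula derived in the paper.)

```python
import numpy as np

rng = np.random.default_rng(0)

def haar_state(dim, rng):
    """Sample a Haar-random pure state: normalized complex Gaussian vector."""
    v = rng.normal(size=dim) + 1j * rng.normal(size=dim)
    return v / np.linalg.norm(v)

def z0_expectation(psi, n_qubits):
    """<psi| Z x I x ... x I |psi>: +1 where the first qubit is 0, -1 where it is 1."""
    dim = 2 ** n_qubits
    # First qubit is the most significant bit of the basis-state index.
    signs = np.where(np.arange(dim) < dim // 2, 1.0, -1.0)
    return float(np.sum(signs * np.abs(psi) ** 2))

def loss_variance(n_qubits, n_samples=2000, rng=rng):
    """Empirical variance of the loss over random states (random 'trained' models)."""
    dim = 2 ** n_qubits
    vals = [z0_expectation(haar_state(dim, rng), n_qubits) for _ in range(n_samples)]
    return float(np.var(vals))

for n in (2, 4, 6, 8, 10):
    # Analytically, the variance is 1/(2**n + 1): exponentially small in n.
    print(f"n = {n:2d} qubits  Var[loss] = {loss_variance(n):.5f}")
```

Running this shows the variance dropping by roughly a factor of four per added qubit pair: the more qubits, the flatter the landscape, which is why gradient-based training stalls at scale.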

Figure summarizing a theorem for understanding barren plateaus in quantum machine learning
A team of researchers from Pacific Northwest National Laboratory, North Carolina State University, and Los Alamos National Laboratory devised a mathematical theory to characterize the barren plateaus encountered in quantum machine learning. (Figure by Martin Larocca and Marco Cerezo | Los Alamos National Laboratory)

This work builds on a previous publication in which Ortiz Marrero, Nathan Wiebe, a joint appointee between University of Toronto and PNNL, and Mária Kieferová of the University of Technology Sydney discovered the emergence of barren plateaus due to too much entanglement in a quantum model. 

This work was funded by the Mathematics for Artificial Reasoning in Science initiative led by Mark Raugas: https://www.pnnl.gov/projects/mars.


Ragone, M., B. N. Bakalov, F. Sauvage, A. F. Kemper, C. Ortiz Marrero, M. Larocca, and M. Cerezo. 2024. “A Lie Algebraic Theory of Barren Plateaus for Deep Parameterized Quantum Circuits.” Nature Communications 15 (1): 7172. https://doi.org/10.1038/s41467-024-49909-3.