Technology Overview
Whether in economics, energy, climate, homeland security, or other areas, analysts use models to predict the future. But which model will reliably produce the most accurate results? The inherent uncertainty in choosing the right set of interpretations, processes, and mathematical systems is the greatest source of error and risk in modeling and forecasting.
PNNL has developed a powerful statistical framework that addresses this uncertainty through the aggregate prediction of a model ensemble. These ensembles comprise individual models (algorithms, mathematical calculations, simulations, expert opinion, etc.). Each model uniquely explores a portion of a hypothesis by defining a set of processes, systems, and relationships that describe the area of interest. For example, a model that calculates residential energy demand could be paired with usage statistics from the last ten years and industrial trends to predict overall energy use. The framework was piloted in a similar scenario and reduced forecast errors by 30% to 55%.
PNNL’s framework relies on techniques such as bootstrap aggregating, boosting, and Bayesian model averaging to improve predictions. Because the aggregate is derived from a weighted combination of all ensemble members, its results exhibit less bias than any individual model.
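To illustrate the weighted-combination idea, the minimal sketch below blends several models' forecasts, weighting each model by the inverse of its recent error, in the spirit of Bayesian model averaging. The function and variable names (`combine_forecasts`, `past_errors`) and the sample numbers are hypothetical illustrations, not part of PNNL's framework.

```python
import numpy as np

def combine_forecasts(predictions, past_errors):
    """Combine individual model forecasts into one aggregate forecast.

    predictions : array of shape (n_models,), each model's forecast
    past_errors : array of shape (n_models,), each model's recent
                  mean squared error on held-out data

    Weights are proportional to 1 / MSE, so models that have been more
    accurate recently contribute more to the aggregate.
    """
    weights = 1.0 / np.asarray(past_errors, dtype=float)
    weights /= weights.sum()                     # normalize to sum to 1
    return float(np.dot(weights, predictions))   # weighted combination

# Hypothetical example: three demand models and their recent errors
model_forecasts = np.array([412.0, 398.5, 405.2])   # e.g., MWh next day
recent_mse      = np.array([9.0, 4.0, 6.5])
print(combine_forecasts(model_forecasts, recent_mse))
```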
PNNL’s statistical framework guides an iterative process to design, assemble, evaluate, and optimize the collection of models. The aggregate can adaptively combine the strengths of different models, continuously and in real time, to address a variety of scenarios. The approach has been used successfully to forecast electric industry use but could be applied to a wide variety of other disciplines and domains.
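One common way such continuous, real-time adaptation can be implemented (not necessarily PNNL's method) is an online multiplicative-weights update: after each new observation, every model's weight is scaled down in proportion to its error, so the aggregate keeps shifting toward whichever models are currently most accurate. The sketch below, with hypothetical names and data, illustrates the idea.

```python
import numpy as np

def update_weights(weights, predictions, observed, eta=0.5):
    """One step of a multiplicative-weights ('Hedge') style update.

    Each model's weight is reduced in proportion to its squared error
    on the latest observation; repeating this step as data streams in
    gives a continuously adapting ensemble.
    """
    losses = (np.asarray(predictions) - observed) ** 2
    losses = losses / (losses.max() + 1e-12)     # scale losses to [0, 1]
    new_w = weights * np.exp(-eta * losses)      # penalize recent misses
    return new_w / new_w.sum()                   # renormalize

# Hypothetical stream: start with equal weights, update as data arrives
weights = np.full(3, 1.0 / 3.0)
stream = [(np.array([410.0, 402.0, 399.0]), 400.0),
          (np.array([395.0, 401.0, 404.0]), 403.0)]
for preds, actual in stream:
    aggregate = float(np.dot(weights, preds))    # current ensemble forecast
    weights = update_weights(weights, preds, actual)
    print(round(aggregate, 1), np.round(weights, 3))
```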
Advantages
- Leads to more accurate estimates for prediction and forecasting
- Is extremely adaptable across domains and scenarios
- Operates continuously and in real time