September 10, 2013
Journal Article

A Case for Soft Error Detection and Correction in Computational Chemistry

Abstract

High performance computing platforms are expected to deliver 10^18 floating-point operations per second by the year 2022 through the deployment of millions of cores. Even if every core is highly reliable, the sheer number of them means that the mean time between failures will become so short that most application runs will suffer at least one fault. In particular, soft errors caused by intermittent incorrect behavior of the hardware are a concern, as they lead to silent data corruption. In this paper we investigate the impact of soft errors on optimization algorithms, using Hartree-Fock as a particular example. Optimization algorithms iteratively reduce the error in an initial guess to reach the intended solution. Therefore, they may intuitively appear to be resilient to soft errors. Our results show that this is true for soft errors of small magnitudes but not for large ones. We suggest error detection and correction mechanisms for different classes of data structures. The results obtained with these mechanisms indicate that more than 95% of the soft errors can be corrected at a moderate increase in computational cost.
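The abstract's central observation, that iterative solvers damp small perturbations but can be derailed by large ones, can be illustrated with a minimal Python sketch (not from the paper; the Newton iteration and injected faults below are illustrative stand-ins for an SCF-style optimization and real hardware bit flips):

```python
import math

def newton_sqrt(a, x0, corrupt_at=None, corruption=None, iters=60):
    """Newton iteration x <- (x + a/x)/2 for sqrt(a).

    Optionally injects a one-time 'soft error' that overwrites the
    iterate at step corrupt_at, simulating silent data corruption.
    """
    x = x0
    for k in range(iters):
        if k == corrupt_at and corruption is not None:
            x = corruption(x)  # hypothetical fault-injection hook
        x = 0.5 * (x + a / x)
    return x

# No fault: converges to sqrt(2).
clean = newton_sqrt(2.0, 1.0)

# Small-magnitude fault (e.g. low-order mantissa bits flipped):
# subsequent iterations damp the error out and recover the answer.
small = newton_sqrt(2.0, 1.0, corrupt_at=10,
                    corruption=lambda x: x + 1e-3)

# Large-magnitude fault (e.g. a sign-bit flip): the iteration still
# converges, but to the wrong root, -sqrt(2) -- a silent corruption
# of the final result.
large = newton_sqrt(2.0, 1.0, corrupt_at=10,
                    corruption=lambda x: -x)
```

The small fault is indistinguishable from ordinary iteration error and is absorbed; the large fault moves the iterate into a different basin of attraction, which is why detection and correction mechanisms are needed for the latter class.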

Revised: September 27, 2013 | Published: September 10, 2013

Citation

van Dam, H.J., A. Vishnu, and W.A. De Jong. 2013. "A Case for Soft Error Detection and Correction in Computational Chemistry." Journal of Chemical Theory and Computation 9, no. 9: 3995-4005. PNNL-SA-96159. doi:10.1021/ct400489c