Science laboratories and sophisticated simulations are producing data of increasing volume and complexity, posing significant challenges to current data infrastructures as terabytes to petabytes of data must be processed and analyzed. Traditional computing platforms, originally designed to support model-driven applications, cannot meet the demands of data-intensive scientific applications. Pacific Northwest National Laboratory (PNNL) research goes beyond “traditional supercomputing” applications to address emerging problems that require scalable, real-time solutions. The outcome is new, unconventional architectures for data-intensive applications, specifically designed to process the deluge of scientific data, including FPGAs, multithreaded architectures, and IBM's Cell processor.
Revised: October 24, 2007
Published: June 15, 2007
Citation
Nieplocha J., A. Marquez, F. Petrini, and D. Chavarría-Miranda. 2007. "Unconventional Architectures for High-Throughput Sciences." SciDAC Review. PNNL-SA-55797.