Scientists commonly describe their data processing systems metaphorically as software pipelines. These pipelines ingest data from one or more sources and apply a sequence of processing steps that transform the data and produce useful results. While conceptually simple, pipelines often adopt complex topologies and must meet stringent quality-of-service requirements, which place stress on the software infrastructure used to construct them. In this paper we describe the MeDICi Integration Framework, a component-based framework for constructing complex software pipelines. The framework supports composing pipelines from distributed, heterogeneous software components and provides mechanisms for controlling qualities of service to meet demanding performance, reliability, and communication requirements.
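To make the pipeline metaphor concrete, the following is a minimal, hypothetical sketch of the idea the abstract describes: a pipeline composed from a sequence of processing components, each transforming the output of the previous one. The function and component names here are illustrative only and do not reflect MeDICi's actual API.

```python
# Hypothetical sketch (not MeDICi's API): a pipeline as a sequence of
# processing components applied in order to each input record.

def build_pipeline(*steps):
    """Compose processing steps into a single callable pipeline."""
    def pipeline(record):
        for step in steps:
            record = step(record)
        return record
    return pipeline

# Illustrative components: parse raw text, annotate, then summarize.
def parse(raw):
    return {"values": [float(v) for v in raw.split(",")]}

def annotate(record):
    record["n"] = len(record["values"])
    return record

def summarize(record):
    record["mean"] = sum(record["values"]) / record["n"]
    return record

process = build_pipeline(parse, annotate, summarize)
result = process("1.0,2.0,3.0")
```

A real framework such as MeDICi additionally handles distribution, heterogeneous component technologies, and quality-of-service control, which this toy composition omits entirely.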
Revised: May 4, 2011
Published: February 24, 2011
Citation
Gorton, I., A.S. Wynne, Y. Liu, and J. Yin. 2011. Components in the Pipeline. IEEE Software 28, no. 3: 34-40. PNNL-SA-77807. doi:10.1109/MS.2011.23