April 15, 2017
Conference Paper

Empirical Analysis of the Subjective Impressions and Objective Measures of Domain Scientists’ Analytical Judgment Using Visualizations

Abstract

Scientists working in a particular domain often adhere to conventional data analysis and presentation methods, and this leads to familiarity with these methods over time. But does high familiarity always lead to better analytical judgment? This question is especially relevant when visualizations are used in scientific tasks, as there can be discrepancies between visualization best practices and domain conventions. However, there is little empirical evidence of the relationships between scientists’ subjective impressions about familiar and unfamiliar visualizations and objective measures of their effect on scientific judgment. To address this gap and to study these factors, we focus on the climate science domain, specifically on visualizations used for comparison of model performance. We present a comprehensive user study with 47 climate scientists where we explored the following factors: i) relationships between scientists’ familiarity, their perceived levels of comfort, confidence, accuracy, and objective measures of accuracy, and ii) relationships among domain experience, visualization familiarity, and post-study preference.

Revised: June 2, 2017 | Published: April 15, 2017

Citation

Dasgupta A., S.M. Burrows, K. Han, and P.J. Rasch. 2017. Empirical Analysis of the Subjective Impressions and Objective Measures of Domain Scientists’ Analytical Judgment Using Visualizations. In Proceedings of the CHI Conference on Human Factors in Computing Systems (CHI 2017), May 6-11, 2017, Denver, Colorado, 1193-1204. New York, New York: ACM. PNNL-SA-121199. doi:10.1145/3025453.3025882