Technical Session 5
Environmental Data Management, Analysis, and Visualization
Wednesday, November 15, 2023 | 1:00 - 5:00 p.m. Pacific Time
Environmental restoration work requires management and integration of multiple types and formats of data from multiple authoritative sources (that may change over time), data analysis and interpretation, and effective data visualization. These aspects, as well as ensuring that environmental data is findable, accessible, interoperable, and reusable (FAIR), are important to support effective communication with site managers, regulators, and stakeholders; technically defensible remediation decision-making; and site operations. This session will focus on tools and approaches for management, access, analysis, and visualization of environmental data. Such data include traditional monitoring data, geophysical data, remote sensing data, near-real-time monitoring data, spatiotemporal data, and numerical model data. Topics will include data-driven analytics, development and application of software, 2D/3D data visualization, risk assessment, and evaluation of remedy performance.
Session Organizers: Christian D. Johnson, Pacific Northwest National Laboratory; Delphine Appriou, Pacific Northwest National Laboratory
1:00 - 1:05 p.m. | Opening Remarks
1:05 - 1:25 p.m. | Evaluating an Existing Data Management System | Samantha Bennett, ddms, Inc. ► PRESENTATION PDF
Environmental data management systems often grow organically as data needs evolve and additional use cases are realized. It is common for complex organizations with multiple systems to unknowingly develop redundant or inefficient workflows, many without robust documentation. These redundancies and inefficiencies can unnecessarily increase the cost and complexity of the organization's overarching data system. A critical function of the U.S. DOE Office of Legacy Management (LM) is to protect human health and the environment through effective and efficient long-term surveillance and maintenance of over 100 post-cleanup sites in the United States. To fulfill this critical function, LM manages environmental data for hundreds of sites and continues to acquire legacy data from unique site databases as additional sites transfer into LM's care. LM has limited staff to transition site data into its environmental data management system, which increases the need for standardization of, and strong documentation for, both the incoming data and the data already in the system. This presentation explores best practices for (1) documenting and evaluating current environmental data workflows and (2) documenting suggested areas of improvement to those workflows. The team participated in a guided SWOT (Strengths, Weaknesses, Opportunities, and Threats) analysis and resource evaluation using documented procedures, interviews, and workshops. They also documented current-state workflows using a series of diagrams and tables, identifying ways to potentially improve the quality, integrity, and accessibility of data in LM's environmental data management system. In all, twenty-two existing workflows and process areas were documented and evaluated for potential improvements. LM has a robust and functional environmental data management system. In general, its data workflows are thorough, effective, and supported by a very knowledgeable environmental data management team. However, the assessment still identified several areas for improvement. Coauthors: Ellen Tomlinson (RSI EnTech, LLC, Contractor to the U.S. Department of Energy Office of Legacy Management), Annette Moore (U.S. Department of Energy Office of Legacy Management), Siobhan Kitchen (ddms, Inc.)
1:25 - 1:45 p.m. | Statistical Methods for Subsurface Decommissioning | Jennifer Huckett, Pacific Northwest National Laboratory ► PRESENTATION PDF
Visual Sample Plan (VSP) software includes tools that implement the guidance outlined in the Multi-Agency Radiation Survey and Site Investigation Manual (MARSSIM) (NUREG-1575) for data collection and statistical analysis to demonstrate compliance in the final status survey of decommissioning projects. Users can apply VSP to potentially contaminated surface soil as well as to areas where unexploded ordnance may exist. New modules and capabilities are continuously added based on users' needs, including current research and development of tools to demonstrate that suspected contaminants in subsurface soil at a site are in compliance with release criteria for radiation dose- or risk-based regulations. As in surface soil cases, primary survey goals for the subsurface include reaching conclusions via hypothesis testing about whether decommissioning efforts led to sufficiently reduced radiation levels. Sufficient data and statistical modeling at this stage must also allow the principal responsible party (and regulator(s)) to identify areas with potential residual contamination for further investigation. Coauthors: Debbie Fagan, Lisa Newburn (Pacific Northwest National Laboratory)
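To make the statistical framing concrete, below is a minimal sketch of the kind of MARSSIM-style Sign test that VSP automates for a final status survey. This is not VSP's implementation; the DCGL value and measurements are hypothetical.

```python
# Minimal sketch of a MARSSIM-style Sign test for a final status survey.
# Illustrative only: VSP implements this guidance with additional design
# and power checks; the DCGL and measurements below are hypothetical.
from scipy.stats import binomtest

dcgl = 5.0  # hypothetical derived concentration guideline level (pCi/g)
measurements = [3.1, 4.2, 2.8, 5.6, 3.9, 4.4, 3.3, 4.9, 2.5, 4.1]

# H0: residual radioactivity exceeds the DCGL. Release the survey unit only
# if significantly more than half of the measurements fall below the DCGL.
below = sum(1 for x in measurements if x < dcgl)
result = binomtest(below, n=len(measurements), p=0.5, alternative="greater")
print(f"{below}/{len(measurements)} below DCGL, p = {result.pvalue:.3f}")
if result.pvalue < 0.05:
    print("Reject H0: the survey unit meets the release criterion.")
else:
    print("Cannot reject H0: further investigation is needed.")
```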
1:45 - 2:05 p.m. | Leveraging Data-Driven Decision Making in a Large-Scale Field Study Simulating a Biological Agent Incident | Michael Pirhalla, United States Environmental Protection Agency ► PRESENTATION PDF
The Analysis for Coastal Operational Resiliency (AnCOR) program is an interagency effort involving the U.S. Environmental Protection Agency (EPA), the Department of Homeland Security Science and Technology Directorate (DHS S&T), and the United States Coast Guard (USCG). This presentation will discuss events from the Wide Area Demonstration (WAD) field study of AnCOR. The WAD was conducted in May 2022 and operationally tested and evaluated options for decontamination, sampling, data management, and waste management for USCG assets and outdoor areas impacted by a widespread biological agent release. Over the course of four weeks, nearly 900 samples were collected using various surface and grab sampling methods to determine background, pre-, and post-decontamination levels. One component of designing and executing a successful sampling plan is having an effective suite of data management tools and systems. Novel techniques and strategies for data management were applied during the WAD, including detailing the roles, processes, and technologies for data acquisition, data management for sample collection, and visualization of results. Dressed in full personal protective equipment (PPE), three-person sampling teams entered the contaminated area with designated sampler, collector, and data management roles. The data manager was equipped with a sub-meter GPS unit and a tablet connected to Esri Field Maps, which the team used to navigate to the sample collection locations. The technology allowed each sample to be precisely located and information describing the sample to be documented and uploaded to EPA's GeoPlatform, which is hosted on ArcGIS Online. Staff in the command center monitored real-time sample collection through a dashboard and communicated with the sampling teams via radio. Data management technologies also assisted with post-collection sample shipment and chain-of-custody generation. These tools and techniques are poised to enhance the efficiency of data management in response to a large-scale incident, integrating stages from field data collection to analysis and subsequent reporting. Coauthors: Timothy Boe (United States Environmental Protection Agency, Office of Research and Development), Erin Silvestri (United States Environmental Protection Agency, Office of Research and Development), Shannon Serre (United States Environmental Protection Agency, Office of Land and Emergency Management), Jordan Deagan (Oak Ridge Associated Universities Student Services Contractor to United States Environmental Protection Agency), Matt Blaser (United States Environmental Protection Agency, Region 5)
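As an illustration of the data capture described above, the sketch below models a structured field-sample record with location and chain-of-custody fields. The schema is invented for illustration and is not the actual Esri Field Maps or EPA GeoPlatform data model.

```python
# Hypothetical field-sample record like those logged on the data manager's
# tablet; field names are illustrative, not the actual WAD schema.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class FieldSample:
    sample_id: str
    method: str       # e.g., surface wipe or grab sample
    phase: str        # background, pre-decontamination, or post-decontamination
    latitude: float   # from the sub-meter GPS unit
    longitude: float
    team: str
    collected_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    custody_log: list[str] = field(default_factory=list)

    def transfer(self, recipient: str) -> None:
        """Append a chain-of-custody entry when the sample changes hands."""
        stamp = datetime.now(timezone.utc).isoformat()
        self.custody_log.append(f"{stamp} -> {recipient}")

sample = FieldSample("WAD-0417", "surface wipe", "post-decontamination",
                     29.95, -81.69, "Team 2")
sample.transfer("sample shipment courier")
```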
2:05 - 2:25 p.m. | An Extensive Industrial Hygiene Data Analysis and Visualization (IDAV) Toolset | Hongfei Hou, Pacific Northwest National Laboratory ► PRESENTATION PDF
Industrial Hygiene (IH) monitoring and analysis play a crucial role in safeguarding the health and safety of workers within the workplace. Advanced data analysis and visualization tools are highly desirable for making efficient use of the large quantities of IH monitoring data. However, state-of-the-art applications are constrained by limits on the efficiency of analyzing and visualizing large datasets. The Industrial Hygiene Data Analysis and Visualization (IDAV) tool is an integrated suite of scientifically grounded tools designed to address these limitations. IDAV grants access to multiple Hanford Site databases and provides analytical tools that allow IH management and staff to comprehensively evaluate potential hazards and devise strategies for worker protection at hazardous waste sites. Sponsored by Washington River Protection Solutions, IDAV is a cloud-based application that conditions and renders data obtained from various IH surveys, including both tank vapor and non-vapor data. It encompasses readings from direct-reading instruments, such as stationary devices and wearable devices utilized by workers, as well as traditional air sampling methods. Utilizing continuously measured data, such as ammonia levels in worker breathing zones, in concert with sample-based data such as volatile organic compounds and mercury, IDAV provides rapid, high-quality assessment of exposure hazards and task risk evaluation. Extending industrial hygienists' data analysis capabilities, IDAV integrates seamlessly with Tableau software, enabling the creation of advanced charts and custom visualizations. Additionally, the IDAV tool suite utilizes Representational State Transfer (REST) APIs for data processing and transmission, reducing response times in the user interface. To maintain data privacy and security, IDAV incorporates OneID authentication. IDAV has been successfully deployed at the Hanford Site, which stores approximately 56 million gallons of high-level radioactive and chemical waste, and has demonstrated remarkable efficacy in managing risk to workers from the approximately 1,800 different chemicals present in Hanford's tank waste. Coauthors: Sadie Montgomery (Pacific Northwest National Laboratory), Scott Upton (Pacific Northwest National Laboratory), Brett Simpson (Pacific Northwest National Laboratory), Scott Clingenpeel (Washington River Protection Solutions), Jason Reno (Washington River Protection Solutions), Eugene Morrey (Washington River Protection Solutions)
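As a concrete example of one calculation such a toolset automates, the sketch below computes a time-weighted average (TWA) from continuous breathing-zone readings and compares it against an occupational exposure limit. The readings and the limit are hypothetical, and this is not IDAV code.

```python
# Minimal sketch of a time-weighted average (TWA) exposure check from
# continuous readings; the values and the limit are hypothetical.
def twa(readings_ppm: list[float], interval_min: float) -> float:
    """Time-weighted average concentration over the monitored period (ppm)."""
    total_minutes = len(readings_ppm) * interval_min
    return sum(c * interval_min for c in readings_ppm) / total_minutes

ammonia = [0.4, 0.7, 1.2, 3.5, 2.1, 0.9, 0.6, 0.5]  # hourly readings (ppm)
oel = 25.0                                           # example TWA limit (ppm)
avg = twa(ammonia, interval_min=60)
print(f"TWA {avg:.2f} ppm: {'exceeds' if avg > oel else 'within'} the limit")
```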
2:25 - 2:45 p.m. | Open Discussion
2:45 - 3:15 p.m. | Posters and Vendor Exhibit
3:15 - 3:35 p.m. | An Open-Source Information Model and Associated Cyberinfrastructure for Effective Environmental Management and Analytics | Roelof Versteeg, Subsurface Insights ► PRESENTATION PDF
Effective environmental restoration requires the use of heterogeneous data (in situ sensor data, time-lapse electrical geophysical data, remote sensing data, and sampling data) as well as the auditable and repeatable application of analytics and numerical modeling using these data. We will discuss the use of ODMX, an open-source (odmx.org) data management solution for heterogeneous environmental data, and the integration of ODMX with containerized pipelines for data analytics and numerical modeling. ODMX is currently used both for the LBNL SFA and for the SRNL Altemis project. ODMX uses automated pipelines to harvest data from a range of standard sources into a database that can then be accessed through a robust API. Once the database is populated, its data can be discovered and used by containerized numerical models, which can be run periodically and whose results can be visualized interactively or automatically. This provides a robust, automated, and auditable framework that supports environmental restoration and monitoring. Coauthors: Rebecca Rubinstein (Subsurface Insights), Reza Soltanian (University of Cincinnati), Doug Johnson (Subsurface Insights)
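The sketch below illustrates the general harvest-and-store pattern described above: pull new sensor readings from a source endpoint and upsert them into a local database. The endpoint, record shape, and table are hypothetical; the actual ODMX information model is documented at odmx.org.

```python
# Generic harvest-and-store sketch in the spirit of ODMX's automated
# pipelines; the endpoint, record shape, and table are hypothetical.
import json
import sqlite3
import urllib.request

db = sqlite3.connect("harvest.db")
db.execute("""CREATE TABLE IF NOT EXISTS readings (
    station TEXT, variable TEXT, utc_time TEXT, value REAL,
    PRIMARY KEY (station, variable, utc_time))""")

def harvest(url: str) -> int:
    """Fetch a JSON list of {station, variable, utc_time, value} records
    and upsert them, so repeated runs are idempotent."""
    with urllib.request.urlopen(url) as resp:
        records = json.load(resp)
    db.executemany(
        "INSERT OR REPLACE INTO readings "
        "VALUES (:station, :variable, :utc_time, :value)", records)
    db.commit()
    return len(records)

# A scheduler (cron, a container entrypoint, etc.) would call this periodically:
# n = harvest("https://example.org/api/latest-readings")
```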
3:35 - 3:55 p.m. | A Tool to Assess Environmental Remediation Response to CO2 Leakage | Eusebius J. Kutsienyo, Pacific Northwest National Laboratory ► PRESENTATION PDF
Geologic carbon storage (GCS) is used to securely store CO2 deep in geological formations, preventing its release into the atmosphere and mitigating global warming. The Department of Energy's National Risk Assessment Partnership (NRAP) addresses the challenges of simulating the physical response of a GCS site to large-scale injection and storage over time. NRAP develops computational tools and workflows that can quantitatively assess the risks and potential liabilities associated with GCS, addressing critical stakeholder questions to support the deployment of commercial carbon capture and storage technology. Coauthors: Pehman Rasouli (Pacific Northwest National Laboratory), Kyle Wilson (Pacific Northwest National Laboratory), Nicolas J Huerta (Pacific Northwest National Laboratory), Ashton Kirol (Pacific Northwest National Laboratory), Delphine Appriou (Pacific Northwest National Laboratory)
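To make the style of quantitative assessment concrete, here is a minimal Monte Carlo sketch that propagates uncertain leakage parameters through a toy model to a probability of exceedance. The model form, parameter ranges, and threshold are invented for illustration and are not NRAP's.

```python
# Toy Monte Carlo risk estimate: sample uncertain leakage parameters and
# count how often a simple leakage model exceeds a threshold. Illustrative
# only; the model and numbers are invented, not NRAP's.
import random

random.seed(42)
N = 10_000
threshold = 0.01  # hypothetical impact threshold (kg/s of CO2 leakage)
exceedances = 0
for _ in range(N):
    perm = 10 ** random.uniform(-14, -11)    # leaky-well permeability (m^2)
    overpressure = random.uniform(0.1, 2.0)  # at the well (MPa)
    leak_rate = 3e11 * perm * overpressure   # toy linear leakage model
    if leak_rate > threshold:
        exceedances += 1
print(f"Estimated probability of exceedance: {exceedances / N:.3f}")
```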
3:55 - 4:15 p.m. | Principal Component Analysis (PCA) To Compute Major Directions for 3D Spatial Analysis | Swasti Saxena, Pacific Northwest National Laboratory ► PRESENTATION PDF
Accurate spatial modeling is essential for analyzing and predicting the spread of contaminants in subsurface environmental remediation investigations. Variogram models, which are commonly used in conventional spatial modeling techniques, often assume isotropy by treating all directions as having the same degree of spatial variability. However, geology, groundwater flow, and human activity, among other factors, can make subsurface environments anisotropic, exhibiting distinct properties in different directions. Subsurface contaminant dispersion models based on the isotropic assumption may be oversimplified and produce inaccurate results. Typical approaches to addressing anisotropy select one, two, or three major directions along which variation, and hence the variograms, may be unique. Existing literature suggests choosing directions based on visual inspection of variograms in numerous directions, assessment of the differences between variogram parameter estimates, and/or prior knowledge about subsurface characteristics. Coauthors: Moses Obiri (Pacific Northwest National Laboratory), Jen Huckett (Pacific Northwest National Laboratory), Deb Fagan (Pacific Northwest National Laboratory)
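A minimal sketch of the approach the title describes, assuming the major directions are taken from an eigendecomposition of the centered sample-coordinate covariance; the plume coordinates are synthetic.

```python
# PCA on 3D sample coordinates to find major directions for directional
# variograms. The synthetic plume is elongated in the horizontal plane and
# nearly flat vertically; real inputs would be contaminant sample locations.
import numpy as np

rng = np.random.default_rng(7)
xyz = rng.normal(size=(500, 3)) * [40.0, 8.0, 2.0]  # anisotropic point cloud
theta = np.radians(30)                               # rotate 30 deg about z
rot = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                [np.sin(theta),  np.cos(theta), 0.0],
                [0.0, 0.0, 1.0]])
xyz = xyz @ rot.T

centered = xyz - xyz.mean(axis=0)
eigvals, eigvecs = np.linalg.eigh(np.cov(centered, rowvar=False))
for rank, i in enumerate(np.argsort(eigvals)[::-1], start=1):
    v = eigvecs[:, i]
    azimuth = np.degrees(np.arctan2(v[0], v[1])) % 180  # degrees from north
    print(f"direction {rank}: variance {eigvals[i]:8.1f}, azimuth {azimuth:5.1f} deg")
```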
4:15 - 4:35 p.m. | Geoframework Models: A Force Multiplier | Tollef Winslow, Central Plateau Cleanup Company ► PRESENTATION PDF
Geoframework models are being used to make quicker and more informed decisions to support the remediation of contaminated soil and groundwater at the Hanford Nuclear Reservation, a legacy Manhattan Project site. These three-dimensional subsurface representations have become a force multiplier for scientists with the Central Plateau Cleanup Company (CPCCo), under contract with the Department of Energy (DOE), who are tasked with cleaning up the 580-square-mile site. With over 12,000 boreholes with associated geophysical logs, plus numerous seismic and geologic studies, it is difficult to analyze and correlate the data into a coherent geologic interpretation. To help with this task, CPCCo developed and maintains a geoframework model system that allows for the import of all of this valuable information. Visualization and standardization of the information allow for consistent geologic interpretation among geoscientists. The power of readily accessible, consistently interpreted geology allows scientists to plan wells with speed and flexibility and make drilling decisions in real time. Geoframework models have become a force multiplier, allowing for safer operations and faster, more accurate decisions while reducing the number of personnel and the time needed to make decisions and perform tasks.
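As one small illustration of what such a system automates, the sketch below grids the elevation of a single geologic contact from hypothetical borehole picks so that every interpreter works from the same surface. Production geoframework tools add QA, faulting, and stratigraphic rules on top of this kind of step.

```python
# Grid a geologic contact surface from borehole picks; the picks are invented.
import numpy as np
from scipy.interpolate import griddata

# (easting, northing, contact elevation) picks from borehole logs
picks = np.array([
    [0.0,     0.0, 120.4],
    [500.0,   0.0, 118.9],
    [0.0,   500.0, 121.7],
    [500.0, 500.0, 117.2],
    [250.0, 250.0, 119.5],
])
gx, gy = np.meshgrid(np.linspace(0, 500, 51), np.linspace(0, 500, 51))
surface = griddata(picks[:, :2], picks[:, 2], (gx, gy), method="linear")
print(f"mean contact elevation: {np.nanmean(surface):.1f} m")
```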
4:35 - 5:00 p.m. | Open Discussion and Closing Remarks