Center for the Remediation of Complex Sites (RemPlex)

RemPlex 2023 Summit

Technical Session 5

Environmental Data Management, Analysis, and Visualization
Wednesday, November 15, 2023 | 1:00 - 5:00 p.m. Pacific Time 

► Watch the recording:
Session 5: Environmental Data Management, Analysis, and Visualization, November 15, 2023

Environmental restoration work requires management and integration of multiple types and formats of data from multiple authoritative sources (which may change over time), data analysis and interpretation, and effective data visualization. These aspects, along with ensuring that environmental data are findable, accessible, interoperable, and reusable (FAIR), are important for supporting effective communication with site managers, regulators, and stakeholders; technically defensible remediation decision-making; and site operations. This session will focus on tools and approaches for the management, access, analysis, and visualization of environmental data, including traditional monitoring data, geophysics data, remote sensing, near-real-time monitoring, spatiotemporal analysis, and numerical model data. Topics will include data-driven analytics, development and application of software, 2D/3D data visualization, risk assessment, and evaluation of remedy performance.
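A recurring theme of this session is normalizing heterogeneous data into one consistent form. The sketch below is an editorial illustration of that pattern in Python/pandas, not any presenter's system; all column names and values are invented. It melts a wide lab-results table and reconciles it with a sensor feed into a single tidy schema.

```python
import pandas as pd

# Source 1: lab results, one row per sample, one column per analyte (wide).
lab = pd.DataFrame({
    "sample_id": ["W-01", "W-02"],
    "date": ["2023-06-01", "2023-06-01"],
    "nitrate_mg_L": [4.2, 7.9],
    "chromium_ug_L": [11.0, 3.5],
})

# Source 2: a near-real-time sensor feed, already long but with its own names.
sensor = pd.DataFrame({
    "station": ["W-01", "W-01"],
    "timestamp": ["2023-06-01T10:00", "2023-06-01T11:00"],
    "parameter": ["specific_conductance", "specific_conductance"],
    "value": [450.0, 455.0],
    "unit": ["uS/cm", "uS/cm"],
})

# Melt the wide table to long form; recover units from the column names.
long_lab = lab.melt(id_vars=["sample_id", "date"],
                    var_name="parameter", value_name="value")
long_lab[["parameter", "unit"]] = long_lab["parameter"].str.split(
    "_", n=1, expand=True)
long_lab = long_lab.rename(columns={"sample_id": "location",
                                    "date": "timestamp"})
long_lab["source"] = "lab"

sensor_norm = sensor.rename(columns={"station": "location"})
sensor_norm["source"] = "sensor"

# One tidy schema, with provenance, for all downstream tools.
cols = ["location", "timestamp", "parameter", "value", "unit", "source"]
combined = pd.concat([long_lab[cols], sensor_norm[cols]], ignore_index=True)
print(combined)
```

Keeping a `source` column alongside each value is a small step toward the FAIR goals the session description mentions: every row stays traceable to its authoritative origin.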

Session Organizers: Christian D. Johnson, Pacific Northwest National Laboratory; Delphine Appriou, Pacific Northwest National Laboratory


1:00 - 1:05 p.m.

Opening Remarks

__________________________________________________ 

1:05 - 1:25 p.m.

Evaluating an Existing Data Management System

Samantha Bennett, ddms, Inc.

► PRESENTATION PDF

Environmental data management systems often grow organically as data needs evolve and additional use cases are realized. It is common for complex organizations with multiple systems to unknowingly develop redundant or inefficient workflows, many without robust documentation. These redundancies and inefficiencies can unnecessarily increase the cost and complexity of the organization’s overarching data system. 

A critical function of the U.S. DOE Office of Legacy Management (LM) is to protect human health and the environment through effective and efficient long-term surveillance and maintenance of over 100 post-cleanup sites in the United States. To fulfill this critical function, LM manages environmental data for hundreds of sites and continues to acquire legacy data from unique site databases as additional sites transfer into LM’s care. LM has limited staff to transition site data into its environmental data management system, therefore increasing the need for standardization of and strong documentation for both the incoming data and the data already in the system.  

This presentation explores best practices for (1) documenting and evaluating current environmental data workflows and (2) documenting suggested areas of improvement to those workflows. The team participated in a guided SWOT (Strengths, Weaknesses, Opportunities, and Threats) analysis and resource evaluation using documented procedures, interviews, and workshops. They also documented current-state workflows using a series of diagrams and tables, identifying ways to potentially improve the quality, integrity, and accessibility of data in LM's environmental data management system. In all, twenty-two existing workflows and process areas were documented and evaluated for potential improvements.

LM has a robust and functional environmental data management system. In general, its data workflows are thorough, effective, and supported by a very knowledgeable environmental data management team. Even so, the assessment identified several areas for improvement.

Coauthors: Ellen Tomlinson (RSI EnTech, LLC, Contractor to the U.S. Department of Energy Office of Legacy Management), Annette Moore (U.S. Department of Energy Office of Legacy Management), Siobhan Kitchen (ddms Inc.) 
__________________________________________________

1:25 - 1:45 p.m.

Statistical Methods for Subsurface Decommissioning

Jennifer Huckett, Pacific Northwest National Laboratory 

► PRESENTATION PDF

Visual Sample Plan (VSP) software includes tools that implement the guidance outlined in the Multi-Agency Radiation Survey and Site Investigation Manual (MARSSIM) (NUREG-1575) for data collection and statistical analysis to demonstrate compliance in the final status survey of decommissioning projects. Users can apply VSP to potentially contaminated surface soil as well as to areas where unexploded ordnance may exist. New modules and capabilities are continuously added based on users' needs, including current research and development of tools to demonstrate that suspected contaminants in subsurface soil at a site are in compliance with release criteria for radiation dose- or risk-based regulations. As in surface soil cases, primary survey goals for the subsurface include reaching conclusions, via hypothesis testing, about whether decommissioning efforts have sufficiently reduced radiation levels. Sufficient data and statistical modeling at this stage must also allow the principal responsible party (and regulators) to identify areas with potential residual contamination for further investigation.
While a large portion of a site's area is accessible for scanning surface soils, subsurface analyses introduce challenges due to the difficulty and cost of accessing the subsurface. Addressing these challenges requires leveraging additional data sources, including qualitative data (e.g., subject matter expertise, historical site assessments) in combination with quantitative data of varying resolutions and sources (e.g., soil contamination, groundwater characteristics, contaminant plumes, and fate and transport models), as well as data from sampling and analysis performed during decommissioning, to inform final site survey sampling. In this presentation, we outline requirements for data sets resulting from merging disparate sources and present the statistical methods recommended for future VSP releases, along with the remaining challenges.

Coauthors: Debbie Fagan, Lisa Newburn (Pacific Northwest National Laboratory)
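As an editorial illustration of the hypothesis-testing step described in this abstract, the sketch below applies a Sign-test-style one-sided binomial calculation against a release criterion. The measurements and the criterion are invented, and actual MARSSIM/VSP procedures involve far more (survey design, investigation levels, background comparisons) than this.

```python
from math import comb

# Invented survey-unit measurements (pCi/g) and a hypothetical release
# criterion (DCGL). Neither is from the presentation.
measurements = [0.8, 1.1, 0.6, 0.9, 1.4, 0.7, 1.0, 0.5, 1.2, 0.8]
dcgl = 1.5

n = len(measurements)
below = sum(1 for x in measurements if x < dcgl)

# Null hypothesis: the survey unit does not meet the criterion. Under
# H0, each measurement falls below the DCGL with probability 0.5, so
# the one-sided p-value is the binomial tail P(X >= below).
p_value = sum(comb(n, k) for k in range(below, n + 1)) / 2**n
print(f"{below}/{n} measurements below the criterion, p = {p_value:.4g}")
```

Here all ten values fall below the criterion, so the tail probability is (1/2)^10 ≈ 0.001, and the null hypothesis of non-compliance would be rejected at conventional significance levels.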
__________________________________________________

1:45 - 2:05 p.m.

Leveraging Data-Driven Decision Making in a Large-Scale Field Study Simulating a Biological Agent Incident

Michael Pirhalla, United States Environmental Protection Agency

► PRESENTATION PDF

The Analysis for Coastal Operational Resiliency (AnCOR) program is an interagency effort involving the U.S. Environmental Protection Agency (EPA), Department of Homeland Security Science and Technology Directorate (DHS S&T), and the United States Coast Guard (USCG). This presentation will discuss events from the Wide Area Demonstration (WAD) field study of AnCOR. The WAD was conducted in May 2022 and operationally tested and evaluated options for decontamination, sampling, data management, and waste management for USCG assets and outdoor areas impacted by a widespread biological agent release. Over the course of four weeks, nearly 900 samples were collected using various surface and grab sampling methods to determine background, pre-, and post-decontamination levels. One component of designing and executing a successful sampling plan is to have an effective suite of data management tools and systems. Novel techniques and strategies for data management were applied during the WAD, including detailing the roles, processes, and technologies for data acquisition, data management for sample collection, and visualization of results. Dressed in full personal protective equipment (PPE), three-person sampling teams entered the contaminated area with designated sampler, collector, and data management roles. The data manager was equipped with a sub-meter GPS unit and tablet connected to Esri Field Maps, which the team used to navigate to the sample collection locations. The technology allowed each sample to be precisely located and information describing the sample to be documented and uploaded to EPA's Geoplatform, hosted on ArcGIS Online. Staff in the command center monitored real-time sample collection through a dashboard and communicated with the sampling teams via radio. Data management technologies also assisted with post-collection sample shipment and chain of custody generation.
These tools and techniques are poised to enhance the efficiency of data management in response to a large-scale incident, integrating stages from field data collection to analysis and subsequent reporting.

Coauthors: Timothy Boe (United States Environmental Protection Agency, Office of Research and Development), Erin Silvestri (United States Environmental Protection Agency, Office of Research and Development), Shannon Serre (United States Environmental Protection Agency, Office of Land and Emergency Management), Jordan Deagan (Oak Ridge Associated Universities Student Services Contractor to United States Environmental Protection Agency), Matt Blaser (United States Environmental Protection Agency, Region 5)
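The per-sample record-keeping described above can be pictured with a minimal schema. The field names below are invented for illustration; they are not the actual Field Maps/GeoPlatform schema used in the WAD.

```python
import json
from dataclasses import dataclass, asdict

# Hypothetical per-sample record a field data manager might capture on a
# tablet; every field name and value here is an editorial invention.
@dataclass
class SampleRecord:
    sample_id: str
    method: str          # e.g., surface wipe or grab sample
    lat: float           # sub-meter GPS fix
    lon: float
    team: str
    phase: str           # background, pre-, or post-decontamination
    collected_utc: str

def to_jsonl(records):
    """Serialize records as JSON Lines for upload to a GIS platform."""
    return "\n".join(json.dumps(asdict(r)) for r in records)

rec = SampleRecord("WAD-0421", "surface_wipe", 47.651, -122.343,
                   "Team-2", "post_decon", "2022-05-12T14:30:00Z")
print(to_jsonl([rec]))
```

Capturing location, role, and phase with each sample at collection time is what makes the downstream steps the abstract mentions (real-time dashboards, shipment tracking, chain-of-custody generation) possible without re-keying data.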
__________________________________________________

2:05 - 2:25 p.m.

An Extensive Industrial Hygiene Data Analysis and Visualization (IDAV) Toolset

Hongfei Hou, Pacific Northwest National Laboratory

► PRESENTATION PDF

Industrial Hygiene (IH) monitoring and analysis play a crucial role in safeguarding the health and safety of workers in the workplace. Advanced data analysis and visualization tools are highly desirable for making efficient use of the large quantities of IH monitoring data. However, state-of-the-art applications are constrained by limits on the efficiency of analyzing and visualizing large datasets. The Industrial Hygiene Data Analysis and Visualization (IDAV) tool is an integrated suite of scientifically grounded tools designed to address these limitations. IDAV grants access to multiple Hanford site databases and provides analytical tools that allow IH management and staff to comprehensively evaluate potential hazards and devise strategies for worker protection at hazardous waste sites. Sponsored by Washington River Protection Solutions, IDAV is a cloud-based application that conditions and renders data obtained from various IH surveys, including both tank vapor and non-vapor data. It encompasses readings from direct-reading instruments, such as stationary devices and wearable devices utilized by workers, as well as traditional air sampling methods. Utilizing continuously measured data, such as ammonia levels in worker breathing zones, in concert with sample-based data such as volatile organic compounds and mercury, IDAV provides rapid, high-quality assessment of exposure hazards and task risk evaluation. Extending industrial hygienists' data analysis capabilities, IDAV seamlessly integrates with Tableau software, enabling the creation of advanced charts and custom visualizations. Additionally, the IDAV tool suite utilizes Representational State Transfer (REST) APIs for data processing and transmission, reducing response times in the user interface. To maintain data privacy and security, IDAV incorporates OneID authentication.
Currently, IDAV has been successfully deployed at the Hanford site, which stores approximately 56 million gallons of high-level radioactive and chemical waste. It has demonstrated remarkable efficacy in managing risk to workers from the vast array of 1,800 different chemicals present in Hanford's tank waste.

Coauthors: Sadie Montgomery (Pacific Northwest National Laboratory), Scott Upton (Pacific Northwest National Laboratory), Brett Simpson (Pacific Northwest National Laboratory), Scott Clingenpeel (Washington River Protection Solutions), Jason Reno (Washington River Protection Solutions), Eugene Morrey (Washington River Protection Solutions)
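A simple example of the kind of screening calculation an IH toolset automates is an 8-hour time-weighted average (TWA) computed from direct-reading measurements. The readings and the limit below are invented for this sketch and are not IDAV's actual logic or Hanford limits.

```python
# Direct-reading intervals as (duration_hours, concentration_ppm);
# all values are hypothetical.
intervals = [
    (2.0, 4.0),
    (3.0, 1.5),
    (1.0, 12.0),   # short excursion near a vapor source
    (2.0, 0.5),
]

# Standard 8-hour TWA: time-weighted sum of exposures over the shift.
twa = sum(t * c for t, c in intervals) / 8.0

limit_ppm = 25.0  # hypothetical 8-hour exposure limit for the example
status = "below" if twa < limit_ppm else "at/above"
print(f"8-hr TWA = {twa:.2f} ppm ({status} the {limit_ppm} ppm limit)")
```

Note that the short 12 ppm excursion contributes to the TWA but is diluted over the shift; real IH programs also check short-term and ceiling limits, which a single TWA does not capture.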
__________________________________________________

2:25 - 2:45 p.m.

Open Discussion

__________________________________________________

2:45 - 3:15 p.m.

Posters and Vendor Exhibit

__________________________________________________

3:15 - 3:35 p.m.

An Open-Source Information Model and Associated Cyberinfrastructure for Effective Environmental Management and Analytics

Roelof Versteeg, Subsurface Insights

► PRESENTATION PDF

Effective environmental restoration requires the use of heterogeneous data (in situ sensor data, time-lapse electrical geophysical data, remote sensing data, and sampling data), as well as the auditable and repeatable application of analytics and numerical modeling using these data. We will discuss the use of ODMX, an open-source (odmx.org) data management solution for heterogeneous environmental data, and the integration of ODMX with containerized pipelines for data analytics and numerical modeling. ODMX is currently used for both the LBNL SFA and the SRNL Altemis project. ODMX uses automated pipelines to harvest data from a range of standard sources into a powerful database, which can then be accessed through a robust API. Once the database is populated, the data can be discovered and used by containerized numerical models, which can be run periodically and visualized interactively or automatically. This approach provides a robust, automated, and auditable workflow that supports environmental restoration and monitoring.

Coauthors: Rebecca Rubinstein (Subsurface Insights), Reza Soltanian (University of Cincinnati), Doug Johnson (Subsurface Insights)
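The harvest-into-database pattern described above can be sketched as an idempotent ingest: a pipeline that runs periodically must be safe to re-run without duplicating observations. SQLite stands in here for ODMX's actual backend, and the table and column names are invented, not the ODMX information model.

```python
import sqlite3

# In-memory database as a stand-in; the schema is an editorial invention.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE observations (
    site TEXT, parameter TEXT, ts TEXT, value REAL,
    PRIMARY KEY (site, parameter, ts))""")

def harvest(rows):
    """Idempotent ingest keyed on (site, parameter, timestamp):
    re-running the same pipeline batch must not duplicate data."""
    conn.executemany(
        "INSERT OR REPLACE INTO observations VALUES (?, ?, ?, ?)", rows)
    conn.commit()

batch = [("well-1", "water_level_m", "2023-10-01T00:00", 12.4),
         ("well-1", "water_level_m", "2023-10-01T01:00", 12.3)]
harvest(batch)
harvest(batch)  # simulate a scheduled re-run of the same pipeline

n = conn.execute("SELECT COUNT(*) FROM observations").fetchone()[0]
print(f"{n} observations stored")
```

The natural-key primary key is what makes the pipeline auditable and repeatable in the sense the abstract emphasizes: every run converges to the same database state.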
__________________________________________________

3:35 - 3:55 p.m.

A Tool to Assess Environmental Remediation Response to CO2 Leakage

Eusebius J Kutsienyo, Pacific Northwest National Laboratory

► PRESENTATION PDF

Geologic carbon storage (GCS) is used to securely store CO2 deep in geological formations, preventing its release into the atmosphere and mitigating global warming. The Department of Energy's National Risk Assessment Partnership (NRAP) focuses on the challenges of simulating the physical response of a GCS site to large-scale injection and storage over time. NRAP develops computational tools and workflows that can quantitatively assess the risks and potential liabilities associated with GCS, addressing critical stakeholder questions to support the deployment of commercial carbon capture and storage technology.
The current work focuses on risks related to potential CO2 and brine leakage from a storage reservoir and the resulting impacts on Underground Sources of Drinking Water (USDW). We are currently developing a leakage plume analysis tool as part of the NRAP program. This tool aims to provide an estimate of CO2 or brine plume extent and mass in the USDW that may occur in the event of fluid migration through the caprock of the CO2 storage system. By leveraging existing analytical solutions, the tool calculates the temporal and spatial distribution of the plume and helps the user to evaluate potential Environmental Remediation Responses (ERR) and assists in decision-making by providing estimates of leakage, remediation time and associated costs. The tool currently incorporates two ERRs: Pump and Treat, and Monitored Natural Attenuation. 
The web-based tool allows users to input site-specific parameters and assess the impact of individual or combined parameters on the potential cost of their selected ERR. The tool's output informs a financial cost model, which simulates project costs and assesses the financial risks and potential liabilities associated with long-term GCS projects. By addressing critical questions related to long-term risk and liability, this tool contributes to determining the pricing of project financial instruments like loans, bonds, and insurance products.

Coauthors: Pehman Rasouli (Pacific Northwest National Laboratory), Kyle Wilson (Pacific Northwest National Laboratory), Nicolas J Huerta (Pacific Northwest National Laboratory), Ashton Kirol (Pacific Northwest National Laboratory), Delphine Appriou (Pacific Northwest National Laboratory)
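To illustrate the flavor of a closed-form plume-extent estimate, one can treat leaked fluid as filling a cylinder of aquifer pore space. This is an editorial simplification, not the NRAP tool's actual analytical solution, and every parameter value below is invented.

```python
import math

# Hypothetical leakage scenario (all values invented for illustration).
leak_rate_m3_day = 5.0   # constant leak rate through the caprock
days = 365.0             # one year of leakage
thickness_m = 20.0       # USDW aquifer thickness
porosity = 0.25
fill_fraction = 0.5      # fraction of pore space occupied by leaked fluid

# Cylinder approximation: leaked volume = pi * r^2 * b * phi * S,
# solved for the plume radius r.
volume = leak_rate_m3_day * days
radius = math.sqrt(volume / (math.pi * thickness_m * porosity * fill_fraction))
print(f"Approximate plume radius after 1 year: {radius:.1f} m")
```

Even this crude estimate shows why such tools matter for decision-making: plume radius grows only with the square root of leaked volume, so early detection and remediation response (pump-and-treat versus monitored natural attenuation) can be weighed against slowly growing, quantifiable extents.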
__________________________________________________

3:55 - 4:15 p.m.

Principal Component Analysis (PCA) To Compute Major Directions for 3D Spatial Analysis

Swasti Saxena, Pacific Northwest National Laboratory

► PRESENTATION PDF

Accurate spatial modeling is essential for analyzing and predicting the spread of contaminants in subsurface environmental remediation investigations. Variogram models, which are commonly used in conventional spatial modeling techniques, assume isotropy by treating all directions as having the same degree of spatial variability. However, geology, groundwater flow, and human activity, among other factors, can cause subsurface environments to be anisotropic, exhibiting distinct qualities in different directions. Subsurface contamination dispersion models based on the isotropic assumption may be oversimplified and produce inaccurate results. Typical approaches to addressing anisotropy select one, two, or three major directions along which variation, and hence the variograms, may be unique. Existing literature suggests choosing directions based on visual inspection of variograms in numerous directions, assessment of the differences between variogram parameter estimates, and/or prior knowledge about subsurface characteristics.
In this presentation, we propose using principal component analysis (PCA) to replace this subjective inspection process with an automated, more objective method based on collected data. Automating the selection of major directions for variogram fitting is particularly important for practitioners with expertise in contamination and remediation but limited experience with the statistical underpinnings and parameters associated with variogram analysis. Conventionally, PCA is used to reduce the dimensionality of complex datasets (i.e., with numerous interrelated variables) by decomposing the data into a smaller number of independent components (i.e., that represent the variation observed in all of the original variables but with fewer, independent new variables). We will discuss how PCA can be applied to subsurface datasets to determine three orthogonal major directions, bypassing the need for visual inspection and providing a means of generating anisotropic variograms for subsequent subsurface investigation and spatial analysis.

Coauthors: Moses Obiri (Pacific Northwest National Laboratory), Jen Huckett (Pacific Northwest National Laboratory), Deb Fagan (Pacific Northwest National Laboratory)
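The core idea can be sketched on synthetic data: PCA is an eigendecomposition of the coordinate covariance matrix, and on an elongated synthetic plume the leading eigenvector recovers the plume's orientation as the major direction. This is an editorial sketch of the general technique, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic plume: 500 sample locations elongated along an azimuth of
# 30 degrees in the x-y plane, with little vertical (z) spread.
n = 500
stretch = np.diag([50.0, 10.0, 3.0])            # anisotropic spread
theta = np.deg2rad(30)
rot = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                [np.sin(theta),  np.cos(theta), 0.0],
                [0.0, 0.0, 1.0]])
coords = rng.standard_normal((n, 3)) @ stretch @ rot.T

# PCA = eigendecomposition of the sample covariance of the coordinates.
centered = coords - coords.mean(axis=0)
cov = centered.T @ centered / (n - 1)
eigvals, eigvecs = np.linalg.eigh(cov)          # eigenvalues ascending
major = eigvecs[:, ::-1]                        # columns: major, medium, minor

# Azimuth of the major direction (mod 180: it is a direction, not a vector).
azimuth = np.degrees(np.arctan2(major[1, 0], major[0, 0])) % 180.0
print(f"Recovered major-direction azimuth: {azimuth:.1f} degrees")
```

The three eigenvector columns are mutually orthogonal by construction, which is exactly the property needed to define the three directional variograms without any visual inspection of variogram fans.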
__________________________________________________

4:15 - 4:35 p.m.

Geoframework Models: A Force Multiplier

Tollef Winslow, Central Plateau Cleanup Company

► PRESENTATION PDF

Geoframework models are being used to make quicker and more informed decisions to support the remediation of contaminated soil and groundwater at the Hanford Nuclear Reservation, a legacy Manhattan Project site. These three-dimensional subsurface representations have become a force multiplier for scientists with the Central Plateau Cleanup Company (CPCCo), under contract with the Department of Energy (DOE), who are tasked with cleaning up the 580-square-mile site. With more than 12,000 boreholes, associated borehole and geophysical logs, and numerous seismic and geologic studies, it is difficult to analyze and correlate the data into a coherent geologic interpretation. To help with this task, CPCCo developed and maintains a geoframework model system that allows for the import of all of this valuable information. Visualization and standardization of the information allow for consistent geologic interpretation among geoscientists. The power of readily accessible, consistently interpreted geology allows scientists to plan wells with speed and flexibility and make drilling decisions in real time. Geoframework models have become a force multiplier, allowing for safer operations and faster, more accurate decisions while reducing the number of personnel and the time needed to make decisions and perform tasks.
__________________________________________________

4:35 - 5:00 p.m.

Open Discussion and Closing Remarks

__________________________________________________

 
