RemPlex 2025 Summit - Technical Session - Data Management
Data Management as a Foundation for Environmental Decision-Making
November 4, 2025, 1:00 p.m.

Successful environmental remediation depends on informed decision-making at every level, from field operations to executive planning. Data management plays a crucial role in ensuring data availability for analysis, optimizing resource allocation, reducing costs, and maximizing environmental outcomes.
This session will explore the key attributes of robust data management—Visibility, Accessibility, Understandability, Interoperability, Trustworthiness, and Security (VAULTS)—and their impact on remediation success. By integrating accurate, complete, current, and secure data, organizations can prioritize actions, eliminate redundancy, and focus investments where they deliver the greatest environmental and economic benefits.
Effective data management enables users to transform raw environmental data into actionable insights, streamline workflows, enhance collaboration across teams, identify high-priority remediation areas, optimize resource allocation, and leverage interoperable systems to link datasets and improve transparency. It also builds confidence in decision-making by ensuring information is accurate and reliable. This session will highlight innovative tools and methodologies that enhance remediation efforts while ensuring cost-effective outcomes.
This session will cover topics such as: improving adherence to FAIR (Findable, Accessible, Interoperable, and Reusable) data principles; advancing tools and techniques for data loading, processing, quality assurance, and traceability; developing innovative methods for interpreting environmental data; and sharing best practices and lessons learned in environmental data management.
Join us to explore how cutting-edge approaches to data management are shaping the next generation of environmental remediation.
Session Organizers: Siobhan Kitchen, ddms, Inc.; Annette Moore, U.S. Department of Energy, Office of Legacy Management
1:00 - 1:05 p.m. | Opening Remarks |
1:05 - 1:25 p.m. Environmental Data Management at Los Alamos National Laboratory Sean Sandborgh, Newport News Nuclear BWXT-Los Alamos, LLC (N3B-Los Alamos) | Newport News Nuclear BWXT-Los Alamos, LLC (N3B) implements the U.S. Department of Energy’s Office of Environmental Management Los Alamos Legacy Cleanup Contract (LLCC) at Los Alamos National Laboratory (LANL). LLCC water, soil, rock, soil vapor, and debris samples produce over three million data points annually. This data set includes, but is not limited to, analytical chemistry results, groundwater levels, field measurements, lithology data, field sample collection data, and location information. N3B’s data collection and management strategy consists of interconnected systems. LANL environmental data are stored in the cloud-based Environmental Information Management (EIM) database, which is shared and managed by the EIM Software Change Control Board (SCB). SCB entities are responsible for collecting environmental data within and surrounding LANL for environmental remediation and monitoring, as well as cleanup and oversight activities. Field data can be entered directly into EIM or uploaded through the Locus Mobile iOS application. Analytical laboratory results are uploaded to EIM by the laboratories and then reviewed by chemists and data managers to verify and validate the data. IntellusNM.com, the public web portal of EIM, provides continuous access to LANL’s environmental data. IntellusNM offers user-friendly query capabilities for regulators, external stakeholders, scientists, and members of the public to access the data held in EIM. The tools used at N3B allow for full transparency and accessibility of environmental data generated at LANL, resulting in increased accountability and trust with LANL’s stakeholders. Coauthors: William Donaldson, John P. Garrett, Angelica Maestas, Paul Mark, Jen Patureau, Helen Westbrook, and Corey White (N3B-Los Alamos) |
1:25 - 1:45 p.m. EQuIS: Transforming Environmental Data Governance, Automation, and Insight for Sustainable Decision-Making Taylor Ziolkowski, EarthSoft, Inc. | Effective environmental remediation requires more than data—it demands systems that transform information into actionable insights. EarthSoft’s EQuIS platform enables this transformation by embedding strong data governance, automation, and sustainability into environmental data management. It supports complex datasets across air, water, soil, and biota monitoring programs while ensuring data remains visible, accessible, understandable, interoperable, trustworthy, and secure (VAULTS). EQuIS enforces data quality through configurable business rules, automated workflows, and role-based access controls. These features reduce manual effort, improve compliance, and accelerate decision-making. Its hybrid architecture—supporting centralized databases and federated access—enables cross-system analysis without duplication, enhancing collaboration and transparency. Two key tools within the EQuIS ecosystem—PlanEngage and Helios—extend its capabilities. PlanEngage allows users to visualize environmental and geotechnical data through interactive dashboards, GIS layers, and narrative storytelling. This dynamic, web-based approach improves understanding, supports field planning, and facilitates communication with regulators and communities. By presenting data in an intuitive format, PlanEngage builds trust and promotes transparency—critical for sustainable remediation. Helios addresses the challenge of unstructured data, extracting insights from field notes, PDFs, images, and other non-tabular sources using Microsoft AI services. It can summarize documents, identify sensitive content, and integrate findings into broader workflows. This expands the range of data available for decision-making, improving site characterization and remediation planning.
Together, these tools support a data ecosystem aligned with FAIR (Findable, Accessible, Interoperable, and Reusable) principles and adaptable to evolving environmental challenges. EQuIS helps organizations reduce redundancy, optimize resources, and treat environmental data as a strategic asset—laying the foundation for informed, cost-effective, and sustainable remediation. Coauthor: Dan Alexander (EarthSoft, Inc.) |
1:45 - 2:05 p.m. Insights in Environmental Data Management Planning: Lessons Learned and Best Practices Samantha Bennett, ddms, Inc. | It is no secret that a good environmental data management system (EDMS) can lower costs, increase efficiency, build confidence in data, and lead to new insights by leveraging an organization’s data assets. Properly implemented EDMSs allow organizations to make the best decisions on everything from site characterization to site remediation to informed regulatory reporting. Planning a successful EDMS blueprint starts with aligning data collection and management plans with the questions the organization needs to answer and the issues it needs to resolve. An EDMS needs to be adaptable to a seemingly ever-increasing list of requirements and complexities, making data more accessible, more immediate, and more readily transformable into actionable information. Developing and implementing a successful EDMS can be a huge challenge. It requires a team that is committed to making the transition, and a unicorn: a person with the experience to bridge the gap between data technology, real-world field data collection, data analysis, and data reporting and visualization. In this presentation we will review lessons learned and best practices related to gathering requirements, technology selection, stakeholder engagement, and standardization. |
2:05 - 2:25 p.m. Beyond CAS: A Shared Global Reference System to Ensure FAIR (Findable, Accessible, Interoperable, and Reusable) Data Management in the Environmental Industry Jane Mathisen, EarthScience Information Systems (EScIS) | Environmental data management solution (EDMS) providers often rely on CAS (Chemical Abstracts Service) numbers as chemical identifiers, yet CAS coverage gaps and ambiguities with mixtures or chemical classes pose significant challenges. Many environmental analytes, such as transformation products, total petroleum hydrocarbons (TPHs), and proprietary mixtures, lack CAS numbers altogether. Additionally, groups of compounds like polychlorinated biphenyls (PCBs) are represented by multiple identifiers, complicating data aggregation and interpretation. In practice, data not represented by a CAS number are stored using various nomenclature as dictated by the end-user, often varying on a project-by-project basis. This both adds a significant burden to laboratories, which must maintain multiple libraries for such values on a by-client or by-project basis, and increases the costs associated with transferring data between data owners over time. As the common denominator for data management, we suggest that EDMS providers play an important role in addressing variation in non-CAS environmental data management by mandating the use of a shared global reference system at the laboratory level, such that Electronic Data Deliverables (EDDs) are remitted to end-users with FAIR use in mind and without requiring end-users to be FAIR subject matter experts. Coauthor: Nick Tumney (EScIS) |
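The shared-reference idea above can be sketched as a simple alias crosswalk applied before an EDD leaves the laboratory. This is a minimal illustration only: the identifier ENV-TPH-C10-C40 and the alias list are invented for this sketch and are not part of any actual reference system.

```python
# Hypothetical sketch: normalize lab-reported analyte names (which often lack
# CAS numbers) to a shared reference identifier. Identifiers and aliases are
# invented for illustration.
SHARED_REFERENCE = {
    "ENV-TPH-C10-C40": {
        "tph (c10-c40)",
        "total petroleum hydrocarbons c10-c40",
        "tph c10-40",
    },
}

def to_shared_id(lab_name):
    """Map one lab-specific analyte name to its shared reference ID, if any."""
    key = lab_name.strip().lower()
    for ref_id, aliases in SHARED_REFERENCE.items():
        if key in aliases:
            return ref_id
    # Unmapped names are flagged for curation rather than stored ad hoc.
    return None
```

With a crosswalk of this kind maintained centrally, each laboratory resolves its local nomenclature once, and end-users receive consistent identifiers without needing to be FAIR subject matter experts.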
2:25 - 2:45 p.m. | Open Discussion |
2:45 - 3:15 p.m. | BREAK |
3:15 - 3:35 p.m. Hanford Data Visualization and Analysis in SOCRATES to Support Remedy Performance Assessment and Optimization Christian Johnson, Pacific Northwest National Laboratory | The ability to visualize and analyze environmental data is critical to supporting communication, planning, and decision-making in environmental remediation. The SOCRATES web application suite of modules provides tools for visualization and analysis of data for the U.S. Department of Energy Hanford Site in southeastern Washington state. While there are many ongoing cleanup activities at Hanford, remediation of carbon tetrachloride (CT), technetium-99, and other contaminants in the groundwater aquifer in the Central Plateau region is a significant effort. A pump-and-treat (P&T) remedy has been implemented to capture the roughly 15 km² CT plume (and other contaminants) and reduce concentrations over a 25-year time frame to levels at which the remedy can transition to monitored natural attenuation. To meet objectives for capture and removal of CT mass from the aquifer, it is important to assess the P&T remedy by examining extraction well performance and using numerical predictive simulations. The HYPATIA module of SOCRATES has been configured to give users direct access to key P&T extraction well performance information for pre-defined groups of wells. A panel of plots for a selected well group visualizes groundwater concentrations, pumping rates, water levels, and/or mass recovery for each well, allowing the user to easily identify higher- versus lower-performing wells with respect to contaminant mass extraction. This information supports assessment of the overall remedy performance and guides decisions about how to optimize the well network locations and pumping rates to maintain effective contaminant mass removal.
The ORIGEN module offers complementary functionality for visualization of the three-dimensional (3D) nature of subsurface hydrogeology and contaminant plume distributions. Numerical simulation results of CT plume distribution over time can be visualized as isosurfaces or 3D scatter plot data within the 3D context of the Hanford geologic model. These results are important for communication, understanding remedy performance, and feeding into optimization analysis for the P&T system. This presentation will illustrate the features of the HYPATIA and ORIGEN tools and how they facilitate assessment of remedies and support the remediation decision-making process. Coauthors: Delphine Appriou (consultant); Jennifer Fanning, Jesus Fernandez, Frank Lopez Jr., Hung Luu, Ashton Kirol, Angelica Vargas, Reem Osman, Sophie Baur, Ross Cao, Tycko Franklin, and Patrick Royer (Pacific Northwest National Laboratory) |
3:35 - 3:55 p.m. Integrated Spatial Analytics and Real-Time Data Management for Optimized Soil Reuse in Large-Scale Industrial Remediation Victoria Ward, Woodard & Curran, Inc. | Soil management classification and tracking are major challenges when performing redevelopment-driven remediation at former industrial sites. At a large, historically industrial site in Connecticut undergoing redevelopment, our team has implemented a data-driven soil management strategy that prioritizes on-site reuse, minimizing environmental impact. While maximizing on-site reuse of soil reduces the environmental impact of remediation, it increases the complexity of the soil management puzzle, especially at a large site where over 4,000 soil samples have been collected over the past several decades. To meet this challenge, we implemented a site-wide grid system and developed a series of Python scripts to classify soil samples and grid cells based on regulatory exceedance thresholds. Excavated soil is then assigned to stockpiles and final grid cells based on a combination of factors: the need for fill to meet the final redevelopment grade and whether the soil management category aligns with the planned future land use at that grid cell. For instance, soil consistent with native material may be used as fill anywhere on the site, but soil in exceedance of certain regulatory criteria may need to be covered by several feet of clean fill, a paved surface, or a building. In addition to the scripted approach to soil classification, we developed field data collection tools using Esri’s Field Maps technology to track soil excavation, stockpiling, and filling during redevelopment work. This approach enables real-time tracking, precise delineation, and adaptive management of soil to facilitate a redevelopment-based remediation approach. This presentation will highlight how automation, spatial analysis, and field-based decision-making tools are being leveraged to streamline remediation.
The methodology not only enhances efficiency, but also offers a replicable model for other large, complex remediation projects. Coauthors: Sylvia Rathmell, Eric Sneesby, Dan Brockmeyer, Sam Olney, Nick Hastings, and Katherine Elich (Woodard & Curran, Inc.) |
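The classification logic described in this abstract can be sketched in Python along the following lines. The category names, analytes, and threshold values below are invented for illustration; they are not the project's actual regulatory criteria or scripts.

```python
# Hypothetical sketch of classifying soil samples and grid cells against
# regulatory exceedance thresholds. Categories, analytes, and limit values
# (mg/kg) are illustrative assumptions, not the project's actual criteria.
CATEGORIES = [
    # Listed least to most restrictive in reuse terms; a sample takes the
    # first category whose thresholds all of its results meet.
    ("reuse-anywhere", {"lead": 100.0, "arsenic": 10.0}),
    ("reuse-under-cover", {"lead": 400.0, "arsenic": 40.0}),
]
ORDER = ["reuse-anywhere", "reuse-under-cover", "off-site-disposal"]

def classify_sample(results):
    """Return the soil management category for one sample's analyte results."""
    for name, limits in CATEGORIES:
        if all(results.get(analyte, 0.0) <= limit
               for analyte, limit in limits.items()):
            return name
    return "off-site-disposal"  # exceeds all reuse thresholds

def classify_cell(samples):
    """A grid cell takes the most restrictive category among its samples."""
    return max((classify_sample(s) for s in samples), key=ORDER.index)
```

For example, a cell containing one clean sample and one sample with 350 mg/kg lead would classify as "reuse-under-cover", so fill placed there would need to align with a covered future land use.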
3:55 - 4:15 p.m. Utilizing Automated Data Review to Evaluate Large Data Sets at Los Alamos National Laboratory Corey White, Newport News Nuclear BWXT-Los Alamos, LLC | Newport News Nuclear BWXT-Los Alamos, LLC (N3B) implements the U.S. Department of Energy’s Office of Environmental Management Los Alamos Legacy Cleanup Contract at Los Alamos National Laboratory (LANL). In 2024, N3B met several milestones in support of environmental remediation and monitoring programs at LANL. This effort required N3B chemists to review approximately two million analytical data points from multiple analyses and various matrices on an expedited schedule. The quality and defensibility of analytical data from third-party, off-site laboratories are key components of effective remediation and monitoring. At N3B, data quality is ensured through a review process that includes data examination, verification, and validation. The examination and verification steps are completed simultaneously by an N3B chemist with the assistance of the automated data review (ADR) module in N3B’s environmental information management (EIM) database. Processing the electronic data deliverable (EDD) through the ADR module allows chemists to quickly determine if all the requested data have been returned by the laboratory. It identifies and qualifies deficiencies in analytical data using quality control (QC) samples reported by the laboratory in the EDD. The ADR module also allows for an efficient check against stored action limits and historical data, where applicable. The chemist uses the ADR module to ensure that the EDD agrees with the data package and to determine if there are significant data quality deficiencies that require a more intensive review through manual validation. The automation is invaluable, given that many individual data sets contain over 5,000 field sample and QC records.
The EIM system also randomly selects a percentage of data sets for manual validation, determined during the sample planning phase of the project. The ADR system, with validation by experienced chemists, provides stakeholders with confidence in the quality of data being used to make cleanup decisions at LANL. Coauthors: Dr. Sean Sandborgh, Jen Patureau, Helen Westbrook, Dr. John Garrett, Paul Mark, Angelica Maestas, and William Donaldson (N3B-Los Alamos) |
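Two of the automated checks described above, EDD completeness and comparison against stored action limits, can be sketched as follows. This is a hedged illustration only: the record layout, field names, and the carbon tetrachloride limit are assumptions for the sketch, not N3B's actual ADR schema or limits.

```python
# Hypothetical sketch of two ADR-style checks: completeness of a laboratory
# EDD against the requested analyses, and flagging of results that exceed
# stored action limits. Fields and limit values are illustrative assumptions.
ACTION_LIMITS = {"carbon tetrachloride": 5.0}  # illustrative limit, ug/L

def check_completeness(requested, edd_records):
    """Return requested (sample_id, analyte) pairs missing from the EDD."""
    reported = {(r["sample_id"], r["analyte"]) for r in edd_records}
    return sorted(set(requested) - reported)

def check_action_limits(edd_records):
    """Flag records whose reported result exceeds a stored action limit."""
    flags = []
    for r in edd_records:
        limit = ACTION_LIMITS.get(r["analyte"])
        if limit is not None and r["result"] > limit:
            flags.append((r["sample_id"], r["analyte"], r["result"], limit))
    return flags
```

In a workflow of this shape, any missing results or action-limit flags would prompt the chemist to decide whether the data set needs full manual validation, which is the triage role the abstract describes for the ADR module.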
4:15 - 4:35 p.m. REMPOR Web Application for Access to Site and Technology Information Christian Johnson, Pacific Northwest National Laboratory | The global inventory of operational nuclear reactors, including their lifecycle stages and decommissioning status, is well-documented and readily accessible. In contrast, comprehensive data on radioactively contaminated sites undergoing remediation or requiring remediation remains fragmented and inconsistently reported. While some International Atomic Energy Agency (IAEA) Member States maintain national-level records, a centralized, global repository is lacking. To address this information gap, the IAEA has initiated the development of a comprehensive knowledge platform focused on remediation efforts. Beyond the mere cataloging of contaminated sites, the initiative emphasizes the systematic documentation of remediation strategies, including contaminant profiles, spatial extent of contamination, technologies employed, associated costs, and implementation outcomes. This information is critical for supporting evidence-based decision-making by stakeholders engaged in site remediation, offering insights into international best practices and lessons learned. To this end, the IAEA has conceptualized the Remediation Portal (REMPOR) web application, designed to serve as a global repository of radioactively contaminated sites. The underlying Radioactively Contaminated Sites (RACSI) database integrates data from existing IAEA resources such as the UDEPO (Uranium DEPOsit) and URECSO (Uranium REClaimed SOurces) databases, which focus on uranium mining and processing sites. Leveraging artificial intelligence, the IAEA has enhanced data acquisition and integration into RACSI, which will also include information on nuclear sites, sites affected by nuclear and radiological accidents, test sites, and sites affected by naturally occurring radioactive materials (NORM).
Currently, the database includes over 500 site entries, with ongoing expansion. A distinguishing feature of REMPOR is its linkage to site-specific remediation technology information, with details on technology principles, applicability, effectiveness, and deployment methodologies. Rather than duplicating existing technical descriptions, REMPOR curates and connects users to publicly available remediation technology resources, facilitating informed technology selection. Additionally, REMPOR incorporates eLearning modules and multimedia training materials about the remediation process and remediation technologies to support capacity building among Member States. The extensible database structure for REMPOR has been completed, and the web-based interface is under development by the IAEA’s IT infrastructure team. Future enhancements will include satellite imagery to provide visual context on site conditions, as well as demographic data for surrounding areas. Coauthors: Horst Monken-Fernandes (IAEA); Mathias Fritz (GEOS Ingenieurgesellschaft); Akira Asahara, Elizabeth Njoroge, and Soraya Benalcazar (IAEA); Felipe Tavares (Geological Survey of Brazil); Mark Mihalasky (IAEA) |
4:35 - 5:00 p.m. | Open Discussion and Closing Remarks |