2002 Denver Annual Meeting (October 27-30, 2002)

Paper No. 12
Presentation Time: 4:30 PM


MALEY, Michael P.1, DEMIR, Zafer2, HOFFMAN, Fred2, MANSOOR, Kayyum2 and NOYES, Charles M.2, (1)ETIC Engineering, 2285 Morello Ave, Pleasant Hill, CA 94523, (2)Environmental Restoration Division, Lawrence Livermore National Lab, P.O. Box 808, Livermore, CA 94551, mmaley@eticeng.com

Large environmental remediation projects require numerous decisions regarding technical and regulatory issues. These decisions typically depend on data interpretation by technical analysts, who apply site-specific and professional knowledge to evaluate the significance of the data. For many large sites, simply handling the large database of ground water, chemical, and geological data can prove a difficult and time-consuming task. Many sites have invested in geographic information systems (GIS) to address this problem. However, these applications still require input from technical analysts to integrate and interpret the wide range of data and produce results that are useful to decision makers. Computer algorithms that automate this integration are therefore a cost-effective means of helping analysts prepare the maps managers need to make timely project decisions.

At Lawrence Livermore National Laboratory (LLNL), the development of a consistent set of ground water elevation, plume, and geology maps covering the entire project history was essential for implementing and managing the extensive remediation system. These maps were required for each hydrostratigraphic unit (HSU) defined in the site conceptual model. To accomplish this task, a rule-based computer algorithm was developed that integrates multiple data sets both spatially and temporally according to a consistent, rigorous set of rules. The resulting maps were reviewed against the site hydrogeological conceptual model to verify accuracy and consistency. Once the procedure was verified, the algorithm allowed rapid generation of thousands of dependable site maps covering the entire project history.
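The abstract does not publish the rule set itself, but the core idea of a rule-based selection step can be sketched as follows. This is a minimal, hypothetical illustration, not LLNL's actual implementation: the data model (`Measurement`), the field names, and the three rules shown (match the target HSU, fall within the mapping period, exclude QC-flagged records, then keep the most recent measurement per well) are all assumptions chosen for clarity.

```python
from dataclasses import dataclass

@dataclass
class Measurement:
    well: str         # well identifier
    hsu: str          # hydrostratigraphic unit (from the site conceptual model)
    date: int         # measurement date, simplified here to days since project start
    elevation: float  # ground water elevation
    flagged: bool     # QC flag set during data review

def select_for_map(measurements, hsu, period_start, period_end):
    """Apply illustrative selection rules for one map:
    keep only the target HSU, only dates inside the mapping period,
    drop QC-flagged records, and retain the most recent
    measurement per well within the period."""
    latest = {}
    for m in measurements:
        if m.hsu != hsu or m.flagged:
            continue
        if not (period_start <= m.date <= period_end):
            continue
        if m.well not in latest or m.date > latest[m.well].date:
            latest[m.well] = m
    return sorted(latest.values(), key=lambda m: m.well)

data = [
    Measurement("W1", "HSU-2", 10, 510.2, False),
    Measurement("W1", "HSU-2", 20, 509.8, False),  # newer reading for W1
    Measurement("W2", "HSU-2", 15, 512.0, True),   # flagged, excluded
    Measurement("W3", "HSU-1", 12, 500.0, False),  # wrong HSU, excluded
]
selected = select_for_map(data, "HSU-2", 0, 30)
```

Running such a selection over every HSU and every mapping period, then contouring each selected data set, is one way the "thousands of dependable site maps" described above could be generated consistently from a single procedure.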

The purpose of the algorithm is to support the technical analyst by allowing more time to be spent on data analysis rather than data compilation. By using this rule-based computer algorithm, LLNL has been able to maintain strong data analysis support for decision makers even as resources and staff have been redirected to other aspects of the project.

This work was performed under the auspices of the U.S. Department of Energy by the University of California, Lawrence Livermore National Laboratory, under contract No. W-7405-Eng-48.