Paper No. 2
Presentation Time: 9:15 AM


KESSLER, Holger, British Geological Survey, Kingsley Dunham Centre, Keyworth, Nottingham, NG12 5GG, United Kingdom and LARK, Murray, British Geological Survey, Environmental Science Centre, Keyworth, Nottingham, NG12 5GG, United Kingdom

The British Geological Survey develops 3D geological models for use by practising geoscientists, who make real decisions based on the model data. It is essential that these decisions account adequately for the uncertainty in the models, but quantifying and representing this uncertainty is not straightforward, because of the disparate sources of uncertainty in a model. In general, these sources are (i) the quality and density of input data (standards of interpretation and logging of boreholes, quality of location information, etc.), (ii) the quality of expert interpretation of data (depending on expertise and local geological complexity) and (iii) the propagation of errors through mechanical steps in model development (triangulation, etc.).
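To make source (iii) concrete, a minimal sketch of how uncertainty can propagate through one mechanical modelling step is given below. The abstract does not specify a method; this example assumes, purely for illustration, Monte Carlo propagation of borehole elevation uncertainty through a linear interpolation between two hypothetical boreholes. All borehole positions, elevations and standard deviations are invented values.

```python
import random
import statistics

def interpolate_elevation(x, x1, z1, x2, z2):
    """Linearly interpolate a surface elevation at position x
    between two boreholes at positions x1 and x2."""
    t = (x - x1) / (x2 - x1)
    return z1 + t * (z2 - z1)

def monte_carlo_uncertainty(x, x1, z1, sd1, x2, z2, sd2, n=10000, seed=42):
    """Propagate Gaussian uncertainty in the two logged borehole
    elevations (standard deviations sd1, sd2) through the
    interpolation by repeated sampling; return the mean and
    standard deviation of the interpolated elevation."""
    rng = random.Random(seed)
    samples = [
        interpolate_elevation(x, x1, rng.gauss(z1, sd1),
                              x2, rng.gauss(z2, sd2))
        for _ in range(n)
    ]
    return statistics.mean(samples), statistics.stdev(samples)

# Hypothetical boreholes 100 m apart; logged surface tops at 12 m
# and 18 m, with +/-0.5 m and +/-1.0 m uncertainty respectively.
mean_z, sd_z = monte_carlo_uncertainty(50.0, 0.0, 12.0, 0.5,
                                       100.0, 18.0, 1.0)
print(f"interpolated elevation: {mean_z:.1f} m +/- {sd_z:.2f} m")
```

In a real workflow the same idea extends to triangulated surfaces: each vertex elevation is perturbed within its data uncertainty, the surface is rebuilt many times, and the spread of the results maps where the model is least constrained.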

Following significant engagement with potential users of 3D data, two conditions emerged that were deemed necessary for managing the implications of model uncertainty. First, the data that were originally interpreted during model development should be delivered with the model, so that the user can intuitively assess the inherent uncertainties and decide whether they would have interpreted the data in a similar manner. Second, the methodologies employed should be fully open and transparent. Whilst this approach is viable at the local scale, quantifying uncertainty remains a significant challenge when delivering outputs from large (national-scale) models, where the original data are too numerous and complex to consider as individual elements; a mathematical methodology to estimate uncertainty is still very much required in this scenario, and we review some steps toward it.