MINOR AND TRACE ELEMENT ANALYSES USING LASER-INDUCED BREAKDOWN SPECTROSCOPY OF DOPED STANDARDS
For the LIBS spectra of any individual rock matrix, the areas of diagnostic emission peaks increase with concentration, producing useful calibration curves. However, minor element peak intensities vary dramatically among different matrices, both in raw data and in spectra normalized to total counts. Sea sand produces the lowest-intensity spectra in both cases, perhaps because its composition couples poorly with the laser. For all five elements studied, un-normalized ultramafic spectra have nearly double the raw peak intensities of the other four matrices, yet nearly the lowest normalized intensities. Moreover, each trace element appears to require two distinct calibration curves: one for concentrations <1000 ppm and one for concentrations >1000 ppm. These results show that the success of univariate calibrations depends strongly on a close match between the matrix and concentration range of the standards and those of the unknown(s), which is challenging in remote applications with diverse rock types.
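The univariate approach described above can be sketched as a simple least-squares line relating peak area to dopant concentration, inverted to predict an unknown. The data below are purely illustrative placeholders, not measurements from this study, and the helper names are our own:

```python
import numpy as np

def univariate_calibration(peak_areas, concentrations):
    """Fit a least-squares line: peak area ~ slope*concentration + intercept.

    Mimics a single-matrix calibration curve from doped standards.
    """
    slope, intercept = np.polyfit(concentrations, peak_areas, 1)
    return slope, intercept

def predict_concentration(area, slope, intercept):
    # Invert the calibration line to estimate an unknown's concentration.
    return (area - intercept) / slope

# Hypothetical calibration set for one matrix: dopant concentrations (ppm)
# and diagnostic peak areas normalized to total spectral counts.
conc = np.array([100.0, 250.0, 500.0, 1000.0, 2500.0])
area = np.array([0.8, 2.1, 4.0, 8.1, 20.2])

m, b = univariate_calibration(area, conc)
unknown = predict_concentration(4.0, m, b)  # close to the 500 ppm standard
```

As the abstract notes, such a fit only transfers to unknowns whose matrix and concentration range resemble the standards; the split at 1000 ppm would require fitting two separate lines of this form.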
To mitigate matrix effects and create more broadly applicable models for predicting minor elements, we applied partial least squares (PLS) multivariate analysis and quantified prediction errors by cross-validation. The resulting models have error bars reduced by roughly 50% and generalize better across matrices, making them useful for understanding subtle variations in stratigraphy as well as important differences between geological units.