2014 GSA Annual Meeting in Vancouver, British Columbia (19–22 October 2014)

Paper No. 288-12
Presentation Time: 11:00 AM


PUNYASENA, Surangi W., Department of Plant Biology, University of Illinois, 505 S. Goodwin Avenue, Urbana, IL 61801, TCHENG, David K., Illinois Informatics Institute, National Center for Supercomputing Applications, University of Illinois, 1205 W. Clark St., Room 1008, Urbana, IL 61801, FOWLKES, Charless C., Department of Computer Science, University of California, Irvine, CA 92697, MIO, Washington, Department of Mathematics, Florida State University, 1017 Academic Way, Tallahassee, FL 32306-4510 and SHYU, Chi-Ren, Informatics Institute, University of Missouri, Columbia, MO 65211, punyasena@life.illinois.edu

Automated, machine-based pollen analysis holds the potential not only to increase the throughput of palynological analysis, producing more pollen counts per expert per unit of time, but also to improve the nature of the data collected, by increasing the taxonomic resolution of identifications and by providing explicit estimates of identification uncertainty. The reality, however, is that biological identification is a complex task that cannot easily be mimicked by a machine.

We have been collaboratively exploring the landscape of potential approaches to fully automating pollen identification and have developed a framework for the digital study of fossil pollen material that incorporates automated slide imaging, image analysis, and machine learning. This partnership benefits from a multidisciplinary approach that draws on expertise in computer vision, machine learning, informatics, and applied mathematics. We present three successes: first, an automated imaging and digital annotation workflow that could potentially be replicated by all palynologists; second, examples of learning systems and algorithms that have been applied to the discrimination of morphologically similar species; and third, a prototype of a content-based retrieval database for pollen images that intelligently captures expert knowledge.