2004 Denver Annual Meeting (November 7–10, 2004)

Paper No. 12
Presentation Time: 11:20 AM

ASSESSING THE EFFECTIVENESS OF ELECTRONIC STUDENT RESPONSE TECHNOLOGY


HEANEY, Peter J., Dept. of Geosciences, Pennsylvania State Univ, 309 Deike Bldg, University Park, PA 16802 and GREER, Lisa, Geology, Washington and Lee University, Lexington, VA 24450, heaney@geosc.psu.edu

Over five semesters since Spring 2002, we have employed electronic student response technology (ESRT) in Geosc 20: Planet Earth, a general education course for non-majors at Penn State University. Our assessment protocol for this new technology included a variety of methods: attendance data; quantitative student survey data; qualitative visitor feedback; and qualitative input from Geosc 20 students. By virtually all of these measures, the system is a success. For example, by our third semester of implementation, about 80% of students agreed that the technology helped them learn, and over 90% believed that ESRT should be used in future offerings of Geosc 20. Written responses from students consistently praise the system as a highlight of the course.

But are these adequate measures of the effectiveness of this novel pedagogical tool? Our 21-question student surveys unambiguously revealed that ESRT enhances student enjoyment of Geosc 20, but they cannot ascertain the true improvement in students’ abilities to answer questions beyond the students’ own impressions of their progress. An increase in student satisfaction alone may warrant the continuation of this system, but a quantitative assessment of the actual increase in student capacity for scientific reasoning is clearly desirable.

This contribution is intended to provoke a discussion of the best methods for assessing ESRT in geoscience courses. Approaches that merit consideration include: 1) Single-instructor comparisons of classes taught with and without ESRT. Although such single-variable comparisons often appear to be the best approach in theory, unavoidable confounding factors can limit their viability in practice. 2) Before- and after-course testing of students to gain specific insights into skill improvement. 3) Large-scale comparisons of student survey data among multiple instructors. For the two authors, the weighted student survey responses yielded correlation factors of 0.028 and 0.006 (where a factor of 0 indicates perfect 1:1 correspondence) for their first and second semesters of ESRT use, respectively. Thus, in our experience, ESRT elicited similar student responses despite differences in teaching style.
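
To make approach 3 concrete, the sketch below illustrates one way such a deviation-style correlation factor could be computed, with identical weighted responses yielding exactly 0. The abstract does not specify the authors' exact formula; this Python sketch assumes the factor is the mean squared difference of per-question weighted mean responses, normalized by the width of the response scale. All function names and response counts are illustrative, not the authors' actual method or data.

    # Hypothetical sketch: a deviation-style "correlation factor" in which
    # 0 indicates perfect 1:1 correspondence between two instructors'
    # weighted survey responses. The formula is an assumption, not the
    # authors' published method.

    def weighted_mean(counts, weights):
        """Weighted mean response for one survey question.
        counts[i]: number of students choosing option i (e.g., Likert 1-5).
        weights[i]: numeric weight assigned to option i.
        """
        return sum(c * w for c, w in zip(counts, weights)) / sum(counts)

    def correlation_factor(means_a, means_b, scale_width=4.0):
        """Mean squared deviation between paired per-question weighted
        means, scaled so that identical responses give exactly 0."""
        pairs = list(zip(means_a, means_b))
        return sum(((a - b) / scale_width) ** 2 for a, b in pairs) / len(pairs)

    if __name__ == "__main__":
        likert = [1, 2, 3, 4, 5]  # weights for a 5-point agreement scale
        # Illustrative per-question response counts for two instructors.
        counts_a = [[2, 5, 10, 40, 43], [5, 10, 25, 35, 25], [1, 4, 15, 45, 35]]
        counts_b = [[3, 6, 12, 38, 41], [6, 9, 27, 33, 25], [2, 5, 14, 44, 35]]
        means_a = [weighted_mean(c, likert) for c in counts_a]
        means_b = [weighted_mean(c, likert) for c in counts_b]
        print("correlation factor: %.4f" % correlation_factor(means_a, means_b))

Under a metric of this kind, factors near 0, such as the 0.028 and 0.006 reported above, would indicate that the two instructors' classes responded to the surveys nearly identically.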