Paper No. 4
Presentation Time: 2:15 PM

WHAT DO YOU SEE? USING VIDEO FOR PRE/POST ASSESSMENT OF INTERACTIVE ENGAGEMENT


GILLEY, Brett Hollis, Earth and Ocean Sciences, University of British Columbia, Room 2020, Earth Sciences Building, 2207 Main Mall, Vancouver, BC V6T 1Z4, Canada, bgilley@eos.ubc.ca

Research on expertise suggests that videos are a useful way to distinguish between experts and novices. This study uses student responses to video to assess an interactive engagement lesson on landslides in a large-enrolment course on natural disasters. Authentic assessment is difficult because the logistics of taking 120 students into the field before and after a lesson are prohibitive, and timing such trips to coincide with events of significance is all but impossible. Videos of landslides, many publicly available, may not be the same as observing events in the field, but they allow students to watch more than once in order to notice important features. They also extend assessment possibilities beyond student responses to static diagrams or images.

We used a pre/post test with one open-ended question and two similar landslide videos. One video was shown at the start of class, and the second was shown at the end of class after the lessons on landslides were completed. Because of the dramatic nature of the videos, each one was played twice before the students answered the question "What important things do you notice in this video?" We hypothesized that the interactive engagement within the lessons would affect what our students noticed in the videos. Students reported a wide variety of features in the pretest, many of which would be considered only obliquely relevant (or irrelevant) by experts. Few students (only 11%) commented on how the slide moved. In contrast, in the posttest 58% of students noticed and correctly identified the type of slide motion, and as a whole the class identified a much narrower and more expert-like range of features. Student performance improved greatly between the pretest and posttest, and video-based assessments allowed us to detect this difference.
