GSA Connects 2023 Meeting in Pittsburgh, Pennsylvania

Paper No. 102-5
Presentation Time: 8:00 AM-5:30 PM

EVALUATING THE INSTRUCTIONAL DESIGN OF A LARGE-ENROLLMENT SCIENTIFIC COMPUTING WORKSHOP DESIGNED TO BROADEN ACCESS TO GEOPHYSICS


BRUDZINSKI, Michael1, GOLDHAGEN, Gillian1 and HUBENTHAL, Michael2, (1)Department of Geology and Environmental Earth Science, Miami University, 118 Shideler Hall, 250 S. Patterson Ave., Oxford, OH 45056, (2)EarthScope Consortium, 1200 New York Ave NW Ste 400, Washington, DC 20005-3929

The shift towards online instruction creates an opportunity for a more in-depth assessment of student learning and evaluation of instructional design. The Seismology Skill Building Workshop was implemented in response to the pandemic in Summer 2020 and has been re-offered with minor changes each summer since. The workshop is offered outside of traditional coursework and aims to help advanced undergraduates build scientific computing skills through seismology-specific programming in a Massive Open Online Course (MOOC) format. The >700 enrollees per year and dozens of active-learning assignments delivered via a course management system enable a unique analysis of student performance. The workshop is designed to accommodate students without a computational or geophysics background as a pathway into graduate geophysics. The open online format has the potential to address diversity and equity issues in the geosciences, as students have access to training opportunities regardless of faculty expertise on their home campus.

The course’s >1000 questions were coded to categorize the scientific skills required and the Bloom's taxonomy level targeted, and the codings were then compared with facility and discrimination indices to assess student learning. Using correlation analysis and machine learning decision tree regression, this approach identified that the degree of higher-order thinking and the range of skills exercised were lower than expected, which likely limited the depth of learning and the balanced skill development learners could achieve. Changes were made to increase the number of analysis-level questions and to ensure use of key skills, to better accomplish the learning objectives. Question categorization was also used to study the effect of fading supportive pedagogy elements on students' subject mastery, with the goal of improving their independence. We accomplished this fading by progressively reducing prompting of prior information, scaling up Bloom's taxonomy levels, and increasing skill requirements. The effectiveness of the 2023 improvements will be assessed.
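
To make the analysis concrete, the sketch below is a minimal Python illustration of this item-analysis pipeline, using synthetic stand-in data; the data layout, skill labels, and the top/bottom 27% grouping for the discrimination index are our illustrative assumptions, not details reported for the workshop. Facility is computed as the proportion of students answering a question correctly, the discrimination index as the difference in facility between high- and low-scoring students, and a decision tree regression relates the question codings to item facility.

```python
import numpy as np
import pandas as pd
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)

# Synthetic stand-in for the real data: rows = students, columns = questions,
# 1 = correct, 0 = incorrect (the actual data came from the course management system).
responses = pd.DataFrame(rng.integers(0, 2, size=(700, 1000)))

# Facility index: proportion of students answering each question correctly.
facility = responses.mean(axis=0)

# Discrimination index: facility in the top 27% of students (by total score)
# minus facility in the bottom 27%, a common item-analysis convention.
totals = responses.sum(axis=1)
n = int(0.27 * len(responses))
discrimination = (responses.loc[totals.nlargest(n).index].mean(axis=0)
                  - responses.loc[totals.nsmallest(n).index].mean(axis=0))

# Hypothetical question codings standing in for the manual categorization step:
# a Bloom's taxonomy level (1-6) and a one-hot scientific-skill matrix.
bloom = pd.Series(rng.integers(1, 7, size=responses.shape[1]), name="bloom")
skills = pd.get_dummies(pd.Series(
    rng.choice(["plotting", "arrays", "signal_processing"], size=responses.shape[1])))

# Correlation analysis: do higher Bloom's levels coincide with lower facility?
print("Spearman r(bloom, facility):", bloom.corr(facility, method="spearman"))

# Decision tree regression: which codings best predict item facility?
X = pd.concat([bloom, skills], axis=1)
tree = DecisionTreeRegressor(max_depth=3, random_state=0).fit(X, facility)
print(dict(zip(X.columns, tree.feature_importances_)))
```

In the actual study, the Bloom's levels and skill categories came from manual coding of the >1000 questions rather than the random placeholders used here.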

The granular evaluation framework we developed has successfully detected mismatches between instructional design and learning outcomes, and these mismatches guided our revisions. Our work indicates that an open online format can both broaden access to specialized scientific training and facilitate student learning and skill development.