EVALUATING THE INSTRUCTIONAL DESIGN OF A LARGE-ENROLLMENT SCIENTIFIC COMPUTING WORKSHOP DESIGNED TO BROADEN ACCESS TO GEOPHYSICS
The course's >1000 questions were coded by scientific skill and Bloom's taxonomy level, then compared against each question's facility and discrimination index to assess student learning. Using correlation analysis and machine-learning decision tree regression, this approach identified that the degree of higher-order thinking and the range of skills exercised were lower than expected, likely limiting the depth of learning and the balanced skill development that learners could achieve. Changes were made to increase the number of analysis-level questions and to ensure that key skills were exercised, better aligning the course with its learning objectives. Question categorization was also used to study how fading supportive pedagogy elements affects students' subject mastery, with the goal of improving their independence: we progressively reduced prompting of prior information, raised the Bloom's taxonomy level of questions, and increased skill requirements. The effectiveness of these 2023 improvements will be assessed.
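As a minimal sketch of the kind of analysis described above, the snippet below computes the facility index (fraction of students answering a question correctly) and the discrimination index (difference in facility between the top and bottom 27% of students by total score, a standard psychometric convention), then relates facility to question codes via Pearson correlation and a decision tree regression. The response matrix and the question codes (bloom_level, uses_plotting_skill) are hypothetical placeholders, not the authors' data or released code.

```python
# Minimal, assumed sketch of the question-level analysis; not the authors' code.
import numpy as np
from scipy import stats
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)

# Hypothetical response matrix: rows = students, columns = questions,
# entries = 1 (correct) / 0 (incorrect). Real data would come from the course.
responses = rng.integers(0, 2, size=(200, 50))

# Facility index: fraction of students who answered each question correctly.
facility = responses.mean(axis=0)

# Discrimination index: difference in facility between the top and bottom
# 27% of students ranked by total score.
totals = responses.sum(axis=1)
n_group = int(round(0.27 * responses.shape[0]))
order = np.argsort(totals)
lower, upper = responses[order[:n_group]], responses[order[-n_group:]]
discrimination = upper.mean(axis=0) - lower.mean(axis=0)

# Hypothetical question codes: Bloom's level (1-6) and a binary skill flag.
bloom_level = rng.integers(1, 7, size=responses.shape[1])
uses_plotting_skill = rng.integers(0, 2, size=responses.shape[1])
X = np.column_stack([bloom_level, uses_plotting_skill])

# Correlation analysis: does Bloom's level track question facility?
r, p = stats.pearsonr(bloom_level, facility)
print(f"Bloom level vs. facility: r = {r:.2f}, p = {p:.3f}")

# Decision tree regression: which question codes best explain facility?
tree = DecisionTreeRegressor(max_depth=3, random_state=0).fit(X, facility)
print("Feature importances (Bloom level, skill flag):", tree.feature_importances_)
```

With real coded questions in place of the random placeholders, the tree's feature importances and splits indicate which categories of questions drive differences in facility, which is the sort of mismatch signal described above.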
The granular evaluation framework we developed successfully detected mismatches between instructional design and learning outcomes, and these findings guided our revisions. Our work indicates that an open online format can both broaden access to specialized scientific training and facilitate student learning and skill development.