BRINGING TOGETHER DESIGN AND EVALUATION TO UNDERSTAND STUDENT LEARNING
Geoscience departments offer a wide variety of majors and inter-departmental programs, reflecting a range of program goals: preparing geologists, geographers, geochemists, biogeoscientists, and geophysicists for careers in industry, government research, and academia, as well as developing policy makers, environmental consultants, teachers, and sustainability experts. Articulating a program's goals is the first step toward understanding whether students are achieving the desired learning. Performing a SWOT analysis or imagining the ideal student on the day of graduation can initiate a discussion of program goals.
The alignment of program goals with course offerings lies at the heart of designing strong educational programming. Two techniques support this alignment: developing a matrix with learning opportunities on one axis and learning goals on the other, and drawing a curriculum map that shows students' pathways through the curriculum to the awarding of a degree.
Strong alignment of goals and programming does not by itself ensure that students are mastering the desired knowledge and skills; that requires investigating how students respond to the programming. Academic institutions are increasingly being asked to assess student learning within their programs. Assessment data can answer questions such as "Is our well-designed program working as intended?" or "Are the students learning what I think I am teaching?" Departments have taken a wide variety of approaches to answering these questions, including exit interviews and examinations, portfolios, and rubrics for evaluating senior-level work, field work, and research presentations. Logic models can serve as a tool for designing program evaluation. This poster highlights resources available on the Building Strong Departments website that support the design and evaluation of undergraduate geoscience programs (http://serc.carleton.edu/depts).