2003 Seattle Annual Meeting (November 2–5, 2003)

Paper No. 12
Presentation Time: 4:30 PM

THE DLESE COMMUNITY REVIEW SYSTEM: DIFFERING VIEWPOINTS OF EDUCATORS, LEARNERS AND SPECIALISTS


HOLZMAN, Neil and KASTENS, Kim, Marine Geology and Geophysics, Lamont-Doherty Earth Observatory, 61 Rt. 9W, Palisades, NY 10964, nholzman@ldeo.columbia.edu

The Community Review System is aimed at selecting the “best” resources from the DLESE Broad Collection for inclusion in the DLESE Reviewed Collection. The criteria for admission to the Reviewed Collection are scientific accuracy, pedagogical effectiveness, ease of use for teacher and learner, quality of documentation, importance or significance of content, ability to motivate or inspire learners, and robustness as a digital resource. The Community Review System seeks to identify such resources by combining two types of reviews: (a) reviews delivered via a Web-based recommendation engine from educators in the DLESE community who have taught with the resource or learners who have learned from the resource, and (b) specialist reviews mediated by an Editorial Review Board. The Community Review System is at: http://crs.dlese.org

At this GSA Special session, we will compare and contrast the reviews received from:
• Educators who have used DLESE resources in their classroom or other learning context
• Learners who have used DLESE resources themselves
• Science Content Specialists who have reviewed DLESE resources
• Pedagogy Specialists who have reviewed DLESE resources

Some of the questions we intend to consider are:
• Are the quantitative scores assigned by these reviewer groups similar?
• Do the different reviewer groups spot different flaws or emphasize different issues in their comments?
• Does a community-based, Web-mediated review procedure add value above and beyond that obtained from the specialist reviews?