Please use this identifier to cite or link to this item: http://arks.princeton.edu/ark:/88435/dsp01vm40xr68f
Full metadata record
dc.contributor: Cooper, Joel
dc.contributor.advisor: Oppenheimer, Daniel
dc.contributor.author: Tsai, Paige
dc.date.accessioned: 2013-07-19T15:29:25Z
dc.date.available: 2013-07-19T15:29:25Z
dc.date.created: 2013-04-08
dc.date.issued: 2013-07-19
dc.identifier.uri: http://arks.princeton.edu/ark:/88435/dsp01vm40xr68f
dc.description.abstract: The present research investigates the reliability of peer assessment methods currently employed by massive open online course (MOOC) platforms (e.g. Coursera). Previous research on crowdsourcing (Snow et al., 2012) has examined the grading of tasks for which there are clear or objective answers (e.g. word sense disambiguation tasks). This study seeks to empirically verify online educators' claim that findings from the crowdsourcing literature generalize to the grading of essays in an academic context. A sample of essays (N = 337) submitted for the Coursera class History of the World since 1300 was analyzed. Part 1 sought to replicate findings suggesting that the aggregate of multiple student grades approximates the judgment of a single expert grader. Part 2 sought to discover the primary drivers of peer and expert assessments, with the ultimate goal of determining whether the relationship observed in Part 1 was spurious. Our analyses revealed that while peer grades tend to recapitulate expert grades for this sample of essays, the correlation between peer and expert grades is spurious. [en_US]
dc.format.extent: 68 pages [en_US]
dc.language.iso: en_US [en_US]
dc.title: Peering into Peer Assessment: An Investigation of the Reliability of Peer Assessment in MOOCs [en_US]
dc.type: Princeton University Senior Theses
pu.date.classyear: 2013 [en_US]
pu.department: Psychology [en_US]
pu.pdf.coverpage: SeniorThesisCoverPage
dc.rights.accessRights: Walk-in Access. This thesis can only be viewed on computer terminals at the Mudd Manuscript Library (http://mudd.princeton.edu).
pu.mudd.walkin: yes
Appears in Collections: Psychology, 1930-2020

Files in This Item:
File: Tsai_paige_thesis.pdf (1.07 MB, Adobe PDF)


Items in Dataspace are protected by copyright, with all rights reserved, unless otherwise indicated.