Please use this identifier to cite or link to this item: http://arks.princeton.edu/ark:/88435/dsp01p8418q94k
Full metadata record
DC Field | Value | Language
dc.contributor.advisor | Matias, Jorge N | -
dc.contributor.author | Zong, Jonathan | -
dc.date.accessioned | 2018-08-14T18:34:24Z | -
dc.date.available | 2018-08-14T18:34:24Z | -
dc.date.created | 2018-05-07 | -
dc.date.issued | 2018-08-14 | -
dc.identifier.uri | http://arks.princeton.edu/ark:/88435/dsp01p8418q94k | -
dc.description.abstract | Debriefing is an essential research ethics procedure in non-consented research, wherein participants are informed about their participation in research and provided with controls over their data privacy. This paper presents a novel system for conducting and studying debriefing in large-scale behavioral experiments on online platforms. I designed a debriefing system and an accompanying evaluation study, both delivered as a web application. I recruited 1182 Twitter users who had been affected by DMCA takedown notices into an empirical study on debriefing. The key contributions of this paper are 1) the design and implementation of the debriefing system, 2) empirical findings from the debriefing study on its unexpectedly low response rate, and 3) an evidence-based analysis of the challenges researchers face in recruiting participants for research ethics and data privacy research. | en_US
dc.format.mimetype | application/pdf | -
dc.language.iso | en | en_US
dc.title | Empirically Studying Research Ethics with Interface Designs for Debriefing Online Field Experiments | en_US
dc.type | Princeton University Senior Theses | -
pu.date.classyear | 2018 | en_US
pu.department | Computer Science | en_US
pu.pdf.coverpage | SeniorThesisCoverPage | -
pu.contributor.authorid | 961039597 | -
Appears in Collections: Computer Science, 1988-2020

Files in This Item:
File | Description | Size | Format
ZONG-JONATHAN-THESIS.pdf | | 2.03 MB | Adobe PDF
Items in Dataspace are protected by copyright, with all rights reserved, unless otherwise indicated.