Crowdworkers Are Not Judges: Rethinking Crowdsourced Vignette Studies as a Risk Assessment Evaluation Technique


Workshop paper


Emma Lurie, Deirdre K. Mulligan


Abstract
Algorithmic risk assessments are widely deployed as judicial decision-support tools in the U.S. criminal justice system. A review of recent research on algorithmic risk assessments reveals a potentially troubling trend: the use of crowdworkers as stand-ins for judges when analyzing the impact of these tools. We raise three concerns about this approach to understanding algorithms in practice, and call for a reevaluation of whether research should rely on experimental crowdworker studies as a means to assess the impact of algorithmic risk assessments in the criminal justice system.
