Auditing Algorithms: Research Methods for Detecting Discrimination on Internet Platforms

Type: Lecture
Audience: Open to the Public
Building: Oktober 6 u. 7
Room: 226
Date: Monday, November 17, 2014 - 5:00pm to 6:30pm

In this presentation I discuss a project undertaken with collaborators to study the problem of "rigged" algorithms and normatively suspect algorithmic discrimination. We begin from the social scientific audit study, a research design widely considered the most rigorous way to test for discrimination in housing and employment. After outlining some of the challenges of audit studies as they are traditionally conducted, we propose "algorithm audits": a research strategy that adapts the social scientific audit methodology to the study of algorithms.

Although other algorithm audit designs are certainly possible, we outline five idealized designs that empirical research projects investigating algorithms could take, discussing the major advantages and drawbacks of each: the (1) code audit, (2) noninvasive user audit, (3) scraping audit, (4) sock puppet audit, and (5) collaborative or crowdsourced audit. Across these designs we also draw a number of significant distinctions and raise concerns they share. While we hope these designs are useful as a guide and agenda for researchers interested in algorithmic discrimination, it is also important to pause and reflect more broadly on the larger context that gives rise to these concerns and these designs.
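To make the paired-testing logic behind these designs concrete, here is a minimal sketch of a "sock puppet"-style audit. Everything in it is illustrative: `quote_price` is a hypothetical stand-in for the platform under audit (in a real study it would be replaced by scraped pages or API responses), and the simulated surcharge for group "B" exists only so the harness has something to detect.

```python
import random

def quote_price(persona, base):
    """Hypothetical platform stub (not a real API): quotes a price
    with a simulated surcharge for personas in group 'B'."""
    return base + (5.0 if persona["group"] == "B" else 0.0)

def paired_audit(n_pairs, seed=0):
    """Send matched pairs of sock-puppet personas that differ only in
    the audited attribute, and return the mean price gap between them."""
    rng = random.Random(seed)
    diffs = []
    for _ in range(n_pairs):
        # Pair-level market noise, held fixed within the pair so the
        # personas are identical except for the audited attribute.
        base = 100.0 + rng.gauss(0.0, 1.0)
        shared = {"city": "X", "device": "desktop"}
        a = dict(shared, group="A")
        b = dict(shared, group="B")
        diffs.append(quote_price(b, base) - quote_price(a, base))
    return sum(diffs) / len(diffs)

print(paired_audit(200))  # a persistent nonzero gap suggests differential treatment
```

The key design choice, shared with traditional housing and employment audits, is that each pair is matched on everything except the attribute being tested, so a systematic difference in outcomes is attributable to that attribute rather than to noise.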

Dr. Christian Sandvig is an Associate Professor at the University of Michigan and a Faculty Associate of the Berkman Center for Internet & Society at Harvard University. His main areas of interest are new technological infrastructure and public policy, advances in wireless technology and the use of the electromagnetic spectrum, and appropriation and user-driven innovation.