
Apparent Algorithmic Discrimination and Real-Time Algorithmic Learning

Subject

Marketing

Publishing details

Social Sciences Research Network

Authors / Editors

Lambrecht A;Tucker C E

Biographies

Publication Year

2020

Abstract

It is worrying to think that algorithms might discriminate against minority groups and reinforce existing inequality. Typically, the worry is that the algorithm's code could reflect bias, or that the data feeding the algorithm might lead it to produce uneven outcomes. In this paper, we highlight another reason why algorithms might appear biased against minority groups: the length of time algorithms need to learn. If an algorithm has access to less data for particular groups, or accesses this data at different speeds, it will produce differential outcomes, potentially disadvantaging minority groups. We revisit the context of a classic study documenting that searches on Google for black names were more likely than searches for white names to return ads highlighting the need for a criminal background check. We show that at least a partial explanation for this finding is that when consumer demand for a piece of information is low, an algorithm accumulates information more slowly and therefore takes longer to learn about consumer responses to the ad. Since black names are less common, the algorithm learns about the quality of the underlying ad more slowly, and as a result an ad, including an undesirable ad, is more likely to persist for searches next to black names even if the algorithm judges the ad to be of low quality. We replicate this result in the context of religious affiliations and present evidence that ads targeted at searches for religious groups persist longer for groups that are searched for less often. This suggests that the process of algorithmic learning can lead to differential outcomes between those whose characteristics are common in society and those whose characteristics are rarer.
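The mechanism the abstract describes can be illustrated with a minimal sketch. It assumes, purely for illustration, that a platform demotes a low-quality ad only after it has accrued a fixed number of impressions; the function name, search volumes, and impression threshold below are all hypothetical and are not taken from the paper.

```python
def days_until_judged(searches_per_day, impressions_needed=500):
    """Days until an ad accrues enough impressions for a learning
    algorithm to estimate its quality (hypothetical fixed threshold)."""
    days = 0
    impressions = 0
    while impressions < impressions_needed:
        impressions += searches_per_day  # data arrives at the search rate
        days += 1
    return days

# Hypothetical volumes: a commonly searched name vs. a rarer one.
common = days_until_judged(searches_per_day=100)  # 5 days
rare = days_until_judged(searches_per_day=10)     # 50 days
assert rare > common  # the low-quality ad persists 10x longer for the rarer group
```

Because data accumulates at the rate of consumer demand, the same learning rule takes far longer in calendar time to filter an undesirable ad shown against rarer search terms, producing apparently discriminatory outcomes without any bias in the code or training data.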

Keywords

Algorithmic Bias; Advertising; Inequality; online advertising; algorithmic learning; digital discrimination

Series Number

3570076

Series

Social Sciences Research Network

Available on ECCH

No

