Wednesday, February 24, 2016

Algorithm Bias

From Slate:


A Tale of Four Algorithms

Each of these government algorithms is supposed to stop fraud and waste. Which works better—the one aimed at the poor or the rich?

Algorithms don’t just power search results and news feeds, shaping our experience of Google, Facebook, Amazon, Spotify, and Tinder. Algorithms are widely—and largely invisibly—integrated into American political life, policymaking, and program administration.
Algorithms can terminate your Medicaid benefits, exclude you from air travel, purge you from voter rolls, or predict if you are likely to commit a crime in the future. They make decisions about who has access to public services, who undergoes extra scrutiny, and where we target scarce resources.
But are all algorithms created equal? Does the kind of algorithm used by government agencies have anything to do with who it is aimed at?
Bias can enter algorithmic processes through many doors. Discriminatory data collection can mean extra scrutiny for whole communities, creating a feedback cycle of “garbage in, garbage out.” For example, much of the initial data that populated CalGang, an intelligence database used to target and track suspected gang members, was collected by the notorious Community Resources Against Street Hoodlums units of the LAPD, including in the scandal-ridden Rampart division. Algorithms can also mirror and reinforce entrenched cultural assumptions. For example, as Wendy Hui Kyong Chun has written, Googling “Asian + woman” a decade ago turned up more porn sites in the first 10 hits than a search for “pornography.”
http://www.slate.com/articles/technology/future_tense/2016/02/a_close_look_at_four_government_algorithms_designed_to_stop_waste_and_fraud.html?sid=554654ea10defb39638b510d&wpsrc=newsletter_futuretense
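The "garbage in, garbage out" feedback cycle the article describes is easy to see in a toy simulation. Here is a minimal sketch (all names and numbers are hypothetical, not drawn from the Slate piece): two neighborhoods share the same true incident rate, but the one that starts under heavier patrol accumulates a larger record, and that record then justifies still heavier patrol.

```python
import random

random.seed(0)

# Toy model: both neighborhoods have the SAME underlying incident rate.
TRUE_RATE = 0.1
patrols = {"A": 10, "B": 1}   # A starts with more scrutiny (11 patrols total)
recorded = {"A": 0, "B": 0}   # cumulative recorded incidents

for year in range(20):
    for hood in recorded:
        # Recorded incidents scale with how often you look,
        # not with any real difference between the neighborhoods.
        for _ in range(patrols[hood] * 100):
            if random.random() < TRUE_RATE:
                recorded[hood] += 1
    # Next year's patrols are allocated in proportion to the recorded data,
    # closing the feedback loop.
    total = recorded["A"] + recorded["B"]
    patrols["A"] = max(1, round(11 * recorded["A"] / total))
    patrols["B"] = max(1, 11 - patrols["A"])

print(recorded)  # A's record dwarfs B's despite identical true rates
print(patrols)   # and the patrol allocation stays skewed toward A
```

After 20 iterations the data "confirm" that neighborhood A is the problem, even though nothing distinguishes it except where the measurement started. That is the mechanism behind the CalGang example above: a biased collection process launders itself into seemingly objective numbers.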
