White House: Big Data Causes Big Disparate Impact
05/04/2014

The brighter folks within the Obama Administration are starting to figure out what I've been saying for some time: that a lot of the hype over Big Data and apps and the like is really about businesses trying to get around traditional regulations, including regulations against discrimination. For example, one of the costs that government imposes on licensed taxi drivers is that they are supposed to drive customers wherever the customers want to go. But with the new ride-sharing apps, drivers can just look up possible gigs offered on their smartphones and say, "Florence and Normandie? Let me Google that ... uh-oh. Re-ject-ed! Ventura and Laurel Canyon? Accepted!"

Back in 1982, my Advanced Marketing Models professor in B-School got to talking about the predictive systems used by lenders, insurance companies, and the like. Somebody asked if they really work to identify bad risks. Oh, sure, they really worked, he replied. The problem is that the use of truly powerful predictive factors, like race, has been outlawed, and the government is leery of the use of approximate factors like zip code. So they don't work as well as they did a few years ago. This is a quiet way for the white majority to subsidize the black and brown minority in terms of mortgage defaults, insurance rates, etc.
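To make the proxy point concrete, here's a minimal sketch with synthetic data. Everything in it (the 30% group share, the 25% vs. 8% default rates, the noisiness of the stand-in) is a made-up illustrative assumption, not a number from any real lender: when a banned factor actually drives the outcome, a correlated stand-in like zip code recovers much of its predictive power without the factor ever being named.

```python
# Minimal sketch, synthetic data: a correlated proxy (think zip-code
# segment) recovers much of the predictive power of a banned factor.
# All numbers below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# The factor the model is forbidden to use directly.
banned = rng.binomial(1, 0.3, n)

# A noisy stand-in: agrees with the banned factor most of the time.
proxy = (banned + rng.normal(0.0, 0.5, n) > 0.5).astype(int)

# The outcome being predicted (say, loan default), driven by the
# banned factor: 25% base rate in one group, 8% in the other.
default = rng.binomial(1, np.where(banned == 1, 0.25, 0.08))

# Conditioning on the proxy reproduces much of the risk split you'd
# get from conditioning on the factor itself.
for name, flag in (("banned factor", banned), ("proxy", proxy)):
    print(f"{name}: P(default | 1) = {default[flag == 1].mean():.3f}, "
          f"P(default | 0) = {default[flag == 0].mean():.3f}")
```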
But Big Data to the rescue of Big Business! You don't actually need to know that somebody is black if you know she drinks grape soda, smokes Kools, loves Tyler Perry, etc. All that stuff correlates with being a worse risk (and with being black). John Podesta has just released a White House report on this Menace. From the NYT:
Call for Limits on Web Data of Customers
By DAVID E. SANGER and STEVE LOHR, MAY 1, 2014

But the most significant findings in the report focus on the recognition that data can be used in subtle ways to create forms of discrimination — and to make judgments, sometimes in error, about who is likely to show up at work, pay their mortgage on time or require expensive treatment. The report states that the same technology that is often so useful in predicting places that would be struck by floods or diagnosing hard-to-find illnesses in infants also has “the potential to eclipse longstanding civil rights protections in how personal information is used in housing, credit, employment, health, education and the marketplace.”

The report focuses particularly on “learning algorithms” that are frequently used to determine what kind of online ad to display on someone’s computer screen, or to predict their buying habits when searching for a car or in making travel plans. Those same algorithms can create a digital picture of a person, Mr. Podesta noted, that can infer race, gender or sexual orientation, even if that is not the intent of the software.

“The final computer-generated product or decision — used for everything from predicting behavior to denying opportunity — can mask prejudices while maintaining a patina of scientific objectivity,” the report concludes.

Mr. Podesta said the concern — he suggested the federal government might have to update laws — was that those software judgments could affect access to bank loans or job offers. They “may seem like neutral factors,” he said, “but they aren’t so neutral” when put together. The potential problem, he added, is that “you are exacerbating inequality rather than opening up opportunity.”
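The "digital picture" inference Mr. Podesta is worried about is mechanically mundane. Here's a hedged sketch, again on synthetic data with hypothetical features: hand an off-the-shelf classifier a handful of weak behavioral proxies (brand and media preferences, say), none of them decisive alone, and it reconstructs the protected attribute it was never explicitly given.

```python
# Hedged sketch, synthetic data: ten weak behavioral proxies let a
# plain classifier recover a protected attribute that was never an
# input column. The 60%-vs-30% feature rates are made-up numbers.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n, k = 50_000, 10

# The attribute the modeler never sees as an explicit column.
group = rng.binomial(1, 0.3, n)

# Each proxy is only mildly more common in one group: 60% vs 30%.
probs = np.where(group[:, None] == 1, 0.60, 0.30)
X = rng.binomial(1, np.broadcast_to(probs, (n, k)))

# Train on the proxies alone; scored in-sample, which is fine for a
# sketch. Accuracy lands well above the 70% you'd get by always
# guessing the majority group.
clf = LogisticRegression().fit(X, group)
print("accuracy recovering the hidden attribute:",
      round(clf.score(X, group), 3))
```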

Stop noticing!