Amazon Prime and the racist algorithms

May 11, 2016
The revelation in late April that Amazon has excluded minority neighborhoods in Boston, Atlanta, Chicago, Dallas, New York City, and Washington, D.C., from its Prime Free Same-Day Delivery service while extending the service to white neighborhoods probably shocked few minorities. They likely saw it as part of the same pattern of racism that has left many minority neighborhoods with crumbling infrastructure and housing, poor public transportation, little or no access to healthy food, and few places to buy goods.

But to Amazon, and likely others in the tech world, the decision had nothing to do with racism and everything to do with the facts. Amazon argued that it wasn’t acting on prejudice when it excluded those neighborhoods. Instead, algorithms and the underlying data on which those algorithms were based made it clear that Amazon couldn’t make a profit in them. And, given that Amazon is profit-driven, the company excluded them. Race, Amazon said, had nothing to do with it.

Who’s right? The answer comes down to this question: Can algorithms be racist? Yes, they can. I’ll explain.

First, let’s take a look at where Amazon’s Prime Free Same-Day Delivery service is available and where it’s not. An analysis by Bloomberg found that in six major cities, “the service area excludes predominantly black ZIP codes to varying degrees.” Bloomberg concluded that “In Atlanta, Chicago, Dallas, and Washington, cities still struggling to overcome generations of racial segregation and economic inequality, black citizens are about half as likely to live in neighborhoods with access to Amazon same-day delivery as white residents.” In New York City, it found, same-day delivery wasn’t available in the primarily minority Bronx, and “in some majority-black neighborhoods in Queens.”

In Boston, things were even worse. The entire city had access to the service, Bloomberg found, except for three ZIP codes in the primarily black neighborhood of Roxbury.

(Note: Since the report was released, Amazon decided to extend the service to the three Roxbury ZIP codes, as well as to the Bronx.)

Craig Berman, Amazon’s vice president for global communications, told Bloomberg that the original decisions had nothing to do with race and everything to do with algorithms. He wouldn’t reveal exactly what goes into the algorithms, other than to say that they include an area’s concentration of Prime members, its proximity to warehouses, and the availability of partners who deliver to the area. As for race being a factor, he said: “Demographics play no role in it. Zero.”

That sounds clear-cut: In Amazon’s mind, race has nothing to do with black neighborhoods being excluded, because no racial demographic data was used in its decision-making. But dig a little deeper, and you’ll see that race has everything to do with it. Jovan Scott Lewis, a professor at the University of California, Berkeley’s Haas Institute for a Fair and Inclusive Society, points out that the neighborhoods to which Amazon won’t provide same-day delivery are those that suffered from decades of redlining, a practice in which banks refused to give African-Americans mortgages, even when they were financially qualified, because they were black and lived in predominantly minority areas.

"The Amazon algorithm operates off of an inherited cartography of previous redlining efforts, which created pockets of discrimination, the consequence being that the discrimination continues to be reproduced,” he told USA Today.

Keith Hollingsworth, chair of the department of business administration at Morehouse College in Atlanta, told the newspaper that in instances such as this, intentional racism may not be at work, but “that doesn’t mean that systemic racism doesn’t affect the outcomes.”

As for minority neighborhoods having few companies that deliver to them and being far from warehouses, that should surprise no one. That, too, is the result of decades of racism — businesses and retailers often won’t operate in minority communities, so there’s no need for warehouses to be near them or delivery companies to service them.

So is the Amazon algorithm racist? Yes, it is, although not intentionally. It produced a racist outcome because the data on which it was based was the result of decades of widespread racism. And as Lewis notes, that means that Amazon helps discrimination live on.
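To make that mechanism concrete, here is a minimal, hypothetical sketch — not Amazon’s actual model, which the company hasn’t disclosed — of how a rule that never looks at race can still track it. The feature names, numbers, and threshold below are invented for illustration; the point is only that inputs like Prime-member density, warehouse distance, and courier coverage are themselves shaped by decades of disinvestment in redlined neighborhoods.

```python
# Hypothetical illustration only -- Amazon has not disclosed its model.
# An eligibility score built solely on "business" features can still
# mirror historical redlining if those features were shaped by it.

from dataclasses import dataclass

@dataclass
class ZipCodeStats:
    name: str
    prime_density: float      # Prime members per 1,000 households (made up)
    warehouse_miles: float    # distance to nearest fulfillment center
    courier_coverage: float   # share of the area served by delivery partners (0-1)

def same_day_eligible(z: ZipCodeStats) -> bool:
    """Toy rule with no demographic inputs at all."""
    score = (z.prime_density / 100) - (z.warehouse_miles / 20) + z.courier_coverage
    return score >= 1.0

# Two illustrative, invented ZIP codes. The historically redlined one has
# lower Prime density, farther warehouses, and thinner courier coverage --
# all downstream effects of decades of disinvestment.
suburb = ZipCodeStats("affluent suburb", prime_density=120, warehouse_miles=8, courier_coverage=0.9)
redlined = ZipCodeStats("historically redlined area", prime_density=35, warehouse_miles=18, courier_coverage=0.4)

for z in (suburb, redlined):
    print(z.name, "->", "eligible" if same_day_eligible(z) else "excluded")
# The rule never sees race, yet its output reproduces the old map.
```

Run it and the suburb qualifies while the historically redlined area doesn’t, even though race appears nowhere in the code — the disparity rides in on the features.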

The Amazon decision hits poor minority neighborhoods particularly hard. With fewer stores and worse public transportation than other neighborhoods, it’s much more difficult for people there to buy useful goods at reasonable prices. The irony is that it’s easier for people in wealthy neighborhoods to find low-cost, quality goods, because those neighborhoods are closer to more stores and have better public transportation. So the Amazon same-day delivery service could be of the most help to the very people to whom Amazon won’t provide it.

There’s a well-documented lack of minorities working at tech firms, and that plays into decisions like this. Minority employees would likely have noticed the racial pattern in where Amazon’s same-day service was excluded and pushed to do something about it.

There’s a larger issue as well. The tech industry likes to portray itself as data-driven and progressive, leading us to a better future through clear-eyed analysis of the facts, cutting through blind spots and prejudice. But as the delivery issue shows, the way the industry chooses its data and the algorithms it creates can mean supporting an unfair status quo. That’s exactly what happened here. Amazon should be commended for extending the service to Roxbury and the Bronx. But it should follow that up by extending it to other minority neighborhoods as well. It’s time for the company, and the tech industry as a whole, to do the right thing.

(www.computerworld.com)

Preston Gralla
