Consumer advocates are pushing regulators to investigate what they paint as a shadowy online practice in which retailers use consumer information collected by data brokers to decide how much to charge individual customers and what quality of service to offer them.
#REPRESENT, a public interest group run by the Consumer Education Foundation in California, filed a complaint with the Federal Trade Commission (FTC) on Monday asking the agency to investigate what the group calls “surveillance scoring” of customers based on their financial status or creditworthiness.
According to the complaint, companies are using these data points to decide what prices customers pay, the quality of customer service they receive and whether they can return items. Some companies are even using the data collected about users to decide whether they will be approved for housing or offered a job.
“This is a way for companies to discriminate against users based on income and wealth,” said Laura Antonini, the policy director of the Consumer Education Foundation. “It can range from monetary harm or basic necessities of life that you're not getting.”
Antonini and the consumer advocate Harvey Rosenfield, who leads the foundation, wrote a report documenting the data practices. The two lawyers argue that the practices they outline are illegal and that consumers are largely unaware they are being covertly evaluated in ways that can shape how much they pay online.
The complaint comes as lawmakers are increasingly scrutinizing major technology companies over their handling of user data. Facebook and Google have received the brunt of Washington’s attention because of their massive size and ability to microtarget advertisements based on their users’ behavior.
But #REPRESENT is hoping to shine a light on a corner of the unregulated data collection industry that has received relatively little attention and has the potential to enable companies to discriminate against consumers on a massive scale.
“The ability of corporations to target, manipulate and discriminate against Americans is unprecedented and inconsistent with the principles of competition and free markets,” the complaint reads. “Surveillance scoring promotes inequality by empowering companies to decide which consumers they want to do business with and on what terms, weeding out the people who they deem less valuable. Such discrimination is as much a threat to democracy as it is to a free market.”
The complaint highlights four areas in which companies are using surveillance scoring: pricing, customer service, fraud prevention, and housing and employment.
The filing points to a 2014 Northeastern University study exploring the ways that companies like Home Depot and Walmart use consumer data to customize prices for different customers. Rosenfield and Antonini replicated the study with an online tool that compares the prices they were charged on their own computers, with their own data profiles, against the prices shown to a user browsing the same sites through an anonymized server with no data history.
They found that Walmart and Home Depot were offering lower prices on a number of products to the anonymous computer. In the search results for “white paint” on Home Depot’s website, Rosenfield and Antonini saw higher prices for six of the first 24 items that appeared.
In one example, a five-gallon tub of Glidden premium exterior paint would have cost them $119 compared with $101 for the anonymous computer.
A similar pattern emerged on Walmart’s website. The two lawyers found the site was charging them more on a variety of items compared with the anonymous web tool, including paper towels, highlighters, pens and paint.
One paper towel holder cost $10 less for the anonymous user.
Walmart did not respond to a request for comment. Christina Cornell, a spokeswoman for Home Depot, said the discrepancy in prices could be attributed to whether a user has been tagged to a local store.
“The Home Depot does not use consumer scores to determine pricing,” Cornell said in a statement. “Sometimes online national pricing can differ from in-store pricing, which is influenced by factors like vendor location and shipping costs.”
The complaint also details the industry that has sprouted up to offer retailers evaluations of their customers’ “trustworthiness” to determine whether they pose a risk of fraudulent returns.
One firm, Sift, offers such evaluations to major companies like Starbucks and Airbnb. Sift boasts on its website that it can tailor “user experiences based on 16,000+ real-time signals — putting good customers in the express lane and stopping bad customers from reaching the checkout.”
The company was not immediately able to offer comment, but a Sift spokeswoman told The Wall Street Journal in April that the firm rates customers on a scale of 0 to 100, likening the score to a credit score for trustworthiness.
But unlike credit scores, these ratings are not transparent to consumers, and Rosenfield and Antonini argue that companies are using them to engage in illegal discrimination while users have little recourse to correct false information about themselves or challenge their ratings.
The FTC held a workshop on what it called predictive scoring in 2014 but has done little to crack down on the practice in the years since. Antonini said the complaint is meant to push the agency to reexamine the industry and investigate whether it violates laws against unfair and deceptive business practices.
“It's far, far worse than when they looked at it in 2014,” she said. “There's an exponentially larger amount of data that's being collected about the American public that's in the hands of data brokers and companies. Their ability to process that data and write algorithms has also improved exponentially.”
Rosenfield said he believes that if regulators were to shine a light on the secretive world of data brokers, it could inspire the type of backlash that has prompted lawmakers around the world to go after Silicon Valley.
“This technological discrimination is in stealth mode at the moment,” he said. But if it comes to light, “I think there will be a public uproar.”