A few weeks ago, flying home from London, I fell into a rabbit hole I have yet to emerge from. I knew how much I had paid for my seat and how many miles I had spent to upgrade. But I had no idea whether the woman across the aisle had spent just a few points, like me, or paid the $10,000 the airline could charge for the same trip. Booking a flight has long been a game whose rules only the airline knows, with endless booking codes, loyalty programs, and fare changes that turn your own data into a weapon against your wallet. But after I landed, I kept seeing the same rigged game everywhere: in every Uber ride, every Amazon order, every trip to the grocery store. All these companies now know so much about me that they can see a number flashing above my head: the exact price I'd be willing to pay at any given moment. Your own number is flashing above your head right now.
In the age of algorithms, price variability is increasingly creeping into digital commerce, with fees rising and falling in real time.
Far more concerning is the rise of personalized pricing, a practice in which online retailers use your own data to charge the exact price you're willing to pay, which may be different from what your neighbor would pay. Personalized pricing not only entrenches bias and can drive inflation, it also creates a world where you never know when your apps are ripping you off.
Now, when I’m about to pay for anything on my phone or laptop, I wonder if I would pay less if I used someone else’s account.
I still remember the mild shock I felt a decade ago when I learned that price discrimination is often perfectly legal in the United States. In law school, my antitrust professor introduced us to the Robinson-Patman Act, an obscure Depression-era price-discrimination law, quickly pointing out that the law didn't live up to its billing. Under this long-standing statute, companies can face ruinous penalties for price discrimination, but only when they discriminate against other businesses. If a wholesaler overcharged a store, the store could sue, but there was nothing then (or now) to stop the store from doing the same thing to its customers. In other words, store owners enjoy better price protections than their shoppers do. If a store routinely charges some customers more than others because of their gender, race, or another legally protected characteristic, that's almost certainly illegal. But when businesses squeeze each individual customer for the most that they, personally, are willing to pay, they are free to commit highway robbery.
I call it a mild shock because at the time, personalized price discrimination was far less widespread, and far less harmful, than it is today. Sure, coupon culture let companies sell the same product in the same store at the same time at different prices, but it gave customers some choice in the matter. Price-conscious shoppers took the time to clip coupons and hunt for deals, while less thrifty shoppers paid full price. Coupons, loyalty cards, seasonal discounts: many traditional price-discrimination practices let shoppers choose for themselves which price tier they end up in.
But algorithmic price discrimination takes away that choice. And the data mining methods used to sort people into price groups are more invasive than you might think. Take your last Uber ride. When you ordered that car, you probably knew that the distance you were going to travel and the time of day were factors in your price, because we’ve become reluctantly accustomed to the cold, extractive efficiency of dynamic pricing. But did you remember to plug in your phone before you ordered the ride? If you did, you might have saved a few bucks, because your battery level is reportedly one of the factors Uber uses to set the price of your ride, an accusation Uber vigorously denies. If the allegations against Uber are true, it’s easy to see why: Those with less battery are more desperate, and those whose phones are minutes away from dying will pay almost any price to get a ride before they get stranded.
As The American Prospect recently explained, this type of individualized pricing is spreading across nearly every sector of the economy—streaming, fast food, even dating apps—and it can be surprising which variables will make you pay more. In the 2010s, retailers relied on somewhat crude data to perfect their pricing. Customers might have paid more for a flight booked on a Mac (versus a PC) or paid a higher rate for test prep in ZIP codes with larger Asian communities. But in recent years, companies have shifted from neighborhood-level price discrimination to individualized pricing.
Retailers like Amazon know a lot about what you buy, both on their platforms and off. And you have no way of knowing when your choices change what you pay. In 2018, headlines reported that Amazon was adjusting its prices 2.5 million times a day. Given Amazon's growth and the growth of AI, that number is likely far higher today. For retailers like Walmart, purchase history alone isn't enough. In February, the retail giant agreed to buy smart-TV maker Vizio for more than $2 billion, a deal that could give Walmart a wealth of personal consumer data. Smart TVs not only monitor what we watch with Orwellian precision; they can also track other nearby devices with ultrasonic beacons, and can even listen in on what we say in the privacy of our own homes. Vizio, notably, was fined millions of dollars over allegations that it illegally spied on its customers.
Retailers not only know what you bought and how much money you make, they often know where you are, how your day is going, and what your mood is—information that can be carefully synthesized by AI neural networks to calculate how much you would pay for a given item at a given moment.
No business is too intimate to be spared. Dating apps harvest data about our love lives, and some openly brag about using it to boost their profits. Many that don't disclose their use of personalized pricing engage in it anyway. Tinder rarely discusses its pricing technology, but Mozilla and Consumers International recently found that the dating app used dozens of variables to dramatically adjust prices for individual users. Your age, gender, and sexual orientation can determine what the AI decides you should pay for love.
If left unchecked, personalized pricing will have pernicious effects on society. Nikolas Guggenberger, an assistant professor at the University of Houston Law Center, argues that “hidden algorithmic price discrimination can undermine public trust in pricing mechanisms and thus harm the market.” AI-driven pricing also means that the most desperate and vulnerable will often pay the most. Worse, people could be penalized based on their race, age, or class. Take the phone battery claim. Older people are twice as likely as younger users to have phones that are at least three years old. Since older smartphones tend to have shorter battery lives, older people could end up paying more than younger people for the same Uber rides.
“Algorithmic price discrimination can essentially automate usury,” Guggenberger says. “If your battery is dying and you’re out in the countryside, a ride-hailing app can dramatically increase your ‘personalized price.’”
AI pricing acts largely as a regressive tax, making those who have the most pay the least. For people living in underserved areas, with fewer stores and fewer alternatives, there’s often no choice but to click “buy now,” even if it hurts. As Zephyr Teachout, a law professor and consumer advocate, told The American Prospect, we shouldn’t think of this practice as something as benign as personalized pricing—she calls it surveillance pricing instead.
We know how to prove human discrimination. If a store in a majority-Black neighborhood charges more than its counterpart in a majority-white neighborhood, testers can go to each store, record the prices, and sue. This kind of testing has been at the heart of consumer protection for nearly a century. But how do you prove that an algorithm discriminates? There are no stores to visit, no price tags to compare, just millions of screens siloed in people's pockets. The result can be a catch-22: you can only get enough data to prove discrimination by suing a company, but you can't sue the company without first having the data. We could see a perverse legal world emerge in which companies that use biased AI to secretly adjust prices face less scrutiny than brick-and-mortar stores.
My hope is that this situation is so bleak, the potential for abuse so obvious, that even our dysfunctional democracy won't stand for it. Lawmakers have been slow to limit the harms of new technologies, even when it has become clear, for example, that those technologies are undermining our democracy. But even in these polarized times, AI pickpocketing may be one of those rare issues that can unite us in outrage.
Albert Fox Cahn is the founder and executive director of the Surveillance Technology Oversight Project, or STOP, a civil rights and privacy advocacy group based in New York.