Story at a glance
- Uber and Lyft have responded to stories of discrimination against riders by firing drivers and altering the app.
- A new study finds that discrimination persists, however, especially against LGBTQ+ and nonwhite people.
- At the same time, studies show that the pricing on ridesharing apps is also racially biased.
Even after ridesharing apps removed information that indicated a rider’s gender and race from initial ride requests, research shows that bias against nonwhite and LGBTQ+ riders persists.
In a recent study, Jorge Mejia, an assistant professor at Indiana University’s Kelley School of Business, and Chris Parker, assistant professor at American University, randomly manipulated rider names and pictures — which appear after a driver has accepted a ride — on a ridesharing platform in Washington, D.C.
“Our results confirm that bias at the ride request stage has been removed. However, after ride acceptance, racial and LGBT biases are persistent, while we found no evidence of gender biases,” Mejia said in a release. “We show that signaling support for a social cause — in our case, the lesbian, gay, bisexual and transgender community — can also impact service provision. Riders who show support for the LGBT community, regardless of race or gender, also experience significantly higher cancellation rates.”
Riders with names traditionally perceived to be Black were more than twice as likely to have a ride cancelled as riders whose names are traditionally perceived to be white, regardless of gender. Meanwhile, riders whose profile picture was overlaid with a rainbow filter, signaling support for LGBT rights, were twice as likely as others to have their ride cancelled.
The study also tested whether surge pricing during peak times affected these results. It did for riders perceived to be Black, who were less likely to have rides cancelled when they were paying more, but not for riders perceived to be LGBTQ+.
One potential solution Mejia and Parker recommended is tracking rider characteristics when a driver cancels and penalizing that driver for biased behavior. But in some cases, the data is the problem. A study published by researchers at George Washington University this June found that Uber and Lyft's price-determining algorithms discriminate against customers seeking transportation in predominantly nonwhite neighborhoods.
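The monitoring Mejia and Parker propose could, in principle, look something like the sketch below: compare a driver's cancellation rate across rider groups and flag large gaps. This is purely illustrative; the group labels, minimum sample size, and threshold are assumptions, not anything the platforms are known to implement.

```python
def flag_biased_drivers(cancellations, completions,
                        min_rides=50, ratio_threshold=2.0):
    """Flag drivers whose cancellation rate for one rider group is far
    above their rate for another. All parameters are illustrative.

    cancellations / completions: {driver: {group: count}}
    """
    flagged = []
    for driver, groups in completions.items():
        rates = {}
        for group, done in groups.items():
            cancelled = cancellations.get(driver, {}).get(group, 0)
            total = done + cancelled
            if total >= min_rides:  # skip groups with too little data
                rates[group] = cancelled / total
        if len(rates) >= 2:
            hi, lo = max(rates.values()), min(rates.values())
            # Flag when one group's cancellation rate is at least
            # ratio_threshold times another's.
            if lo > 0 and hi / lo >= ratio_threshold:
                flagged.append(driver)
    return flagged

completions = {"driver_1": {"group_a": 90, "group_b": 45},
               "driver_2": {"group_a": 95, "group_b": 95}}
cancellations = {"driver_1": {"group_a": 10, "group_b": 55},
                 "driver_2": {"group_a": 5, "group_b": 5}}
print(flag_biased_drivers(cancellations, completions))  # ['driver_1']
```

Here driver_1 cancels 10% of group_a rides but 55% of group_b rides and gets flagged, while driver_2 cancels both groups at the same rate and does not. A real system would need careful statistics, since as the article notes, the data feeding such a check can itself be biased.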
Analyzing more than 100 million trips in Chicago between November 2018 and December 2019, the study found a higher price per mile for a trip if either the destination or pick-up point had a higher percentage of nonwhite residents, low-income residents or highly educated residents.
“Unlike traditional taxi services, fare prices for ride-hailing services are dynamic, calculated using both the length of the requested trip as well as the demand for ride-hailing services in the area,” the authors explained. “Uber determines demand for rides using machine learning models, using forecasting based on prior demand to determine which areas drivers will be needed most at a given time. While the use of machine learning to forecast demand may improve ride-hailing applications’ ability to provide services to their riders, machine learning methods have been known to adopt policies that display demographic disparity in online recruitment, online advertisements, and recidivism prediction.”
In response to the study, an Uber spokesperson told Complex the study didn’t consider “relevant factors” such as trip purposes, time of day and land-use or neighborhood patterns. In their conclusion, the authors acknowledged that demand and speed have the highest correlation with ride-hailing fares, but said the result was the same: Fare prices were biased.
In fact, factors such as land-use or neighborhood patterns can be racially biased in their own way, considering how historically racist redlining and development policies have influenced the present-day reality. Black and Hispanic shift workers are also more likely to work late-night shifts than other shifts, according to a 2019 study by the American Public Transportation Association. So to truly combat discrimination in ridesharing apps, lawmakers may have to step in, Mejia said.
“Investments in reducing bias may not occur organically, as ridesharing platforms are trying to maximize the number of participants in the platform — they want to attract both riders and drivers,” he said. “As a result, it may be necessary for policymakers to mandate what information can be provided to a driver to ensure an unbiased experience, while maintaining the safety of everyone involved, or to create policies that require ridesharing platforms to monitor and remove drivers based on biased behavior.”