
Can AI address racial disparity in pricing?

Nitin Mehta

From autonomous cars to faster mortgage approvals and automated advertising decisions, AI algorithms promise numerous benefits for businesses and customers alike. Machine learning (ML) algorithms can use vast amounts of consumer data and automate key business decisions such as pricing, product offerings and promotions in real time and at micro-targeted levels.

Unfortunately, these benefits may not be enjoyed equally. Algorithmic bias, which occurs when algorithms produce discriminatory outcomes against certain categories of individuals — typically minorities and women — may worsen existing social inequities, particularly when it comes to race and gender. From the recidivism prediction algorithm used in courts to medical care prediction algorithms used by hospitals, studies have found evidence of algorithmic biases that make racial disparities worse for those impacted, not better.

Large e-tailers such as Amazon and travel websites were early adopters of pricing algorithms, and these tools are now becoming more ubiquitous. For example, Airbnb created a smart-pricing tool based on a machine learning algorithm and offered it for free to all of its hosts. When a host turns on the algorithm, it adjusts the property's nightly rate to optimize revenue based on an evaluation of a rich set of factors — such as the property's characteristics and seasonality — that influence demand for the property.

Pricing algorithms use massive data on consumers and competitors to predict demand and adjust prices in real time without human intervention. Airbnb has far more data and superior computational resources than any individual host, so the pricing algorithm should be more effective than the average host at setting an optimal price for their property. Nevertheless, the algorithm is not guaranteed to benefit hosts, and the opacity of the algorithm makes it difficult to assess.

In a recent paper co-authored with Shunyuan Zhang (Harvard Business School) and Param Vir Singh and Kannan Srinivasan (both of Carnegie Mellon University's Tepper School of Business), we sought to determine the extent to which Airbnb hosts have benefited financially from adopting the algorithm.

Airbnb has received attention of late for the disparity in revenue earned by white and Black hosts. It is plausible that differences other than race (e.g., education and access to other resources) make it more difficult for Black hosts to determine optimal prices. If so, then a well-devised pricing algorithm should serve the needs of Black hosts and help mitigate racial inequalities. The intuition here is that adoption of the algorithm would make it equally easy for white and Black hosts to optimize their nightly rates.

But if the revenue gap stems from the marketplace itself (e.g., guests are willing to pay more for a property owned by a white host than for an equivalent property owned by a Black host), then that bias might manifest as both a lower occupancy rate and a lower nightly rate for Black hosts. In other words, Black and white hosts might face different demand curves, and even if the algorithm applied the same price correction to Black and white hosts, it would have different impacts on their revenues. An algorithm could increase, decrease or maintain the revenue gap.

U.S. law prohibits the explicit use of protected attributes (such as race) in the construction of algorithmic predictions or thresholds. But if the demand curve differs between Black and white hosts, then an algorithm that ignores the host's race effectively averages the two demand curves and sets the same rental price for equivalent properties owned by Black and white hosts, which is suboptimal for both groups. And with Black hosts underrepresented in the Airbnb data, the algorithm's prices will skew towards the white demand curve, generating prices that are even more suboptimal for Black hosts.
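
To see this concretely, consider a minimal sketch with linear demand curves. Everything here is invented for illustration: it is not Airbnb's algorithm, and the parameters are not estimates from our data.

```python
# Illustrative sketch only: linear demand with invented parameters.

def optimal_price(a: float, b: float) -> float:
    """Revenue-maximizing price for linear demand q = a - b*p.
    Revenue R(p) = p*(a - b*p) peaks at p* = a / (2b)."""
    return a / (2 * b)

# Hypothetical demand curves: Black hosts' demand is more price-sensitive.
a_white, b_white = 100.0, 1.0   # q = 100 - 1.0*p  ->  p* = 50.0
a_black, b_black = 100.0, 1.4   # q = 100 - 1.4*p  ->  p* ~ 35.7

# A race-blind model fits one curve to the pooled data. With Black hosts
# underrepresented (say, 10 per cent of listings), the pooled curve is a
# weighted average that sits close to the white curve.
share_black = 0.10  # assumed share of Black hosts in the training data
a_pooled = share_black * a_black + (1 - share_black) * a_white
b_pooled = share_black * b_black + (1 - share_black) * b_white

print(f"optimal for white hosts:  {optimal_price(a_white, b_white):.1f}")
print(f"optimal for Black hosts:  {optimal_price(a_black, b_black):.1f}")
print(f"race-blind pooled price:  {optimal_price(a_pooled, b_pooled):.1f}")
```

In this toy example the pooled price lands at about 48, close to the white-host optimum of 50 and far from the Black-host optimum of roughly 36, mirroring the skew described above.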

We wondered: Does Airbnb's pricing algorithm lead to similar changes in revenue among Black and white hosts? If not, why?

We found that on average, adoption of the algorithm led to a downward price correction of 5.7 per cent across hosts, which improved their revenues by 8.6 per cent.

While Black and white hosts charged similar prices for equivalent properties (in terms of observed host, property and neighbourhood characteristics), white hosts earned $12.16 more in daily revenue before Airbnb introduced its pricing algorithm. Bookings were 20 per cent lower for Black hosts’ properties than for equivalent white hosts’ properties. The smart-pricing algorithm benefited Black adopters more than white adopters, decreasing the revenue gap by 71.3 per cent.

Algorithm adoption led to a similar magnitude of downward price correction across both white and Black hosts, but it led to a greater increase in the occupancy rate for Black hosts, which explains why Black hosts benefited more from the algorithm. This result also supports our theory that Black and white hosts face different demand curves. The demand for Black hosts’ properties is more responsive to price changes than the demand for equivalent properties owned by white hosts. The algorithm reportedly does not use host race to inform the optimal price, and we found that the algorithm sets similar prices for equivalent properties owned by Black and white hosts.
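
The arithmetic behind this result is simple: revenue is price times occupancy, so the same price cut lifts revenue more where demand is more price-responsive. The toy calculation below makes the point with invented demand parameters (only the 5.7 per cent price correction comes from our results).

```python
# Invented illustration of why one price cut helps the more
# price-sensitive group more; parameters are not our estimates.

def revenue(p: float, a: float, b: float) -> float:
    """Revenue under linear demand q = a - b*p (demand floored at 0)."""
    return p * max(a - b * p, 0.0)

p_before = 60.0                   # hypothetical pre-adoption nightly rate
p_after = p_before * (1 - 0.057)  # the ~5.7% downward correction

for label, a, b in [("white", 100.0, 1.0), ("Black", 100.0, 1.4)]:
    r0, r1 = revenue(p_before, a, b), revenue(p_after, a, b)
    print(f"{label} hosts: revenue {r0:.0f} -> {r1:.0f} ({(r1 - r0) / r0:+.1%})")
```

With these made-up curves, the identical price cut raises white hosts' revenue by about 2 per cent but Black hosts' revenue by more than 20 per cent, because the cut fills many more nights for the more price-sensitive group.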

However, even though the smart-pricing algorithm decreased the revenue gap between white and Black adopters, two challenges remain. First, Black hosts were 41 per cent less likely than white hosts to adopt the algorithm. As a result, the revenue gap between hosts actually increased after the introduction of the algorithm. Second, if Black and white hosts face different demand curves (as our data suggest), then a race-blind algorithm may set prices that are suboptimal for both Black and white hosts, meaning that the revenue of both groups could be improved by the incorporation of race into the algorithm. Moreover, the prices are likely to be more suboptimal for Black hosts because they are less represented in the data that are used to train the algorithm.

We argued that Airbnb could further reduce the revenue gap between Black and white hosts by incorporating race into the algorithm, either directly or indirectly via closely correlated socioeconomic characteristics. Although algorithm adoption was less likely among Black hosts than white hosts in all quartiles, it was lowest for Black hosts in the uppermost quartile of socioeconomic status. Yet it is only the Black hosts in the bottom three quartiles of socioeconomic status who stand to gain financially by adopting the algorithm. Thus, Airbnb may be able to reduce the revenue gap most efficiently by targeting algorithm promotions to Black hosts in the lower quartiles.

Our study has important implications. For policymakers, it shows that when racial biases exist in the marketplace, an algorithm that ignores those biases may have limited effectiveness at reducing racial disparities. Policymakers should consider allowing algorithm designers to incorporate either race or socioeconomic characteristics that correlate with race, provided that the algorithm demonstrates an ability to reduce racial disparities. 

For managers, our results suggest that the revenue gap between Black and white hosts may stem from guests’ racial biases. Although Airbnb cannot overturn a racial bias that is ingrained in society at large, it could try an intervention that prevents guests from knowing the host’s race until they book the property. Finally, managers should devise strategies to encourage algorithm adoption among Black hosts, especially those in the middle and lower socioeconomic quartiles, as they would reap the largest gain in daily revenue. Otherwise, a racial disparity in algorithm usage may end up increasing the economic disparity rather than alleviating it.

Although Black and white hosts seem to face different demand curves, we found that the algorithm set similar nightly rates for similar properties owned by Black and white hosts. And if the demand curve indeed differs by race, such that no property characteristics can explain the disparity, then the algorithm’s exclusion of the host’s race should result in suboptimal prices for both Black and white hosts.

However, because Black hosts are a minority at both the neighbourhood and city levels, the algorithm is built on data that represent the demand curve of white hosts more than the demand curve of Black hosts. The race-blind algorithm therefore sets prices that align more closely with the optimal price of the white demand curve. In other words, although the race-blind algorithm sets prices that are suboptimal for both Black and white hosts, the prices are more suboptimal for Black hosts.

We suggest that Airbnb could further reduce the revenue gap by incorporating the host’s race into the pricing algorithm and join other recent studies in raising awareness of a counterintuitive reality: A race-blind approach to algorithmic decision-making may worsen racial disparities. As indicated, if the revenue gap between Black and white hosts stems from guests’ racial biases, then Airbnb could reduce the gap by preventing guests from knowing the host’s race, perhaps by masking the host’s profile photo until a transaction is made.

Race-masking interventions can have unintended consequences. Research shows that Ban the Box interventions, which are intended to protect minorities against discrimination by employers, actually worsen it. Similarly, if Airbnb guests cannot view the host’s photo, then prospective guests who are biased against Black hosts might avoid neighbourhoods with a higher concentration of Black residents or make stereotyped inferences about the host’s race based on their name, home decor or other available information. 

Firms have begun to put considerable effort into combating algorithmic bias, using data-science-driven approaches to investigate what an algorithm's predictions will look like before launching it into the world. This can include examining different AI model specifications, selecting the input data to be fed into the model, pre-processing the data and post-processing the model's predictions.
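
As one illustration, a common pre-processing technique is "reweighing": reweighting the training rows so that an underrepresented group is not drowned out when a single model is fit to pooled data. The sketch below is a generic version of that idea, not a description of any particular firm's pipeline.

```python
# Generic pre-processing sketch: balance group influence on a model's
# training loss by weighting each row inversely to its group's share.
from collections import Counter

def balanced_weights(groups: list[str]) -> list[float]:
    """Weight each row by n / (k * count(group)) so every group
    contributes equally to the loss, regardless of sample size."""
    counts = Counter(groups)
    n, k = len(groups), len(counts)
    return [n / (k * counts[g]) for g in groups]

# Hypothetical data: nine majority-group listings, one minority listing.
groups = ["white"] * 9 + ["Black"]
print(balanced_weights(groups))  # majority rows ~0.56 each; minority row 5.0
```

Weights like these can be passed to most standard learners (for example, as per-sample weights when fitting a regression), giving the minority group's demand pattern equal influence on the fitted prices.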

Done right, these tools may well mitigate human biases and narrow the economic gaps that arise from them. Done wrong, just a few flawed algorithms from established firms could undermine the deployment of AI algorithms altogether.

This article first appeared in the Winter 2022 issue of Rotman Management magazine.


Nitin Mehta is a professor of marketing and area coordinator for marketing at the Rotman School of Management. This article has been adapted from his co-authored paper, “Can an AI Algorithm Mitigate Racial Economic Inequality? An Analysis in the Context of Airbnb,” which was recently published in Marketing Science.