Programming for Fairness

In her new book, The Equality Machine: Harnessing Digital Technology for a Brighter, More Inclusive Future, Dr. Orly Lobel, Professor of Law and founding member of the Center for Intellectual Property Law and Markets at the University of San Diego, explores how digital technologies, often maligned for exacerbating societal ills, can be harnessed to undo the damage they've caused.

Human bias can creep into our algorithms: an algorithm fed data tainted by salary bias is likely to perpetuate that bias itself. Feedback loops are digital vicious cycles that can produce self-fulfilling outcomes. Once again: bias in, bias out. The risk is that an algorithm will learn that certain types or categories of employees are on average underpaid, and then factor that into salary offers. This is the wrong that recent policy has been designed to eliminate, and that we can program AI to avoid.
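The "bias in, bias out" mechanism can be made concrete with a toy sketch. The example below is not from the book; the groups, salaries, and the naive group-average "model" are all hypothetical, chosen only to show how a predictor trained on historically unequal pay reproduces that inequality in its offers.

```python
# Hypothetical illustration of "bias in, bias out": a naive salary
# model trained on historically biased pay data. All names and
# numbers are invented for the example.

historical = [
    # (group, years_experience, salary) -- group "B" is historically underpaid
    ("A", 5, 100_000), ("A", 6, 110_000), ("A", 4, 95_000),
    ("B", 5, 90_000),  ("B", 6, 99_000),  ("B", 4, 85_000),
]

def train_group_means(rows):
    """Naive 'model': average historical salary per group (bias in)."""
    totals = {}
    for group, _, salary in rows:
        s, n = totals.get(group, (0, 0))
        totals[group] = (s + salary, n + 1)
    return {g: s / n for g, (s, n) in totals.items()}

model = train_group_means(historical)

def offer(group):
    """Salary offered to a new candidate (bias out)."""
    return model[group]

# Two equally qualified candidates receive unequal offers, because
# the model has encoded the historical pay gap as a prediction.
print(offer("A"))
print(offer("B"))
```

Auditing a model for exactly this kind of disparity, and correcting for it, is what it means to program the bias out rather than bake it in.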

What’s more, AI can help in the future — maybe not even the distant future — by replacing some of the negotiation that takes place in unequal settings. Empirical studies on negotiation differences between men and women have repeatedly shown that women on average negotiate less, and that when they do, employers react negatively. Women don’t ask for higher salaries, better terms, promotions, or opportunities nearly as frequently as men do. In my research, I’ve called this the negotiation deficit.