It might not be the algos you know:
I’ll illustrate with an anecdote from my Facebook days. Someone on the data science team had cooked up a new tool that recommended Facebook Pages for users to like. And what did this tool start spitting out? Every ethnic stereotype you can imagine. We killed the tool when it recommended then-President Obama if a user had “liked” the rapper Jay Z. While that was a statistical fact – people who liked Jay Z were more likely to like Obama – it was one of the statistical truths Facebook couldn’t be seen espousing.
I disagreed. Jay Z is a millionaire music tycoon; so what if we associate him with the president? But in our current world there’s a long list of Truths That Cannot Be Stated Publicly, even when there’s plenty of data suggesting their correctness, and this was one of them.
African Americans living in postal codes with depressed incomes likely do respond disproportionately to ads for usurious “payday” loans. Hispanics between the ages of 18 and 25 probably do engage with ads singing the charms and advantages of military service.
Why should those examples of targeting be viewed as any less ethical than, say, ads selling $100 Lululemon yoga pants targeting thirtysomething women in affluent postal codes like San Francisco’s Marina district?
It’s not the algos that are racist/sexist/classist. It’s us.