Facebook on Friday said it had disabled its topic recommendation feature after it mistook black men for “primates” in a video on the social network.
A Facebook spokesperson called it a “clearly unacceptable error” and said the recommendation software involved was taken offline.
“We apologise to anyone who may have seen these offensive recommendations,” Facebook said.
“We disabled the entire topic recommendation feature as soon as we realised this was happening so we could investigate the cause and prevent this from happening again.”
Facial recognition software has been criticised by civil rights advocates, who point to accuracy problems, particularly with people who are not white.
The nasty description is that AI will incorporate all of the flaws and faults of the wider society around it. Because that’s what it does: it looks for patterns. But what counts as a pattern is what humans see as a pattern. So, if people find it difficult to distinguish individuals of other races, then so will AI. A corollary of this is that if, say, society generally offers black people lower credit ratings for economic reasons, then so will AI.
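That corollary can be made concrete with a toy sketch: a learner that simply extracts the majority pattern from historical decisions will faithfully reproduce whatever skew those decisions contained. The groups, labels, and numbers below are entirely invented for illustration; no real system works on data this simple.

```python
# Toy illustration (hypothetical data): a pattern-learner reproduces
# whatever bias is baked into its training labels.
from collections import Counter

def train(examples):
    """Learn, per group, the most common historical label."""
    by_group = {}
    for group, label in examples:
        by_group.setdefault(group, Counter())[label] += 1
    return {g: counts.most_common(1)[0][0] for g, counts in by_group.items()}

# Invented history in which group B was mostly denied credit for
# reasons unrelated to actual creditworthiness.
history = [("A", "approve")] * 80 + [("A", "deny")] * 20 \
        + [("B", "approve")] * 20 + [("B", "deny")] * 80

model = train(history)
print(model)  # the learned "pattern" is just the old skew
```

The model never asks whether the historical labels were fair; the skew in the data simply becomes the rule.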
Lots of people don’t like this.
The fair description is that image recognition isn’t very good yet. Which isn’t really all that much of a surprise. Pattern recognition is one of the things that we humans, as a species, are exceedingly good at. Eagles have more acute eyesight, dogs can smell better, and so on. But among the human senses it’s pattern recognition that’s the top trump. That it turns out to be difficult to train a machine to do it isn’t all that much of a surprise.