
There’s a nasty and a fair description of this

Facebook on Friday said it disabled its topic recommendation feature after it mistook black men for “primates” in a video on the social network.

A Facebook spokesperson called it a “clearly unacceptable error” and said the recommendation software involved was taken offline.

“We apologise to anyone who may have seen these offensive recommendations,” Facebook said.

“We disabled the entire topic recommendation feature as soon as we realised this was happening so we could investigate the cause and prevent this from happening again.”

Facial recognition software has been criticised by civil rights advocates who point out problems with accuracy, particularly when it comes to people who are not white.

The nasty description is that AI will incorporate all of the flaws and faults of the wider society around it. Because that’s what it does. It looks for patterns. But what is defined as a pattern is what humans see as a pattern. So, if folks find it difficult to distinguish individuals of other races then so will AI. A corollary of this is that if – say – society generally offers black folks lower credit ratings for economic reasons then so will AI.

Lots of people don’t like this.
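To make that corollary concrete, here’s a minimal sketch in Python with entirely synthetic data and made-up numbers – nothing from Facebook’s, or any lender’s, actual systems. The historical approvals are driven purely by income, the model never sees the group label, and it reproduces the group disparity anyway, because the disparity is in the very pattern it is asked to learn.

```python
# A minimal sketch: synthetic data and made-up numbers throughout.
# Historical approvals were driven purely by income; the group label is
# never shown to the model, yet the model reproduces the group disparity.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# Group 1 has lower average income, for historical/economic reasons.
group = rng.integers(0, 2, n)
income = rng.normal(np.where(group == 1, 30.0, 40.0), 8.0)  # income in £k

# Past decisions: approve above an income cut-off, plus some noise.
approved = (income + rng.normal(0, 4.0, n) > 35.0).astype(int)

# Train on income alone -- there is no explicit group rule anywhere.
model = LogisticRegression().fit(income.reshape(-1, 1), approved)
pred = model.predict(income.reshape(-1, 1))

for g in (0, 1):
    print(f"group {g}: predicted approval rate {pred[group == g].mean():.0%}")
# The model's approval rates mirror the historical gap between the groups.
```

No one wrote “treat group 1 worse” into the model; it simply learned the pattern that was already there.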

The fair description is that image recognition isn’t very good yet. Which isn’t really all that much of a surprise. Pattern recognition is one of the things that we humans, as a species, are exceedingly good at. Eagles have more acute eyesight, dogs are able to smell better, etc., etc. But among human faculties it’s pattern recognition that’s the top trump. That it turns out to be difficult to train a machine to do it ain’t all that much of a surprise.
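One standard mitigation for shaky image recognition – sketched below with invented labels, scores and a hypothetical threshold, since the real system and its cut-offs aren’t public – is to have the recommender abstain when the classifier’s top label is low-confidence, rather than surfacing its best guess as a prompt.

```python
# A hedged sketch of confidence thresholding: surface a topic prompt only
# when the classifier is confident, otherwise abstain. The labels, scores
# and threshold here are invented for illustration, not Facebook's.
TOPIC_SCORES = {"primates": 0.34, "people": 0.31, "outdoors": 0.20, "sports": 0.15}
CONFIDENCE_THRESHOLD = 0.80  # hypothetical cut-off

def recommend_topic(scores: dict[str, float], threshold: float) -> str | None:
    """Return the top-scoring topic only if the model is confident enough."""
    topic, score = max(scores.items(), key=lambda kv: kv[1])
    return topic if score >= threshold else None

print(recommend_topic(TOPIC_SCORES, CONFIDENCE_THRESHOLD))  # None: no prompt shown
```

Abstaining doesn’t make the classifier any better, of course; it just stops a weak guess being shown to anyone.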

13 thoughts on “There’s a nasty and a fair description of this”

  1. Don’t worry about vile, racist AI folks, for even as we speak, African scientists are working to eradicate the problem!

  2. It was accurate. We are all primates. Perhaps, in the common vernacular, the term shouldn’t have been limited to every other primate but us, but there we are.

  3. Yes, I already changed it from “dogs smell better” for exactly that reason. Perhaps further to go? Dogs have a better sense of smell?

  4. What’s odd is all the statistical experts pointing out stuff in the diversity figures of a Microsoft shareholders’ report, then failing to spot any patterns elsewhere.

  5. What the Gent of Tractors says: We are primates. The problem lies in a lot of people insisting that we somehow are special and outside Mother Nature’s pecking order.

    The AI simply cheated and took the easy way out. Back to the root.

  6. There was a similar issue with Amazon and a hiring algorithm, wasn’t there, when it pretty much only selected men for interviews for some positions: it was applying the qualifications and experience rules fairly rather than skewing them for gender/diversity balance like the HR people were.
    I’ve seen many businesses complain of a lack of good candidates, and had to ask whether they’ve looked at how HR and automated submission systems filter people out before the managers even see the applications, especially the degree requirement. I have a professional qualification but no degree, which confuses the hell out of the systems.

  7. Theophrastus (2066)

    So AI recognises blacks as a sub-species or breed of Homo sapiens. Many of us knew that…

    Race is a real category. As Dawkins puts it:
    “However small the racial partition of the total variation may be, if such racial characteristics as there are are highly correlated with other racial characteristics, they are by definition informative, and therefore of taxonomic significance.”

  8. Theo is the mug who supports – i.e. pays the bar bills of – the Tory Party because they are “a bulwark against Marxism”.

    They’d boot him in 10 seconds if the post above reached them.

    Nothing like paying your hard-earned money to those who uphold your values.
