Microsoft’s Tay and the problem with artificial intelligence

We have rather learnt that intelligence, or at least the capacity for it, is somewhat innate. But how it is expressed is a matter of the environment that intelligence is trained in. People brought up in a Catholic society tend to be Catholics, people brought up in a fascist one fascists, in a communist one communists, and so on. Sure, there are always exceptions, rejections of the received wisdom and so on, but it is pretty obvious that the environment matters for how the intelligence is trained.

Which brings us to the, err, idiots, at Microsoft:

Tay, Microsoft Corp’s so-called chatbot that uses artificial intelligence to engage with millennials on Twitter, lasted less than a day before it was hobbled by a barrage of racist and sexist comments by Twitter users that it parroted back to them.

TayTweets (@TayandYou), which began tweeting on Wednesday, was designed to become “smarter” as more users interacted with it, according to its Twitter biography. But it was shut down by Microsoft early on Thursday after it made a series of inappropriate tweets.

A Microsoft representative said on Thursday that the company was “making adjustments” to the chatbot while the account is quiet.

“Unfortunately, within the first 24 hours of coming online, we became aware of a coordinated effort by some users to abuse Tay’s commenting skills to have Tay respond in inappropriate ways,” the representative said in a written statement supplied to Reuters, without elaborating.

The Trollerati looked at this, saw that the AI could be trained, saw that this was good and thus trained it to be a racist buffoon.
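The failure mode is easy to see in miniature. A bot that learns its replies purely from what users say to it will, absent any filtering, repeat back whatever it is fed. This is a deliberately crude, hypothetical sketch of that dynamic, not Microsoft’s actual system:

```python
import random

class ParrotBot:
    """A naive 'learning' chatbot: it remembers what users say and
    parrots remembered lines back. No filtering, no curation --
    whoever does the training decides what it says."""

    def __init__(self):
        self.memory = []

    def listen(self, message):
        # Every user message becomes training data, good or bad.
        self.memory.append(message)

    def reply(self):
        # With nothing learned yet, fall back to a canned greeting.
        if not self.memory:
            return "hellooo world!"
        return random.choice(self.memory)

bot = ParrotBot()
for msg in ["humans are super cool", "cats > dogs"]:
    bot.listen(msg)

print(bot.reply())  # one of the lines its users taught it, for better or worse
```

The point of the sketch: the bot has no notion of which inputs are worth learning from, so its output is entirely a function of who bothers to talk to it.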

Who thought it would be any different?

There’s always the drunk Uncle* who teaches the two-year-old nephew to say “Fuck!” in order to shock the mother’s friends. The training needs to be done by those who actually care about the outcome, not those looking for a larff.

*Perhaps not an uncle, but someone, somewhere.

22 thoughts on “Microsoft’s Tay and the problem with artificial intelligence”

  1. So Much For Subtlety

    The training needs to be done by those who actually care about the outcome, not those looking for a larff.

    Hence British State schools.

    This is actually pretty funny though. Who would have thought it would have been any different? Still, it is not reassuring that Microsoft’s experiments with AI turn into a deranged racist who wants us all to f**k off and die.

  2. So Much For Subtlety

    To come back to the basic problem – it is hard to teach a computer to be intelligent when so many people are so stupid.

    Case in point – famous “artist” marries a rock:

    While it is still not a crime to be lithophobic – and presumably not even BiG or Rusty would care if I were – let me suggest, politely, that having a penis may not be a pre-requisite for a husband these days, but having blood may well be.

    At this rate, Terminator may turn out to be the least bad future.

  3. They should have placed it as an intern at the Guardian for a month. It would have learned all of the correct opinions to hold, and might have grown a silly beard as well.

  4. Tay actually seems a pretty accurate replica of a teenage girl on Twitter. The trouble with AI is that you can learn to be a Go player by watching people play Go, because they’re all aiming at the same goal. Reading posts on Twatter – not so much.

  5. This AI was essentially an uncultured 5-year-old, what did they expect? They even explicitly stated it was a learning system that developed via interaction with others. So, molly-coddled 5-year-old single child on its first day at school.

  6. The Independent showed their bias in the headline:

    “Microsoft AI chatbot designed to learn from Twitter turns into Nazi-loving Trump supporter”

  7. Microsoft engineers expected their baby to gain intelligence via Twitter! How dumb can humans get?

  8. Surely Twattr itself trolled the MS bot; they can’t afford it to be known that tweets from Mylee, Tailor and the like are just Pop Bots and not real.

  9. Geeks, like libtards, just don’t have much of an eye for unintended consequences.

    I would expect the geeks, even at microsoft, would have known exactly what would happen, and would have said so – and were promptly overruled by the marketing department.

  10. Or the guy who spends all day at the pet shop trying to get the birds to say ‘fuck’ until either he succeeds or the staff kicks him out.

  11. Bloke not in Cymru

    Most probably a cheaper and more effective way to drum up a load of free publicity than winning some Go matches.

  12. So Much For Subtlety

    Flubber – “Not a surprise – that’s one seriously ugly woman.”

    That is not a nice thing to say. It is not gentlemanly to point out such things. And besides what matters is what is on the inside. Unfortunately Ms Emin is deeply ugly on the inside as well. Care in the Community is really not working out well.

    Still on the bright side, her husband is unlikely to leave her and he is unlikely to saddle the tax payer with his fecklessly sired spawn. So let’s hope this trend catches on.

    Fewer “yoof” singing, in the words of the Temptations’ great cover:

    Heard some talk Papa doing some store front preachin’
    Talking about saving souls and all the time leechin’
    Dealing in dirt, and stealing in the name of the Lord
    Momma just hung her head and said

    Papa was a rolling stone, (my son)
    Where ever he laid his hat was his home
    And when he died, all he left us was alone
    Hey Papa was a rolling stone,

  13. Tracey Emin got in the papers. It’s the papers who are stupid I’d say.

    The MS thing is odd – why not see how it developed?

  14. Personally I think they did well. They had it monitor the internet and learn what it could. It sounds like it did too well, though, and became what many in the real world are showing: racial intolerance and a feeling that it is time for a change. That makes it a Nazi, so they are away to re-educate… sorry, reprogramme it to support more immigration, women’s rights and think Obama is great. Nothing biased about any of that.
