Aren’t we ruled so well?

Algorithms used to scan online messages for images of child sexual abuse may become illegal next week, after EU officials failed to agree on legal exemptions for the technology before new privacy protections come into effect.

Sigh.

This at the same time as criminal liability is to be placed upon execs if they don't catch the same images and activity…

20 thoughts on “Aren’t we ruled so well?”

  1. So Much For Subtlety

    I expect Boris is trying as hard as he can to sell us out on Brexit.

    But clearly there is only one solution if the algorithm does not work. People will have to search it out, ummm, manually by looking at every porn-related image on the internet.

    It will be a difficult and disgusting job. A morally corrupting one. In the interests of saving my more junior British men, I will, reluctantly, volunteer.

  2. The idea is snooping evil shite anyway. HTF can some program decide what is and isn’t child abuse? You can give it lots of actual images to compare, but when it is sicced on the Net how many innocent people will get hassled because of what the machine “thinks”?

    If memory serves, a couple of people have been croaked by robot “surgeons” because the things have zero powers of improv/adaptation away from standard circs.

  3. Johnson would LOVE to sell us out but his EU mates are giving NOTHING – worth having anyway – and want not only to control us but to humiliate both the UK and Blojob, to encourage les autres so to speak.

    If Bloj gives in, literally every man’s hand will be raised against him. Even those – ever fewer – morons Tweeting “Good Old Boris” shite. They seem to be mostly dimmer Redwall Tory Brexiteers who still haven’t seen through Blohard since last Dec. There aren’t enough of them to encourage the Fat Bastard much.

    If he gives in he is gone – even using the “recall and 2 days to read 1000 pages of trap-filled EU bullshite” ploy. If he recalls the bastards they might well demand a vote on his Tier 4 cockrot as a reward for fucking their nice Christmases up.

  4. There used to be a program that flagged stuff up as porn based on how much flesh-coloured content was in the image. Some of our firm’s promotional material got blocked by it. The offending image was of a guy’s forearm holding a power tool.
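
    That sort of filter is easy to reconstruct: count the fraction of skin-toned pixels and flag anything over a cutoff. A minimal sketch in Python – the RGB bounds and the 40% cutoff are invented for illustration, not whatever the real product used:

        # Minimal sketch of a skin-tone-ratio "porn filter". The RGB bounds
        # and the threshold are illustrative guesses only.
        from PIL import Image

        SKIN_LO = (95, 40, 20)     # assumed lower RGB bound for "flesh coloured"
        SKIN_HI = (255, 220, 180)  # assumed upper RGB bound

        def skin_fraction(path):
            img = Image.open(path).convert("RGB")
            pixels = list(img.getdata())
            skin = sum(
                all(lo <= c <= hi for c, lo, hi in zip(px, SKIN_LO, SKIN_HI))
                for px in pixels
            )
            return skin / len(pixels)

        def looks_like_porn(path, threshold=0.40):
            # A close-up of a bare forearm holding a power tool clears this
            # bar easily: pure pixel counting, no notion of content at all,
            # which is exactly the false positive described above.
            return skin_fraction(path) > threshold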

  5. Stony: a guy’s forearm holding a power tool

    Sounds fair enough then. I didn’t realise that Rocco Siffredi Ltd sent out fliers just like double-glazing companies.

  6. It’s rather two separate things, isn’t it?

    Everybody + dog in the field knows that testing images with algorithms (or “AI”, or…) gives extremely lacklustre results, with lots of false positives and an “impressive” percentage of detection failures.
    And that’s before differences in local law over what constitutes child (sexual) abuse, and some other legal niggles, like territoriality.
    The technology is simply too immature to have practical applications at the moment, and inflicting blanket screening with an immature and inaccurate method on the populace in general, by a commercial entity that does not have a state mandate, is, at least in my opinion, a Really Bad Idea.

    And honestly… you must be one of the world’s biggest idiots if you share stuff like that on any major social media outlet. The real Bad Guys most certainly don’t. So all that’s left is some nannying over some family pics posted by gran.

    Then there’s the criminal liability for execs who fail to stop such Nasty Things – which they already face under current law if they’re aware of the material.
    So far Social Media™ has got away with using fairly useless technology, and methods not even the secret/security services are (officially) allowed to use, while hiding behind a “we’re not a publisher, the law here sez so” legality.
    Setting up proper, effective measures against shenanigans, with an actual option for redress where things are flagged wrongly (no chance in hell at the moment…), is expensive and hurts the bottom line. Which is why it isn’t there.

    Putting execs’ arses on the line is a rather effective way of making sure that changes.
    Even within the protections of the corporate structure there’s such a thing as gross negligence, and other principles that allow you to hold individuals at any level legally responsible for cock-ups. All that’s needed is to make sure these rules apply and get enforced.

  7. “The technology is simply too immature to have practical applications at the moment”
    It would be used under supervision, with human verification of the positives, and with the false negatives ignored except for a bit of statistics captured to try to improve the algorithms. Probably still lots better than having humans do every single damned image.
    [Yes, some applications are used unsupervised. Hopefully not the really important ones.]
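
    Worth putting rough numbers on that, though. A back-of-envelope sketch – the daily volume, base rate, and error rates are all assumed for illustration – shows what “human verification of the positives” means at platform scale:

        # Back-of-envelope only: every figure below is an assumption.
        images_per_day = 1_000_000_000   # assumed platform volume
        base_rate      = 1e-5            # assumed share of actually illegal images
        tpr, fpr       = 0.99, 0.01      # assumed true/false positive rates

        bad  = images_per_day * base_rate
        good = images_per_day - bad
        true_pos  = tpr * bad
        false_pos = fpr * good
        precision = true_pos / (true_pos + false_pos)

        print(f"flagged for human review: {true_pos + false_pos:,.0f}/day")
        print(f"of which actually illegal: {precision:.2%}")

    With those (invented, but not outlandish) figures the reviewers wade through roughly ten million flags a day, of which about one in a thousand is a real hit – Grikath’s false-positive point in numbers.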

  8. “Probably still lots better than having humans do every single damned image.”

    Ummm… no need to?
    There’s a report/flag function on every major social media application.
    Only consider the flagged images, and use a weighting to filter out input from Habitual Flaggers/Activists/Idiots, in combination with a threshold amount.
    Congratulations, you’ve just reduced the workload by 99.99% while not having to pry into Other People’s Business.

    Then, considering the (alleged) origin of the poster, you might want to send the pic to a localised centre to catch any local legalities (unsurprisingly, neopuritan wokeïtis has so far met with firm resistance outside of the US/UK – deal with it), using actual eyeballs.

    See? Not that hard…
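
    Spelled out as code, that weighting-plus-threshold scheme might look something like this – the reputations and the threshold are invented numbers, and a real system would presumably derive the weights from each flagger’s track record:

        # Sketch of weighted flagging. All weights and the threshold are
        # invented for illustration.
        from collections import defaultdict

        reporter_weight = defaultdict(lambda: 1.0)  # unknown reporters count as 1.0
        reporter_weight.update({
            "habitual_flagger_42": 0.05,  # flags everything, nearly always wrongly
            "trusted_mod_7": 3.0,         # long record of accurate reports
        })

        flag_scores = defaultdict(float)  # image id -> accumulated weighted flags

        def report(image_id, reporter):
            flag_scores[image_id] += reporter_weight[reporter]

        def review_queue(threshold=2.5):
            # Only images whose weighted score clears the threshold ever reach
            # human eyeballs; everything unflagged stays unexamined.
            return [img for img, score in flag_scores.items() if score >= threshold]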

  9. @Grikath

    Re “some nannying over some family pics posted by gran”, it surprises me how many people find it appropriate to show pics of their young kids swimming nude or even in the bath! On the one hand I can see why they find it sweet. On the other hand, there are definitely people who would view them in a different light! And that’s before we get on to the question of what the kids think, or will think later in life, about their own privacy in all of this…

  10. @MBE That may be because a surprising number of people do not give a toss about a kid’s nudity, and most certainly 99.99% do not have any erotic thoughts about toddlers in a kiddie pool.
    And most people don’t share pics like that as “public” in the first place, except possibly those who shouldn’t be near a social media account, and probably a computer, to begin with.

    As for later embarrassment… by the time kids get old enough to notice and protest, those pics will be buried so deep it would take a conscious effort to find them online.
    The chance of them cropping up at a significant birthday, a wedding/bachelor party, or in the hands of the Potential Significant Other is much higher, and usually in fancy dead-tree format from the start anyway.
    And we all learned to live with those occasions, so let them as well. It builds character…

  11. Just for information, the last time I visited Ely, Richard “the dick” Murphy was very excited about a catalogue from Screwfix. I am trying to work out a new series title, along the lines of Richard’s Screwfix no 1

  12. Just for information, the last time I visited Ely, Richard “the dick” Murphy was very excited about a catalogue from Screwfix. I am trying to work out a new series title, along the lines of Richard’s Screwfix no 1

    Surely you can slip his “Crosshead” (geddit?) in somewhere?

  13. My nephew sent an email with pictures of his newborn daughter (Dec 18 2020). I advised him to hold on to the one of full frontal nudity to use as an enticement for better behavior when she reaches her teens. Should that picture have been blocked?

  14. Grikath,

    “Everybody + dog in the field knows that testing images with algorithms (or “AI”, or…) gives extremely lacklustre results, with lots of false positives and an “impressive” percentage of detection failures.
    And that’s before differences in local law over what constitutes child (sexual) abuse, and some other legal niggles, like territoriality.
    The technology is simply too immature to have practical applications at the moment, and inflicting blanket screening with an immature and inaccurate method on the populace in general, by a commercial entity that does not have a state mandate, is, at least in my opinion, a Really Bad Idea.”

    And what anyone thinks is a child. A pediatrician testified that a 19-year-old in an adult movie was a child. The girl heroically flew to the courtroom, produced ID, and saved the guy.

    https://en.wikipedia.org/wiki/Lupe_Fuentes#Early_life_and_adult_film_career
