
Bet they don’t write this law right

People who share so-called “deepfakes” – explicit images or videos which have been manipulated to look like someone without their consent – will also be criminalised.

Both offences are expected to carry maximum jail terms of two or three years.

“Look like someone else” is pretty wide. For example, making deepfake porn – well, by definition, because it’s a fake, it’ll look like someone else. Even if only a generic someone else.

So it sorta depends upon the draftsmen – hmm, not looking good, is it?

17 thoughts on “Bet they don’t write this law right”

  1. Nothing wrong with a bit of manipulation in porn, providing she knows what she’s doing.

    And got clean nails, of course.

  2. Oh, they’ll get it Wrong.
    Especially since the whole exercise is not about “protecting women and children” at all.

  3. Also apparently downloading it ???!??

If I buy a fake Mona Lisa down the market, does that make me a criminal?

  4. So the next time a video leaks of a cabinet minister balls deep in a twink it’ll be declared “deep fake” and some poor sod is going down for 3 years. OK. Also, women have a right to go about with their tits out but you don’t have a right to photograph them. This includes paparazzi, right? It’s about protecting them from us, isn’t it?

  5. Meanwhile, a plethora of aggrieved stalking victims have gone to law to force the Met Police to enforce the stalking laws they already have on the books but can’t be bothered to enforce.

As you say elsewhere, welcome to joined-up government…

  6. I was going to make a joke about being careful not to Google “Emma Watson” with safesearch switched off but thought I’d check it first. Looks like it’s all been taken care of by EU laws already. That was with a VPN switched on too (admittedly a feeble one).

    Crikey, internet sanitation censorship happened faster than I expected.

  7. Do they have to prove it’s a fake as well? That’s going to be increasingly difficult to do.

    I suppose this is another law passed to say you have a law rather than something intended to work in any reasonable way.

8. Wasn’t that one of the arguments in the R. Kelly case – that the video circulating of him performing a sexual act on a 14-year-old was a fake, despite it clearly being him? (Allegedly. I haven’t seen it.)

    That argument didn’t fly back then, but what if the deep fake technology gets good enough that you can’t tell the difference?

So some techy with a grievance against, say, Prince Andrew does a deep fake of him schtooping that 17 year old he paid off. Who’s going to believe it wasn’t him?

9. The technology is advancing faster than most people realise. It’s already possible to do face mapping in near real time (a few hundred milliseconds latency) on a fast graphics card, as long as the faces are similar: https://github.com/iperov/DeepFaceLive

    Wind this forward a couple of years and wonder if it’s covered by the “Something Must Be Done Act” they’re proposing here.

10. So some techy with a grievance against, say, Prince Andrew does a deep fake of him schtooping that 17 year old he paid off. Who’s going to believe it wasn’t him?

    Tupping wasn’t the issue because there’s a statute of limitations on statutory rape. It was her claiming to have been “trafficked” that was the killer.

  11. Tupping wasn’t the issue because there’s a statute of limitations on statutory rape.

    Leaving aside the minor technicality that, as I understand it, the gold-digging whore was of legal age here anyway.

  12. Martin,

    “The technology is advancing faster than most people realise. It’s already possible to do face mapping in near real time (a few hundred milliseconds latency) on a fast graphics card as long as the faces are similar.”

    At which point, this law becomes a bit moot, doesn’t it? It gets a bit like the artificial pancreas which you can’t sell, but you can release all the 3D printing templates, circuit diagrams and software and someone can make it for themselves.

    Someone doesn’t need to share the deepfake, just the how-to. “Put together this person with such-and-such a clip”. Maybe it’ll be a good thing as it’ll encourage kids to get a proper gaming PC instead of a crappy console.

  13. @Roué le Jour
Prince Andrew’s problem is that he is an idiot. His first line of defence should have been to state that he had no recollection of ever having seen that girl, and that he suspected the photo was probably a fake. Note that it’s impossible to prove a good fake isn’t genuine, because any revealing flaws could have been found by an expert and corrected before the photo was released. Add to that, being a “celebrity”, there are countless people who have had photos taken with him – far too many for him to remember more than a tiny fraction – so even if by some chance it was genuine, it’s irrelevant.
    Any questions regarding sexual activity should have been dismissed with a “Of course not. I just told you I have no recollection of ever having seen her”.
