In July 2019, Cori Crider, a lawyer, investigator and activist, was introduced to a former Facebook employee whose work monitoring graphic content on the world’s largest social media platform had left deep psychological scars. As the moderator described the fallout of spending each day watching gruesome footage, Crider was first struck by the depth of their pain, and then by a creeping sense of recognition.

After a 15-year career defending detainees of Guantanamo Bay, Crider had learned the hallmarks of post-traumatic stress disorder. But unlike Crider’s previous clients, the moderator had not been tortured, extradited or detained. They had simply watched videos to decide if they were appropriate for public consumption.

Well, yes, OK.

The people who have to watch what the rest of us can't see may well suffer for having done so. One argument against the death penalty is what it does to those who must carry out the execution. Lord knows what the sex lives of film censors are like.

So, having diagnosed this problem, what should we do?

A month earlier, Crider had co-founded Foxglove, now a four-woman team of lawyers, community activists and tech experts dedicated to fighting for “tech justice”. It wages legal battles against the increasing use of opaque and discriminatory algorithms in government decision-making; the spread of harmful technologies, such as facial recognition software; and the vast accumulation of power by tech giants.

Campaigning against the use of AI doesn’t quite sound like the logical solution.
