Councils are said to be using hundreds of thousands of people’s data to try to predict child abuse, it has emerged.
Five local authorities (Thurrock, Brent, Bristol, Hackney and Newham) are accused of using 377,000 people’s data to create an algorithm that would allow social workers to intervene with families perceived as needing attention from child services.
Among the information gathered are school attendance and exclusion records, housing association repairs and arrears information, and police records on antisocial behaviour and domestic violence, according to The Guardian.
But the Information Commissioner’s Office (ICO) told The Telegraph it was looking into the practice.
A spokesman said: “All organisations have a duty to look after personal information in their care but records involving children – often sensitive personal data – require particularly robust measures.
“The use of predictive analytics for child safeguarding is clearly an activity that is likely to have a significant impact on the privacy of individuals.
“We would therefore expect any council using such technology to have fully considered the privacy risks, including conducting a thorough Data Protection Impact Assessment, and to have taken steps to address those risks.
“We will be making further enquiries to ensure that the use of this technology is compliant with data protection law.”
It’s an obvious enough area to try using AI to predict problems: what are the common factors, what are the warning signs?
We could even help write the decision tree.
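The “decision tree” idea is literal enough to sketch. Below is a minimal, hand-rolled rule set over invented features loosely resembling the data types the article lists (attendance, arrears, antisocial-behaviour records); every feature name and threshold here is hypothetical, not anything the councils actually use.

```python
# Hypothetical sketch only: a hand-written decision tree over invented
# indicators. Feature names and thresholds are made up for illustration.

def risk_flag(record):
    """Return True if the record trips the (invented) referral rules."""
    if record["school_attendance_pct"] < 80:
        return True  # poor-attendance branch
    if record["rent_arrears_months"] >= 3 and record["asb_reports"] > 0:
        return True  # arrears combined with antisocial-behaviour reports
    return False

families = [
    {"school_attendance_pct": 95, "rent_arrears_months": 0, "asb_reports": 0},
    {"school_attendance_pct": 72, "rent_arrears_months": 1, "asb_reports": 0},
    {"school_attendance_pct": 90, "rent_arrears_months": 4, "asb_reports": 2},
]

flags = [risk_flag(f) for f in families]
print(flags)  # [False, True, True]
```

In practice such systems are usually learned from data rather than hand-written, which is exactly why the ICO’s point about impact assessments matters: the thresholds come out of the 377,000 records, not a social worker’s head.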
Has Mummy got a new scrote boyfriend?