NO to the surveillance State
mass databases, predictive risk, AI
Harms of big data and surveillance
- Diverts social services away from supportive measures for families who need help.
Collecting ever more information about ever larger numbers of children means that important information about children who are at risk of significant harm gets lost. This leads social work away from a supportive, people-centred model towards compliance with the demands of a standardised system and data capture. Such systems rely on ‘flags’ and risk prediction, leading to inaccurate analysis, to the identification of risk where none exists, and to unnecessary and harmful State intervention.
- Dangers of flagging more children and young people as being ‘at risk’.
An inevitable consequence of the claim that not being in school is a safeguarding issue will be that home educated children, and children who miss any sessions, are flagged within the system as being ‘at risk’, increasing the likelihood of intervention and all the attendant harms. Flagging a child as ‘vulnerable’ in any way is a self-fulfilling prophecy. Agencies such as social services and the police treat children and young people differently because of the labels attached to them, again raising the risk of intervention and of treatment such as that experienced by Child Q and large numbers of other children. For children who already face discrimination, the impacts are likely to be greater.
- Lifelong repercussions and consequences of referral
Even a referral to social services – though the majority come to very little – can have negative impacts. ‘Known to social services’ is a stigmatising label which stays with individuals into adulthood and increases the chance of State intervention. Those who are care experienced, for example, find themselves under considerably increased scrutiny if they become parents, and at higher risk of their children being removed by the State.
- Risk of inaccuracy of child protection information.
The data is only as accurate as the person inputting it – it can be highly subjective and can include bias, hearsay, even fabrication – yet once it is entered into databases it is treated as though it were objective fact. Children and families may not even know about the inaccurate information sitting on their records.
- Impact on the behaviour of children and young people.
Children and young people are unhappy with their personal information being shared, and are less likely to seek help or confide in adults if they fear the information will not remain confidential.
- Children not able to consent to data sharing.
Within the social care system, children’s ability to determine what happens to them is limited. Children cannot meaningfully consent to, or object to, anything the State says is in their best interests. They are neither asked about the sharing of their information nor informed about its potential consequences, despite the risk of life-altering, negative impacts. We have a responsibility to make sure that the systems we put in place for children do not expose them to unnecessary risk, as the Schools Bill does.
“Children merit specific protection with regard to their personal data, as they may be less aware of the risks, consequences and safeguards concerned and their rights in relation to the processing…”
Recital 38, GDPR