Won't someone think of the children? GDPR 'ambiguous' over children

News by Max Metzger

A children's rights expert has called out the GDPR for its lack of clarity over the protection of children's data.

Incoming European data protection regulation has been slammed for its stance on children's rights, or lack thereof. The General Data Protection Regulation (GDPR) does not do enough to protect children, according to John Carr, expert adviser to the European NGO Alliance for Child Safety Online.

Carr recently wrote an open letter to the EU's data protection supervisor Giovanni Buttarelli and the chair of the Article 29 Working Party, Isabelle Falque-Pierrotin, both of whom oversaw the creation of the landmark piece of regulation.

Carr's main concern lies in the GDPR's failure to recognise a difference between the privacy concerns of children and those of adults. Despite children making up one out of every three internet users globally and their data being particularly sensitive, the GDPR does not even define what a child is.

On a much higher level, the UN Convention on the Rights of the Child governs this domain, but Carr told SC Media UK that it “is of very little practical help in steering companies in the right direction with regard to children's data.” The GDPR, added Carr, broadly provides “a good framework for children's right to data protection and privacy although the authorities need to get a move on in terms of spelling out some of the practical details.”

The previous data protection regime, the 1995 Data Protection Directive, was similarly silent on children. The general lack of clarity led to a de facto standard evolving from decisions taken by large social media companies. Businesses that were likely to target children obtained verifiable parental consent, while those which did not specifically want children as customers simply set an age limit of around 13, with some variation between European countries.

The final version of the GDPR's Article 8 gives individual member states the right to set an age of consent anywhere between 13 and 16. If they do not specifically decide, the minimum age defaults to 16.

This decision, wrote Carr, “was completely unexpected and it was made without the benefit of any advice or guidance from privacy practitioners. Neither did the politicians who made the decision seek or obtain any advice from anyone within the child welfare or online child protection communities. This was truly a political decision made with a giant capital ‘P'.”

That stance must be reviewed, says Carr, taking into account “the contemporary internet and the services used by children.”

Grooming, for example, becomes an issue when you take into account that few European countries have ages of consent over 16. Therefore, Carr wrote in a recent blogpost, it could be the case that “anyone who visits or uses a site or service after May 2018 will, on the face of it, be entitled to assume everyone they encounter is old enough to engage in sexual activity.”

“Solely automated” profiling, which the GDPR forbids for children, also becomes a problem. If, writes Carr, processes which are not “solely automated” exist, then “presumably they could lawfully be deployed in relation to children even though they produce an identical or very similar result to processes which are solely automated?”

Chiefly, however, the regulation is riddled with grey areas when it comes to children. For example, Article 35 mandates that impact assessments be carried out when data processing poses a “high risk” to the data subject's rights, but it is unclear whether organisations must treat children as especially vulnerable when conducting them. There is similarly little recognition of the kind of problems that the compromise of VTech products or the easy exploitation of many connected toys present to children.

The GDPR is comprehensive in many other areas. It introduces a wide array of requirements for companies handling personal data within the EU, including breach reporting, pseudonymisation of personal data and the appointment of qualified data protection officers to oversee compliance. One requirement which many have already found arduous is consent.

Those that do not comply could be subject to fines of up to 20 million euros or four percent of global turnover, whichever is higher. New research by global management consultancy Oliver Wyman has shown that for large companies, those fines could climb as high as £5 billion.
