Back in May last year, the ECJ ruled, in the ‘Google Spain v AEPD and Mario Costeja Gonzalez' case, that Google must remove search results linking to information about an individual. That individual was Mario Costeja Gonzalez, who successfully argued that links to a 1998 auction notice should be removed because he had moved on with his life and had a right to privacy.
Google begrudgingly obeyed, and the ruling has been wide-reaching ever since, with Google and other search engines effectively bound, under the existing European Data Protection Directive, to delete information that is deemed “inadequate, irrelevant or no longer relevant.”
This became more commonly known as the ‘Right to be Forgotten', a key component of the forthcoming EU Data Protection Regulation, and internet users have submitted a flood of requests ever since.
Yesterday marked the anniversary of that ruling, and recent history suggests that this has been a tricky period for Google, which has not only been inundated with takedown requests, good and bad, but has continually faced questions over its censorship decisions, its transparency and the ruling's geographical reach (some have called for requests to cover the US-based Google.com too).
A debate on censorship was always likely with Google effectively deciding what was right and wrong, and that was only heightened by the firm's own figures – it has approved only 41.3 percent of the nearly one million (922,638) URL removal requests received over the last year.
The UK's Information Commissioner's Office (ICO) has taken issue with Google's decisions too, saying that there are at least 50 cases where the US firm has got it wrong. The ICO says it has handled more than 180 complaints from people unhappy that their requests to Google had been turned down, although the regulator concedes that Google got it right in three-quarters of those cases.
"This suggests that, for the most part, Google is getting the balance right between the protection of the individual's privacy and the interests of internet users,” said David Smith, the deputy commissioner of the ICO.
"However, this still leaves a significant number of cases where we believe Google hasn't got it quite right and it has been asked to revise its decision." Smith added that there are still disputes to be resolved, and hinted that the ICO could use its enforcement powers, which include levying a fine, should Google not choose to act.
Censorship is only one part of the problem, however, with transparency another topical issue. Yesterday, 80 high-profile academics, including Oxford University's Ian Brown, Cambridge's Julia Powles and Kent's Eerke Boiten, wrote an open letter to the search giant calling for the company to be more transparent and to release compliance data. They called on Google to be more open about which requests are granted or refused, and how this differs on a geographical basis.
“While Google's decisions will seem reasonable enough to most, in the absence of real information about how representative these are, the arguments about the validity and application of the RTBF are impossible to evaluate with rigour,” reads the letter.
“Beyond anecdote, we know very little about what kind and quantity of information is being delisted from search results, what sources are being delisted and on what scale, what kinds of requests fail and in what proportion, and what are Google's guidelines in striking the balance between individual privacy and freedom of expression interests.”
A Google spokesperson told Wired that the company "will consider these ideas, weighing them against the various constraints within which we have to work -- operationally and from a data protection standpoint".
"We launched a section of our Transparency Report on these removals within six months of the ruling because it was important to help the public understand the impact of the ruling," said the spokesperson. "Our Transparency Report is always evolving and it's helpful to have feedback like this so we know what information the public would find useful."
Alexander Hanff, CEO of Belize-based Think Privacy Inc and formerly of Privacy International, told SCMagazineUK.com that Google had arguably done “nothing right”, showing “utter contempt” for the EU ruling and engaging in a European “dog and pony show with the intended effect of misrepresenting the Court ruling in such a way as to sway public and political opinion in order to undermine that ruling.”
“They have focused on wrongly claiming the ruling is an affront to freedom of speech by failing to acknowledge the very sensible public interest caveats which were emphasised by the courts,” said Hanff.
He added that Google should start “behaving like responsible corporate citizens, obey the law and work with regulators to ensure that a framework is put in place which makes it easy for requests covered by the ruling to be made.”
“They can be more transparent about the types of requests being made, how they are being managed, which types of requests are accepted and which are not, and where the requests are coming from - instead of just cherry-picking cases which appear to support their spurious arguments against the ruling. These arguments have been put forward by a collection of academics today and they are valid concerns.”
So why are other search engines not receiving the same criticism? “I think the reason others are not getting criticised is because they have not engaged in a campaign to undermine the ruling and are not making a public show of the fact,” said Hanff, although he encourages other companies to be more transparent about these requests.
There was, however, one ray of light for Google this week on the Right to be Forgotten. Finnish broadcaster YLE reports that the company has won its first case on the matter in the country, with Finland's Data Protection Ombudsman ruling that Google had no duty to remove search results relating to a Finnish man's business mistakes.
The Right to be Forgotten is a key component of the incoming EU General Data Protection Regulation, although regulators are currently ruling on it under the existing Data Protection Directive (95/46/EC). Article 7 sets out when personal data may be processed, Article 12 provides a right to erasure and blocking of data, while Article 14 covers the right to object.