Earlier this month, Google released a report from its advisory council of academic and media luminaries on the application of the “right to be forgotten” ruling. Its most widely reported piece of advice is that Google need only remove links from searches directed at European users.
The report’s position runs as follows. An absolute “right to be forgotten” is a misnomer and is more accurately termed “the right to be delisted.” The opposing considerations of free access to information in jurisdictions outside the EU and of preventing a repressive, walled-garden Internet inside the EU mean that delisting should be limited to European URLs. Locking users into censored results follows a precedent set by regimes we might not want to emulate.
European privacy regulators responded sharply, saying the interpretation was contrary to the true meaning of the judgment, and raised the specter of judicial review.
This is the latest fallout from the CJEU’s ruling last May. The court held that search engines located outside the EU can be subject to EU data protection law and that individuals have the right to have personal data removed from search results where it is “inadequate, irrelevant or no longer relevant.” The counterbalance is the public interest in keeping the information accessible. To summarize crudely: freedom of expression vs. the right to privacy.
The ruling was met with widespread criticism. Platforms were being asked to make decisions that many said belonged in the courts, not the corporate boardroom, and the burden of judgment looked organizationally unworkable. How could Google process the unending stream of complaints? Would it simply accept every request to cut its bureaucratic obligations and avoid the risk of incurring legal damages? Would a “chilling effect” result?
Google was up to the task. Technical feasibility was assisted by its well-established copyright takedown systems. The platform published a form for individuals to submit removal requests, in which complainants are asked, in language echoing the ECJ’s judgment, to explain why a “search result is irrelevant, outdated, or otherwise objectionable.” Paralegals, lawyers and policy professionals process the complaints. Last week, the company released data showing that it had received 220,000 requests and accepted 40% of them.
Google’s ability to implement the EU ruling is no longer such a point of contention, but the debate continues. Much of the intellectual drive for a right to be forgotten has come from Viktor Mayer-Schönberger, Professor of Internet Governance at the University of Oxford, who argues that the Internet’s universal memory subverts a healthy balance of remembering and forgetting. Forgetting, he says, has as important a function as remembering, and without it we risk an unforgiving society, where bygones are never bygones and people must silently self-censor.
Setting “expiration dates” for data has been one of Mayer-Schönberger’s most keenly advocated proposals to bring about a “revival of forgetting”: preventing, for example, an embarrassing photo from your past from jeopardizing your future.
The issue has been characterized as a US vs. Europe standoff – a commitment to the First Amendment set against a European disposition for protecting privacy. However, as Professor Floridi, a member of Google’s advisory council, explains, the positions are not necessarily so opposed: neither side really wants to censor, or to trample one right in favor of another.
The disagreement lies in the means. Clicking the undo button does not necessarily answer the question of where society draws the line of closure in the age of Big Data. What we need, Floridi says, are “new and bold ideas.” He may have a point.
New EU data protection regulations due in 2017 are expected to cement a “right to be forgotten.” The debate over how we should regulate and adjust to big data is set to persist.
Photo Credit: Google Search / Google right to be forgotten