Do you remember the ‘Right to be Forgotten’?


Hopefully you remember Google’s ‘Right to be Forgotten’. A year after the EU ruling that led to it, Google have published some statistics which make for quite interesting reading.


If you had forgotten, here is the article we published about it a few months ago.

As a quick refresher: Google allows people to request that search results for their name which they deem “inadequate, irrelevant, no longer relevant, or excessive” be removed.

The key term in that sentence is ‘request’, as the internet giant have refused the majority of them: 58.7% to be precise. In some cases only part of a request was granted, leaving some search results untouched.

The report included 23 example cases. In one, a surgeon requested that news articles about a botched procedure be removed; of the 50 links requested, only three were removed. In another, the victim of a decades-old crime asked for three articles discussing it to be removed, and Google complied fully.


Is 60% unreasonable?


There has been something of a critical response to Google’s transparency report: some decrying the refusals as completely unreasonable, others complaining that there should be no curation of search results at all.

It’s a difficult one to find a firm positive or negative for: on the one hand, isn’t it reasonable for the victim of a crime to want to forget? On the other, should the information remain available for research and public-interest purposes?

“I think this particular aspect of privacy on the internet is very difficult to manage and therefore I am quite pleased Google is taking a case-by-case approach,” explains Mark James, ESET security specialist.

“This sort of information can be easily misused but on the other hand a lot of information should be available when searching the world wide web for information on individuals or companies.

“With all the options for placing info on the web it’s very easy for incorrect or malicious information to be created and thus stored.”


Shades of grey


It seems that some of the cases are fairly cut and dried: the victim has a right to forget and be forgotten; the public has a right to know about a surgeon’s potentially botched past. But it’s the shades of grey that will make or break the ‘right to be forgotten’.

“Data that is incorrect or could be deemed as slanderous should be removed,” as Mark puts it.

“We have the right to be represented fairly but we should also be able to use the internet to gain information when making decisions.

“The internet is a powerful tool but does need to be managed. With the ability for anyone and everyone to add to its content, I am a firm believer that quality over quantity should be maintained.”

A few questions remain, however: who decides what is slander, what is free speech, what is opinion? What happens if information that is removed turns out to be true? Who ultimately makes the decisions, and do we, as the data-hungry public, trust the curators?


Join the ESET UK LinkedIn Group and stay up to date with the blog.

What do you think of the transparency report?