French Court Convicts Google of Defamation for Providing Search Terms that Suggested Plaintiff was a Rapist

September 27th, 2010

From the UK Telegraph:

Google has been convicted of defaming a French computer user after the Internet technology giant linked his name to the word “rapist” in automatic web searches.

Court documents said the function, which suggests options and phrases as a user types, linked the man’s identity to words including “rapist”, “satanist”, “rape” and “prison”.

According to French reports, the man was convicted of the “corruption of a minor” and sentenced to three years in jail earlier this year.

But in what is believed to be a landmark decision, the Superior Court of Paris found Google guilty of the “public slandering of a private individual”. Google said it will appeal the decision, which also named Eric Schmidt, its chief executive.

The court ruled the man had been defamed because he is considered innocent under French law until all of his appeals have been exhausted.

Google disagrees, and argues that it is not liable because it has no control over its search engine (lol):

A Google spokeswoman said the company would appeal the French court decision.

She said the “Google Suggest” function reflected the most common terms used in the past with the words entered, and that the terms were not being suggested by the company.

“These searches are algorithmically determined based on a number of purely objective factors including (the) popularity of search terms,” she said.

“Google does not suggest these terms. All of the queries shown in Autocomplete have been typed previously by other Google users.”

Oh the irony. Just last month Eric Schmidt suggested that in the future people would have to change their name in order to escape their online reputations. I’m sure Google would recommend that the Plaintiff simply change his name. That would solve all of the problems, right?

I would also quibble with Google’s notion that their “suggest” function is totally autonomous. Skynet is not online–yet. At some level, an actual human being programmed that feature, and (putting aside all legal issues), I am sure Google could program their servers to deliver modified results.
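
To make that concrete, here is a minimal sketch, in Python, of what a post-filtering step might look like. Everything in it is made up for illustration: the `filter_suggestions` function and the `BLOCKED_TERMS` list are hypothetical, not Google’s code. The only point is that an automatically generated list of suggestions can trivially be run through a human-maintained filter before it is displayed.

```python
# Hypothetical sketch: automatically generated suggestions can be post-filtered
# against a human-maintained blocklist before they are shown to the user.
# None of this is Google's actual code.

BLOCKED_TERMS = {"rapist", "satanist"}  # e.g. terms a court has ordered removed


def filter_suggestions(suggestions: list[str]) -> list[str]:
    """Drop any suggestion containing a blocked term; pass the rest through."""
    return [s for s in suggestions if not any(t in s.lower() for t in BLOCKED_TERMS)]


print(filter_suggestions(["john doe rapist", "john doe biography"]))
# ['john doe biography']
```

However the underlying suggestions are generated, a step like this is a design choice that a person makes, which is exactly the quibble.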

This case raises some serious privacy issues. If Google were to create a static web page linking this person to the word rape, a libel case would be an easier call. But Google is using their intuitive powers to guess what people are thinking, and in doing so suggesting that this person may in fact be a rapist. Are these two situations analogous? And if so, what should we do about the possible threats to a person’s reputation?

As I wrote in Omniveillance:

One of the greatest threats that a lack of privacy protections in public places poses is the potential to damage a reputation.167 Personal information that is taken out of context can often lead to unfair judgments that can prevent learning more about a person’s character.168 … Like an elephant, the Internet never forgets.

Dan Solove has also written:

The virtue of knowing less shows that “[a]lthough more information about a person might help enrich our understanding of that person, it might also lead us astray, since we often lack the whole story.”177

Similarly, Blackstone wrote:

The security of his reputation or good name from the arts of detraction and slander, are rights to which every man is intitled [sic], by reason and natural justice; since without these it is impossible to have the perfect enjoyment of any other advantage or right.

And Shakespeare had this to say in Othello:

Reputation, reputation, reputation! O, I have lost my reputation! I have lost the immortal part of my self and what remains is bestial.

H/T Gizmodo

Update: I found the opinion and used (I appreciate the irony) Google Translate to translate it to English. I do not speak French so I cannot vouch for the accuracy of the translation.

Here is how Google characterizes the autonomous nature of their servers:

Eric S. and Google Inc. produce a certificate from David K., the person responsible for these products, indicating:
- that they operate in a purely automatic manner, from a database which lists the queries actually entered on Google during a recent period by a minimum number of users with similar linguistic and territorial preferences,
- that the results displayed depend on an algorithm based on the queries of other users, without any human intervention or reclassification of those results by Google,
- that the order of the queries is entirely determined by the number of users who have entered each of them, with the most frequent appearing at the top of the list.
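
For what it is worth, the mechanism the certificate describes is easy to picture. The sketch below is mine, not Google’s: a hypothetical `suggest` function over a made-up query log, which returns only queries other users have actually typed, requires a minimum number of distinct users, and ranks by frequency, as the certificate claims.

```python
# A minimal sketch (mine, not Google's) of the frequency-ranked autocomplete
# the certificate describes: suggestions come only from queries other users
# have actually typed, subject to a minimum user count, ranked by popularity.

# Hypothetical query log: (user_id, query) pairs from a recent period.
QUERY_LOG = [
    ("u1", "paris weather"), ("u2", "paris weather"), ("u4", "paris weather"),
    ("u3", "paris metro map"), ("u5", "paris metro map"),
    ("u6", "parisian cafes"),
]

MIN_USERS = 2  # suggest only queries entered by at least this many distinct users


def suggest(prefix: str, log=QUERY_LOG, limit: int = 5) -> list[str]:
    """Return past queries starting with `prefix`, ranked by distinct-user count."""
    users_per_query: dict[str, set[str]] = {}
    for user, query in log:
        if query.startswith(prefix.lower()):
            users_per_query.setdefault(query, set()).add(user)
    ranked = sorted(
        (q for q, users in users_per_query.items() if len(users) >= MIN_USERS),
        key=lambda q: len(users_per_query[q]),
        reverse=True,
    )
    return ranked[:limit]


print(suggest("paris"))  # ['paris weather', 'paris metro map']
```

Note that even in this toy version, the ranking rule, the minimum-user threshold, and the decision not to filter anything are all choices a human made, which is the court’s first observation below.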

Here are some of the French court’s observations, comparing the Google search to the Yahoo search, which generated different results:

It will be noted, as a preliminary matter regarding the defendants’ technical argument:
- that algorithms and software solutions proceed from the human mind before being implemented,
- that it is interesting to observe, with the applicant, that a similar service offered by another search engine (Yahoo) delivers, for an identical search on his name and surname, quite different results,
- that, far from the technological neutrality claimed by the two services and called into question by their very wording, the search items in question are undoubtedly capable of directing curiosity or calling attention to the themes they offer or suggest, and in so doing are likely to cause a “snowball effect” all the more injurious to the person concerned, since the most catchy wording will soon end up at the top of the list of proposed searches,

And what is the remedy? Specific performance.

The request for removal of the suggestions and proposals at issue will be granted, under a penalty of €500 per violation per day, upon the expiration of a period of one month from the service of this decision.

Yeah, let’s see Google listen to this Court.