In its decision of 9 March 2017 in Rolf
Anders Daniel Pihl v Sweden, the ECtHR has clarified the limited liability of
operators of websites or online platforms containing defamatory user-generated
content. The Court’s decision is also interesting in the context of the current
discussion on how to prevent or react to ‘fake news’, and the policy on
involving online platforms in terms of liability for posting such messages.
Although the Court’s ruling expresses concerns about imposing liability on
internet intermediaries that would amount to requiring excessive and
impractical forethought, which is capable of undermining the right to impart
information via the internet, the decision in Pihl v Sweden itself
guarantees only minimal protection for the rights of internet intermediaries
and their users.
Context
The decision in Pihl v
Sweden builds on the judgments in Magyar Tartalomszolgáltatók
Egyesülete and Index.hu Zrt v Hungary and the Grand Chamber’s judgment
in Delfi AS v Estonia.
This time, however, the applicant was not the owner or operator of an online
platform with users’ comments, complaining about an interference with its right
to freedom of expression under Article 10 of the Convention. In Pihl v Sweden the applicant complained
of a breach of his right to privacy and reputation under the ECHR, Article 8,
because the Swedish authorities had refused to hold the operator of a website
liable for a defamatory blog post and an anonymous online comment.
Facts
A blog post accusing Mr.
Pihl of being involved in a Nazi party was published on a Swedish website. The blog on which the post appeared was
a small one run by a non-profit association. Although the blog allowed comments
to be posted, it was clearly stated by the operating association that such
comments were not checked before publication and that commentators were
responsible for their own statements. Commentators were therefore requested to ‘display
good manners and obey the law’. The day after publication of the post, an
anonymous person posted a comment stating that ‘that guy Pihl is also a real
hash-junkie according to several people I have spoken to’.
Nine days later Pihl posted a comment on the blog in reply
to the above comment and blog post about him, stating that the information in
the blog post and comment was wrong and should immediately be removed. The
following day the blog post and the comment were removed and a new post was
added on the blog by the association – stating that the earlier post had been
wrong and based on inaccurate information – and it apologised for the mistake.
However, Pihl sued the association and claimed symbolic damages of 1 Swedish
krona. He submitted that the post and the comment constituted defamation, and
that the association was responsible for the fact that both the blog and the
comment remained on the website for nine days.
The District Court rejected Pihl’s claim. It found that the
comment constituted defamation, but it found no legal grounds on which to hold
the association responsible for failing to remove the blog post and comment
sooner than it had done. The Court of Appeal confirmed this judgment and the
Supreme Court refused Pihl leave to appeal. A short time later, Pihl lodged
an application with the Chancellor of Justice for payment of damages on the
basis that the Swedish State had failed in its positive obligations under the
ECHR, Article 8 through the national courts’ decision not to hold the
association responsible for the defamatory comment against him. In 2015 the
Chancellor of Justice rejected the application as he found that it could not be
deduced from the European Court’s case-law – Delfi AS v Estonia
was pending before the Grand Chamber at that time – that there was an absolute
obligation on States to have legislation in place enabling the person
responsible for a blog and the comments on it to be held accountable in each
individual case.
It is worth mentioning that defamation in Sweden is still a
criminal offence under Chapter 5, Section 1 of the Penal Code. However, there is
also specific legislation on liability and obligations for removal upon notice
of certain content on online platforms. Section 5 of the Act on Responsibility
for Electronic Bulletin Boards, concerning the obligation to erase certain
messages, states:
‘If a user submits a message to an
electronic bulletin board, the supplier of the service must remove the message
from the service, or in some other way prevent its further dissemination, if
1. the message content is obviously such as is referred to in the Penal Code,
Chapter 16, Section 5, about inciting rebellion, Chapter 16, Section 8, about
agitation against a national ethnic group, Chapter 16, Section 10a, about child
pornography crime, or Chapter 16, Section 10b, about unlawful depiction of
violence, or
2. it is obvious that the user has, by submitting the message, infringed the
copyright or other right protected by Section 5 of the Copyright (Artistic and
Literary Works) Act (1960:729).’
As the duty to erase certain types of obviously illegal content,
such as hate speech, child pornography, incitement to violence and copyright
infringements, did not extend to defamation or breach of privacy, Section 5 of
the Act on Responsibility for Electronic Bulletin Boards was not applicable in
the case at issue. Nor could the association or its legal representative be
convicted of defamation, either as principal or as accomplice, under the Penal
Code or Section 5 of the Act.
Pihl complained under Article 8 of the Convention that the
fact that Swedish legislation prevented him from holding the association
responsible for the defamatory comment had violated his right to respect for
his private life.
The Court’s reasoning and decision
First, the Court reiterates that a person’s right to
protection of his or her reputation is encompassed by Article 8 as part of the
right to respect for private life, although, for Article 8 to come into
play, the attack on personal honour and reputation must attain a certain level
of seriousness and must have been carried out in a manner causing prejudice to
personal enjoyment of the right to respect for private life. The Court
considers that the comment, although offensive, certainly did not amount to
hate speech or incitement to violence, but it accepts the national courts’
finding that the comments at issue constituted defamation and, consequently,
fell within the scope of Article 8.
Next, the Court refers to its Grand Chamber judgment in Delfi AS v Estonia in
which it explained how to balance the conflicting rights protected under
Articles 8 and 10, including its approach that ‘where the balancing exercise
between those two rights has been undertaken by the national authorities in
conformity with the criteria laid down in the Court’s case-law, the Court would
require strong reasons to substitute its view for that of the domestic courts’
(see also Axel Springer AG v Germany
and Von Hannover (No. 2) v
Germany). Referring to Magyar Tartalomszolgáltatók
Egyesülete and Index.hu Zrt v Hungary and protagonists playing an
intermediary role on the internet, the Court sums up a set of specific aspects
that are relevant for the concrete assessment of the interference in question: ‘the
context of the comments, the measures applied by the company in order to
prevent or remove defamatory comments, the liability of the actual authors of
the comments as an alternative to the intermediary’s liability, and the
consequences of the domestic proceedings for the company’ (at [28]).
Analysing these factors step by step, the ECtHR scrutinises
whether the Swedish judicial authorities achieved a fair balance between Pihl’s
right to respect for his private life under Article 8 and the association’s
right to freedom of expression guaranteed by Article 10 of the Convention.

As regards the context of the comment, the Court notes that
the underlying blog post accused Pihl, incorrectly, of being involved in a Nazi
party. However, the post was removed and an apology published when the
applicant notified the association of the inaccuracy of the post. The comment
about Pihl being a ‘real hash-junkie’ did not concern his political views and
had nothing to do with the content of the blog post. It could therefore hardly
have been anticipated by the association. The Court attaches particular
importance to the fact that the association is a small non-profit association,
unknown to the wider public, and it was thus unlikely that it would attract a
large number of comments or that the comment about Pihl would be widely read.
The ECtHR also considers that ‘expecting the association to assume that some
unfiltered comments might be in breach of the law would amount to requiring
excessive and impractical forethought capable of undermining the right to
impart information via internet’ (at [31]).

As regards the measures taken by
the association to prevent or remove defamatory comments, the Court notes that
it was clearly stated on the blog that the association did not check such
comments before they were published and that commentators were responsible for
their own statements. Commentators were also requested to display good manners
and obey the law. Moreover, the Court observes that the association removed the
blog post and the comment one day after being notified by Pihl that the post
was incorrect and that he wanted the post and the comment removed. The
association furthermore posted a new blog post with an explanation for the
error and an apology. The Court also refers to its earlier case law in which it
held that ‘liability for third-party comments may have negative consequences on
the comment-related environment of an internet portal and thus a chilling
effect on freedom of expression via internet. This effect could be particularly
detrimental for a non-commercial website’.

Turning to the liability of the
originator of the comment, the Court observes that Pihl obtained the IP-address
of the computer used to submit the comment. However, he has not stated that he
took any further measures to try to obtain the identity of the author of the
comment.

Lastly, the Court notes that Pihl’s case was considered on its merits
by two judicial instances at the domestic level before the Supreme Court
refused leave to appeal. Moreover, the Chancellor of Justice examined Pihl’s
complaint under Article 8 of the Convention, referring to the Court’s case-law
and the need to balance the interests under Article 8 and Article 10, before finding
that the case did not disclose a violation of his rights under Article 8. The
Court further observes that the scope of responsibility of those running blogs
is regulated by domestic law and that, had the comment been of a different and
more severe nature, the association could have been found responsible for not
removing it sooner.
In its overall conclusion the ECtHR emphasises the fact that
the comment, although offensive, did not amount to hate speech or incitement to
violence and was posted on a small blog run by a non-profit association which took
it down the day after the applicant’s request and nine days after it had been
posted. In view of this, the Court finds that the domestic courts acted within
their margin of appreciation and struck a fair balance between the applicant’s
rights under Article 8 and the association’s opposing right to freedom of
expression under Article 10. The Court therefore declared the application
inadmissible as manifestly ill-founded.
Comments
In Pihl v Sweden,
in contrast with Delfi AS v Estonia and
Magyar Tartalomszolgáltatók
Egyesülete and Index.hu Zrt v Hungary, the ECtHR does not seem to
require, at least not from small non-profit associations operating a website or
online platform open for users’ comments, either the pre-monitoring of all
content or the installation of an effective notice-and-take-down system. Indeed, in the case
at issue, it was clearly stated by the operating association that the content
of users’ comments was not checked before publication and that commentators
were responsible for their own statements, while only being requested to ‘display
good manners and obey the law’. There are no indications that a
notice-and-take-down procedure had been put in place, although the platform
clearly reacted promptly to remove the incorrect, offensive and defamatory
messages, even accompanying the removal with an apology by the association.
This reaction in itself was
enough to exonerate the association from liability in the context of the case
at issue. The Court’s decision in the case of Pihl v Sweden leaves open the question whether such a removal,
possibly accompanied by an apology, is necessary to exonerate the operators
from liability, or whether in other situations a rectification, right of reply
or other way of correcting the ‘false’ allegations might be a more appropriate,
sufficient and proportionate way in respect of the internet intermediaries’ and
their users’ right to freedom of expression and information.
A crucial, if not decisive element in the Court’s ruling, is
derived from the content itself of the contested online messages, which
according to the ECtHR ‘did not amount to hate speech or incitement to violence’.
This approach by the Court, in line with Magyar, confirms the
need to make a distinction in the levels of liability for internet
intermediaries. Indeed online internet platforms should only be held liable
when they have failed to act expeditiously when it concerns ‘clearly unlawful
content’, and more precisely when illegal hate speech and incitement to
violence has been posted on their website or platform. This means that in cases
of breach of privacy, libel or defamation internet intermediaries are not to be
held liable for users’ comments when, upon notice, the messages at issue have
been promptly removed. The decision in Pihl v Sweden makes this exoneration
categorical, even in cases of clearly defamatory comments without any public
interest, while in Magyar the Court still
referred to the specific characteristics of the comments at issue, holding that
they were related to a matter of public interest, using a style that was common
in communication on many internet portals and only concerned the
commercial reputational interests of a company.
The consequence of the decision in Pihl v Sweden remains
however the same as in the Court’s previous judgments in Delfi and Magyar: an
expeditious removal upon notice of comments with ‘clearly unlawful content’,
such as illegal hate speech and incitement to violence, will not exonerate
internet intermediaries from liability, as the exoneration is only valid in
cases where it does not concern such content. This means that operators of such
websites can fully protect themselves against criminal liability for
incitement to hatred, discrimination or violence only by installing a system of
pre-monitoring of all users’ comments and by filtering or removing clearly
unlawful content on their own initiative.
The most important characteristic of the decision in Pihl however is that it seems to reserve
the limited liability for defamatory users’ comments only for small non-profit
websites. While in its judgment of 2 February 2016 in Magyar the
ECtHR did not connect decisive consequences to the different characteristics of
the online platforms at issue (Index.hu Zrt being run by a commercial company
and being one of the major Internet news portals in Hungary, while MTE is a
non-commercial website of a self-regulatory body of Internet content
providers), in Pihl the ECtHR
emphasises the small and non-profit character of the association at issue as a
crucial aspect in limiting or even conditionally excluding its liability. By
narrowing the exoneration to small non-profit operators of online platforms,
the Court’s case law leaves a broad opening for the Member States to impose
liability on all other online platforms for user-generated content, including
defamatory content, even in cases of expeditious removal upon notice.
We have previously expressed concern about pushing internet
intermediaries further in the direction of private censorship, and that the
burden on private actors to pre-monitor user-generated content and possibly
remove some of it, with a lack of clear criteria, a lack of transparency and no
effective procedural guarantees, creates a clear and present danger for the
right to freedom of expression on the Internet. This concern is amplified in
the context of current policies imposing more liability on internet
intermediaries for content that can be considered as ‘fake news’, propaganda
and hate speech. Similar concerns have recently been voiced in the Joint
Declaration on Freedom of Expression and ‘Fake News’, Disinformation and
Propaganda by the United Nations (UN) Special Rapporteur on Freedom of
Opinion and Expression, the Organization for Security and Co-operation in
Europe (OSCE) Representative on Freedom of the Media, the Organization of
American States (OAS) Special Rapporteur on Freedom of Expression and the
African Commission on Human and Peoples’ Rights (ACHPR) Special Rapporteur on
Freedom of Expression and Access to Information. This joint declaration of 3
March 2017 put forward as a general principle that ‘intermediaries should never
be liable for any third party content relating to those services unless they
specifically intervene in that content or refuse to obey an order adopted in
accordance with due process guarantees by an independent, impartial,
authoritative oversight body (such as a court) to remove it and they have the
technical capacity to do that’.
Dirk Voorhoof, Human Rights Centre Ghent University (Belgium),
Copenhagen University (Denmark), Legal Human Academy and member of the
Executive Board of the European Centre for Press and Media Freedom (ECPMF,
Germany)
This is an edited version of a blog post which originally
appeared on the Strasbourg
Observers blog and is reproduced with permission and thanks.