Week 4:
Intermediary Liability
An Introduction
Lecture:
- Same sort of question as net neutrality, but from a different angle: whether or not internet intermediaries should be treated as neutral platforms
Key issues:
- What is an ‘intermediary’ or platform, and why might they be (further) regulated?
o Intermediaries are shielded from liability for the fact that illegal content exists on their site, but once they become aware of it they have to take it down. So a dual approach.
- The Status Quo: how digital platforms are regulated, and how they regulate content themselves
o Intermediary Liability Exemptions
o Public Ordering
o Private Ordering - how the intermediary regulates content that they host e.g. having rules that they won’t host
certain content.
The status quo is being widely debated: the concept that intermediaries should be treated as neutral is coming under fire. The UN is pushing intermediary neutrality; the UK's white paper is trying to impose a duty of care on intermediaries; others propose alternative normative approaches.
- Departing from the Status Quo
o UN Recommendations
o The Online Harms White Paper
o Alternative Normative Approaches?
What is an intermediary and why regulate them?
Defining intermediary
- ‘Internet Intermediaries bring together or facilitate transactions between third parties on the Internet. They give access to,
host, transmit and index content, products and services originated by third parties on the Internet or provide Internet-
based services to third parties.’ OECD (2010)
o Wide definition
Intermediaries in the digital ecosystem
[Diagram: Google Search as an intermediary connecting content providers (including vertical search), advertisers and individual users]
- Online behavioural advertising- targeting advertising to users
Types of intermediaries:
- Internet access and service providers, offering wired and wireless access to the Internet;
- Data processing and web hosting providers, offering domain names, web site storage and cloud services;
- Search engines and portals, offering aid to navigation;
- E-Commerce intermediaries;
- Internet payment systems to process online payments;
- Participatory networking platforms.
o OECD (2010)
- Participatory networking platforms raise the most issues: they facilitate freedom of expression, so this is the realm of balancing fundamental rights and fundamental interests
Types of participatory intermediaries
- From wikis, to blogs and virtual worlds.
Intermediary responsibility
- “So, the problem is to establish to what extent intermediaries that contribute to creating a socially beneficial
communication infrastructure, should be liable for misuse of that infrastructure by their users”.
- European Parliament, Report on Intermediaries (2017)
o Should we find intermediaries liable for 'misuse' [a very open term] of their platform? Illegal use or harmful use - 'harmful' is often chosen because it is vaguer
o ISPs are treated as neutral - the question is whether other intermediaries should be allowed the same treatment
Types of misuse:
Hate Speech:
“A direct attack on people based on what we call protected characteristics – race, ethnicity, national origin, religious affiliation, sexual orientation, sex, gender, gender identity and serious disability or disease.”
Facebook, Community Standards
Terrorist content:
YouTube “strictly prohibits content intended to recruit for terrorist organizations, incite violence, celebrate terrorist attacks or otherwise promote acts of terrorism.”
“Content intended to document events connected to terrorist acts or news reporting on terrorist activities may be allowed on the site with sufficient context and intent. However, graphic or controversial footage may be subject to age-restrictions or a warning screen.”
YouTube, Community Guidelines, ‘Violent or Graphic Content’
- Issue: how to define terrorist content
- Both the legal definition and how YouTube defines it will be relevant
o Challenge: separating lawful expression about terrorism from misuse
o Balancing freedom of expression against harmful material
Disinformation
Prohibition on the spread of ‘false’ information disrupting ‘social and economic order’, national unity, or national security.
Companies must monitor networks and report violations to authorities.
China, Cybersecurity Law 2016
- ‘Fake News’
- Could be how the information is packaged- in order to mislead
- Dependent on what is considered false information
o Who gets to decide what is false?
Is it the state? Then freedom of expression issues
- Companies responsible for monitoring network
Morality-based restrictions
- ‘Adult display of nudity and sexual activity’, except for ‘educational, humorous or artistic purposes’. (Facebook)
o Facebook bans woman who shared article on breastfeeding
Other forms of content intermediaries might be held responsible for?
- Defamatory content
o Google having to delist information that infringes someone's data protection rights
- Copyright protected materials
- Harassment/trolling
- Politically sensitive information e.g. Wikileaks
o Is Wikileaks responsible?
o The authors of the information?
- Challenging for platforms
o Don’t have active control over content that is being posted
o Contrast platforms with traditional media, where it makes sense to place responsibility on editors - but platforms have no ex ante control over content
o Need a new way of thinking about these platforms - is it just to impose liability?
Forms of secondary liability
- Strict Liability: Enabling illegal/harmful activity is sufficient
o Immediately responsible
o Could be seen as severe
- Negligence: Enabling activity, plus knowledge of activity (and awareness of its illegality).
o Enabling the activity in itself is not sufficient, but once the intermediary has knowledge of the activity [a question is whether it must also be aware that the activity is illegal], it can be found negligent
- Safe Harbour: No liability, except where there is a specific omission.
o Safe harbour - platforms are neutral as they don’t proactively upload content
- Immunity from sanctions: May be subject to behavioural requirements (terminate activity, mitigate consequences).
o Can never be financially penalised, but behavioural requirements can be imposed, e.g. taking the content down and stopping it from spreading
The status quo:
Illegal speech, harmful speech
- Mid-1990s - stage 1
o ‘These different categories of content pose radically different issues of principle, and call for very different legal
and technological responses. It would be dangerous to amalgamate separate issues such as children accessing
pornographic content for adults and adults accessing pornography about children’.
EU Commission, Communication on Illegal and Harmful Content on the Internet (1996)
- Clear that we needed to separate illegal speech and harmful speech
o Criticism: current proposed approach from UK doesn’t separate out harmful and illegal activity
E-Commerce Exemptions:
- 2000: E-Commerce Directive (2000/31/EC). Implemented in the UK through the E-Commerce Regulations 2002 [KEY PROVISIONS TO KNOW]
o E-Commerce Directive doesn’t talk about platforms- it applies to “information society services”; defined:
‘"Any service normally provided for remuneration at a distance, by means of electronic equipment for
the processing (including digital compression) and storage of data, at the individual request of a
recipient of the service".
Paid services online - normally provided for remuneration.
o But this doesn’t obviously include YouTube etc. - the Directive was drafted with a different internet in mind, assuming paid-for services
o Platforms are free at the point of access - something the law is struggling with.
But platforms do run on economic models (e.g. advertising) - so do they fall within the remit?
It gives three broad carve-outs for ‘information society services’:
E-Commerce Regulations 2002
Three roles that an Information Society Service Provider may play:
mere conduit, caching, and hosting.
Mere conduit – Reg. 17
The ISS does no more than transmit data; in particular, the ISS does not
• initiate transmissions,
• select the receivers of the transmissions,
• select or modify the data transmitted
When acting as a mere conduit, the ISS cannot be liable for damages or for any criminal sanction as a
result of a transmission.
- Here, all the ISS does is transmit data - it cannot be held liable
o No one will object to this
- Caching – Reg. 18
o When the information transmitted is the subject of automatic, intermediate and temporary storage, for the sole
purpose of increasing the efficiency of the transmission of the information to other recipients of the service upon
their request.
Then the ISS won’t be held liable
Wanting to distinguish between temporary technical copying of information for the purposes of transmission, and copying because you want to sell it on the black market or make money from advertising.
- Hosting – Reg. 19: exemption for hosting illegal content until the point at which the host is made aware of it. Called ‘notice and takedown’.
o Where an ISS stores information provided by its customers. It is not liable for damage or criminal sanctions
provided that:
o It did not know that anything unlawful was going on;