March 7, 2021

The Slope Gets More Slippery As You Expect Content Moderation To Happen At The Infrastructure Layer

By Konstantinos Komaitis
What a week the first week of January has been! As democracy and its institutions were tested in the United States, so were the Internet and its actors.

Following the invasion of Capitol Hill by protesters, social media companies started taking action in what appeared to be a ripple effect: first, Twitter permanently suspended the account of the President of the United States, while Facebook and Instagram blocked his account indefinitely and, at a minimum, through the end of his term; Snapchat followed by cutting access to the President’s account, and Amazon’s video-streaming platform Twitch took a similar action; YouTube announced that it would tighten its election-fraud misinformation policy in a way that would allow it to take immediate action against the President should he post misleading or false information. In the meantime, Apple also announced that it would kick Parler, the social network favored by conservatives and extremists, off its app store on the basis that the app was promoting violence that threatened the integrity of US institutions.

It is Amazon’s decision, however, to kick Parler off its web hosting service that I want to turn to. Let me first make clear that if you are Amazon, this decision makes total sense from a business and public relations perspective: why would anyone want to be associated with anything that even remotely touches on extremism? The decision also falls within Amazon’s permissible scope given that, under its terms of service, Amazon reserves the right to terminate users from its network at its sole discretion. Similarly, from a societal point of view, Amazon may be seen as upholding most people’s values. But I want to offer another perspective here. What about the Internet? What sort of message does Amazon’s decision send to the Internet and everyone who is watching?

There are several actors participating in the way a message – whether an email, cat video, voice call, or web page – travels through the Internet. Each one of them might be considered an “intermediary” in the transmission of the message. Examples of Internet infrastructure intermediaries include Content Delivery Networks (CDNs), cloud hosting services, domain name registries, and registrars. These infrastructure actors are responsible for a range of different things, from managing network infrastructure, to providing access to users, to ensuring the delivery of content. These mostly private-sector companies provide the investment, as well as the reliability and upkeep, behind the services we all use.
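To make that chain of intermediaries a little more concrete, here is a minimal Python sketch. The hostname is a placeholder and the list of roles simply restates the categories above; it is an illustration of the layers a request crosses, not a description of any particular provider’s setup.

```python
import socket

HOSTNAME = "www.example.com"  # placeholder site, not any service discussed above

# Resolving the name already involves several infrastructure intermediaries:
# the registry operating the top-level domain, the registrar that sold the
# name, and the recursive resolver answering this query.
addresses = sorted({info[4][0] for info in socket.getaddrinfo(HOSTNAME, 443)})
print(f"{HOSTNAME} resolves to: {addresses}")

# Delivering the page then typically involves further intermediaries,
# roughly in this order (roles only; no specific provider is implied).
for role in ("access ISP", "transit networks", "CDN edge node",
             "cloud hosting provider running the origin server"):
    print("handled by:", role)
```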

In the broadcasting world, a carrier also controls the content being broadcast; on the Internet, however, an actor responsible for the delivery of infrastructure services (e.g. an Internet Service Provider or a cloud hosting provider) is neither likely nor expected to be aware of the content of the messages it carries. These actors simply do not care about the content; it is not their job to care. Their one and only responsibility is to relay packets across the Internet to their destinations. Even if, for the sake of argument, they were to care, at the end of the day they are not the producers of the content. Like postal and telephone services, they have the essential role of carrying the underlying message efficiently.
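For readers who like code, a rough Python sketch of that point follows. The routing table and addresses are invented (drawn from the documentation ranges), so this is not any real router’s logic, only an illustration that a forwarding decision reads the header and never the payload.

```python
from dataclasses import dataclass

@dataclass
class Packet:
    src_ip: str
    dst_ip: str
    payload: bytes  # the actual message: email, video, web page

# Hypothetical routing table mapping destination prefixes to next hops.
ROUTES = {
    "203.0.113.": "next-hop-A",
    "198.51.100.": "next-hop-B",
}

def forward(packet: Packet) -> str:
    """Pick a next hop from the destination address alone.

    Note that packet.payload is never read: the carrier neither knows
    nor cares what the message says.
    """
    for prefix, next_hop in ROUTES.items():
        if packet.dst_ip.startswith(prefix):
            return next_hop
    return "default-gateway"

print(forward(Packet("192.0.2.7", "203.0.113.10", b"any content at all")))
```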

Over the past year, the role and responsibility of intermediaries has been placed under the policy microscope. The focus is currently on user-generated content platforms, including Facebook, Twitter and YouTube. In the United States, policy makers on both sides of the aisle have been reconsidering the role of intermediaries in disseminating dis- and misinformation. Section 230, the law that has systematically, consistently and predictably shielded online platforms from liability over the content their users post, has been highly politicized, and change now seems almost inevitable. In Europe, after a year of intense debate, the newly released Digital Services Act has largely upheld the long-standing intermediary liability regime, but there are still implementation details that could see some change (e.g. the provisions on ‘trusted flaggers’).

It is actions like the one Amazon took against Parler, however, that go beyond issues of speech alone and can set a precedent with an adverse effect on the Internet and its architecture. By refusing to provide cloud hosting services, Amazon is essentially taking Parler offline and denying it the ability to operate, unless the platform can find another hosting service. This might be seen as a good thing, prima facie; at the end of the day, who wants such content to even exist, let alone circulate online? But it does send quite a dangerous message: since infrastructure intermediaries can take action that cuts the problem off at its root (i.e. taking a service completely offline), regulators might start looking to them to “police” the Internet. In such a scenario, infrastructure intermediaries would have to deploy content-blocking measures, including IP and protocol-based blocking, deep packet inspection (i.e. viewing the content of “packets” as they move across the network), and URL and DNS-based blocking. Such measures ‘over-block’, imposing collateral damage on legal content and communications. They also interfere with the functioning of critical Internet systems, including the DNS, and compromise Internet security, integrity, and performance.
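A toy example of why such measures over-block, with invented site names and a made-up shared-hosting layout: blocking at the IP or DNS level removes everything that happens to share that piece of infrastructure, not just the targeted content.

```python
# Hypothetical shared-hosting layout: one IP address (from the documentation
# range) serving many unrelated sites, as is common with shared hosting and CDNs.
SHARED_HOSTING = {
    "192.0.2.50": [
        "offending-site.example",
        "community-bakery.example",
        "local-library.example",
    ],
}

def sites_lost_if_ip_blocked(blocked_ip: str) -> list[str]:
    """Return every site that disappears when a single IP is blocked."""
    return SHARED_HOSTING.get(blocked_ip, [])

# Blocking the IP to reach one site takes its unrelated neighbours down too.
print(sites_lost_if_ip_blocked("192.0.2.50"))
```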

What Amazon did is not unprecedented. In 2017, Cloudflare took a similar action against the Daily Stormer website when it stopped answering DNS requests for its sites. At the time, Cloudflare said: “The rules and responsibilities for each of the organizations [participating in the Internet] in regulating content are and should be different.” A few days later, in an op-ed published in the Wall Street Journal, Cloudflare’s CEO, Matthew Prince, said: “I helped kick a group of neo-Nazis off the internet last week, but since then I’ve wondered whether I made the right decision. […] Did we meet the standard of due process in this case? I worry we didn’t. And at some level I’m not sure we ever could. It doesn’t sit right to have a private company, invisible but ubiquitous, making editorial decisions about what can and cannot be online. The pre-internet analogy would be if Ma Bell listened in on phone calls and could terminate your line if it didn’t like what you were talking about.”

Most likely, Amazon faced the same dilemma; or maybe it did not. One thing, however, is certain: so far, none of these actors appears to be considering the Internet and how some of their actions may affect its future and the way we all may end up experiencing it. It is becoming increasingly important that we start looking into the subtle, yet extremely significant, differences between moderation by user-generated content platforms and moderation by infrastructure providers.

It is about time we make an attempt to understand how the Internet works. From where I am sitting, this past year has been less lonely and semi-normal because of the Internet. I want it to continue to function in a way that is effective; I want to continue seeing the networks interconnecting and infrastructure providers focusing on what they are supposed to be focusing on: providing reliable and consistent infrastructure services.

It is about time we show the Internet we care!

Dr. Konstantinos Komaitis is the Senior Director, Policy Strategy and Development at the Internet Society.

Source: https://www.techdirt.com/articles/20210111/09253546032/slope-gets-more-slippery-as-you-expect-content-moderation-to-happen-infrastructure-layer.shtml