
Thoughts on Forcing Internet Infrastructure Providers to Moderate Content in the Age of Parler

January 17th, 2021

Over the past couple of years, the internet infrastructure industry has been sucked into more and more debates around content: what should be online, and who should have access to it.

But before we look at some of the more complex aspects of this, we should probably take a quick look at the basics.

It’s not an easy debate, but it’s a debate that needs to take place. It’s also a debate which tends to produce emotional, yet completely irrational arguments about freedom of speech.

The First Amendment is a US legal construct, but there are freedom of speech protections in most countries, including Ireland. However, those freedoms are not absolute and, most importantly, they generally constrain government, not private actors.

Freedom of speech is not absolute. And private companies don’t have to give anyone a platform.

Private companies like ourselves cannot be obliged to provide services to everyone, and we have rules and policies around what you can do with our services.

In an ideal world, the website or service operators would self-police and handle content moderation properly. Unfortunately, as we’ve seen, there are some actors who either take an extreme view of freedom of speech or simply do not care.  

This means that calls to get content removed or blocked end up being directed to the hosting providers, domain name registrars, domain name registries and other parts of the infrastructure stack. That trend will continue, and there’s very little that anyone can do to stop it. Providers are free to make their own decisions, but I think we will see more calls from civil society to protect speech on the one hand, and from government and law enforcement to act swiftly against extremist content on the other.

It’s not ideal, as we don’t have a scalpel; we only have sledgehammers. As the DNS provider for a site or service, I can’t remove an offending image or video. I can only pull the plug on the entire site or service. If we host a site, we might be able to block access to certain images or other content, but more often than not our only option is to pull the plug on the entire site.

This is a problem, as it means that instead of blocking or removing only the offending content, it becomes “all or nothing”.
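To make the granularity problem concrete, here’s a minimal sketch using the dnspython library (the domain and path are stand-ins, not real cases we’ve handled). DNS resolution only ever sees a hostname, so the page-level detail needed for surgical removal simply doesn’t exist at our layer:

```python
# Minimal sketch of why DNS-level action is all or nothing.
# Assumes the dnspython library; "example.com" is a stand-in domain.
import dns.resolver

# The DNS layer only ever sees the hostname being looked up...
for record in dns.resolver.resolve("example.com", "A"):
    print(record)  # address records for the whole site

# ...the path in a URL such as https://example.com/offending-page
# never reaches DNS. A registrar or DNS provider can therefore only
# suspend "example.com" in its entirety; removing a single page is a
# job for the hosting/HTTP layer.
```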

When Cloudflare pulled the plug on Daily Stormer, an extremely far-right website, back in August 2017, many of us in the hosting and domain industry agreed that it was the right thing to do. However, some of us were decidedly uncomfortable with how Matthew Prince, Cloudflare’s CEO, explained the rationale behind that decision. The problem wasn’t that Daily Stormer had gone too far; it was that Matthew, by his own admission, made an emotional decision. That is far from ideal. When someone signs up with ourselves or one of our competitors, they need to feel confident that we aren’t going to knock them offline on a whim.

Providers are private companies so they can choose who they do business with. We all have terms of service which set out what we will or won’t allow people to do with the services we provide.  

As businesses, we are conscious of our reputations, but we need to balance protecting our clients’ speech with respecting societal norms.

However, that freedom to choose does open up discussions around censorship and restrictions on free speech.

Industry leaders have been discussing our role in dealing with these issues over the last few years and, as you’d expect, there is a wide range of views. Out of those discussions, a group of us came together to formulate the “DNS Abuse Framework”, which lays out when we will act without court orders or other firm legal obligations. We agree that certain types of content are universally unwelcome, so we will take action against things like child sexual abuse material (CSAM, often called “child porn”), malware, or cases where there is a clear threat of imminent harm:

“Specifically, even without a court order, we believe a registry or registrar should act to disrupt the following forms of Website Content Abuse: (1) child sexual abuse materials (“CSAM”); (2) illegal distribution of opioids online; (3) human trafficking; and (4) specific and credible incitements to violence.”
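As a rough illustration of how a provider might encode that list internally (this is my own sketch, not anything published by the Framework’s signatories; the identifier names are invented), those four categories form the set we can act on without a court order, with everything else routed to case-by-case review:

```python
from enum import Enum, auto

class AbuseCategory(Enum):
    CSAM = auto()                    # child sexual abuse materials
    ILLEGAL_OPIOID_SALES = auto()    # illegal distribution of opioids online
    HUMAN_TRAFFICKING = auto()
    INCITEMENT_TO_VIOLENCE = auto()  # specific and credible incitements
    OTHER = auto()                   # everything else: case-by-case review

# The four categories the Framework says warrant disruption even
# without a court order.
ACTIONABLE_WITHOUT_COURT_ORDER = {
    AbuseCategory.CSAM,
    AbuseCategory.ILLEGAL_OPIOID_SALES,
    AbuseCategory.HUMAN_TRAFFICKING,
    AbuseCategory.INCITEMENT_TO_VIOLENCE,
}

def needs_court_order(category: AbuseCategory) -> bool:
    """True when we won't act on our own initiative."""
    return category not in ACTIONABLE_WITHOUT_COURT_ORDER
```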

When it comes to other types of content, however, it’s not that simple. 

We have clauses in our terms of service that focus specifically on “hate” sites. But not even The Law Society are happy with how “hate” is defined.

There’s a “line” of some kind that separates content I personally don’t like from content that genuinely oversteps it. But where is that line? Both I and my staff have our personal opinions. I might personally dislike certain groups and what they stand for, but does that mean I can remove their content from the ’net? What is too far?

It’s far from easy for providers to act as content moderators. No matter what we do, we will be criticised. There are some who feel that anything “offensive” should be removed from the internet. However, “offensive” or “insulting” and many of the other words people use to describe content are highly subjective.

In case it wasn’t obvious: at the moment, hosting providers like ourselves do not actively police the content we host. It would be impractical for us to do so. We rely on reports from third parties. But not all reports are equal. Reports from “trusted” third parties that have been properly vetted are much more likely to lead to us taking some kind of action; for fake pharma, for example, we’d rely on LegitScript. Other reports, however, can be incredibly vague and hard to action. For example, we’ve had a website with over 100,000 pages reported to us as violating someone’s rights. The complainant failed to state which of the pages was at issue or whose rights were allegedly violated!
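To make that triage concrete, here’s a hypothetical sketch (the structure and outcomes are my own invention, not a description of our actual systems; only the LegitScript name comes from the post):

```python
from dataclasses import dataclass

# Vetted third parties whose reports carry more weight. LegitScript is
# mentioned above; treat the set's contents as illustrative.
TRUSTED_REPORTERS = {"legitscript.com"}

@dataclass
class AbuseReport:
    reporter: str        # who filed the report
    urls: list[str]      # the specific pages at issue
    claim: str           # whose rights, or which rule, is allegedly violated

def triage(report: AbuseReport) -> str:
    # A report that names neither specific pages nor a concrete claim
    # can't be actioned -- like the 100,000-page site reported to us
    # with neither detail.
    if not report.urls or not report.claim:
        return "request-more-detail"
    # Reports from vetted third parties are far more likely to lead
    # to some kind of action.
    if report.reporter in TRUSTED_REPORTERS:
        return "escalate-for-action"
    return "manual-review"
```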

This debate is going to rage on for the foreseeable future and we will do our best to engage with it.

I’m personally not overly comfortable with governments legislating online content, but I’m not entirely opposed to it either. Having clear definitions of what constitutes “hate speech” can help us make those hard decisions, because they are hard, and maybe they need to be.

Yes, we have terms of service that we can enforce, but they’re our terms of service and they are for us to enforce on our clients. That means we need to be able to investigate complaints, and if we decide that some or all content linked to a site or service on our platforms needs to go, then we need to offer some level of due process. But we also need the discretion to act decisively when we feel such action is merited. We aren’t going to wait around for hours, days or weeks if there’s a threat of imminent harm or if there’s something absolutely heinous online.

We aren’t a massive company with thousands of staff and a team of moderators, so we’re not going to get it right all the time, but we will try our best.

Author’s note: a version of this article first appeared in The Irish Independent under the title “What to host and what not to host online is an ‘all or nothing’ quandary for web providers”.


About the Author: Michele Neylon
Known for his outspoken opinions on technology and the Internet, Michele Neylon is the award-winning author of several blogs and co-host of the Technology.ie podcast. A thought leader in the Internet community, Neylon is active within ICANN and an expert on policy, security, domains, Nominet and Internet Governance. You can stalk him on various social media networks, including Twitter and Instagram.

