The regulatory challenge of content moderation of harmful UGC

Tread carefully, weary traveller, into the realms of regulation for content moderation and UGC...
23 June 2023

Section 230 – a political football, or the basis of future moderation?

• Content moderation is essential to deal with most UGC on commercial platforms.
• Section 230 remains politically contentious.
• A more standardized, global approach would stand a better chance of catching more harmful UGC.

UGC (user-generated content) makes up a large part of the internet, and as the types of content multiply – comments sections, blogs, vlogs, social media, and even the Metaverse – so does the volume of UGC. Harmful UGC has been monetized by bad actors, as well as being an explosive by-product of anonymity on the internet. The more harmful UGC there is, the more vital content moderation becomes, to protect both individual internet users and the companies that run content platforms.

In Part 1 of this article, we spoke to Alex Popken, VP of Trust and Safety at WebPurify, a content moderation company that combines AI and human moderation, to find out what’s driving the wave of harmful UGC, how it’s showing itself – and what can be done to fight it.

There, we focused mainly on the personal and parental action – informal content moderation, in effect – that individuals can take when faced with harmful UGC.

Towards the end, though, we turned to the question of how companies whose platforms host UGC can engage with the responsibility of content moderation – and what the complexities of content regulation look like.

THQ:

What can companies do to protect themselves and their users from harmful UGC, in terms of content moderation?

AP:

Companies that host UGC on their platforms have a responsibility to make sure they’re removing harmful content. That’s where developing a really robust content moderation setup, leveraging both AI and human review, is critically important – so that they’re protecting their users from the harms that surface on their platform.
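To make the shape of that setup concrete, here’s a minimal sketch of the “AI plus human review” pattern Popken describes. Everything in it – the thresholds, the function names, the stubbed-out classifier – is an illustrative assumption rather than WebPurify’s actual pipeline; in a real deployment, the scoring step would call a trained moderation model or a third-party moderation API.

```python
from dataclasses import dataclass
from enum import Enum


class Decision(Enum):
    APPROVE = "approve"            # publish automatically
    REMOVE = "remove"              # block automatically
    HUMAN_REVIEW = "human_review"  # route to a moderator queue


@dataclass
class ModerationResult:
    decision: Decision
    score: float  # 0.0 (benign) .. 1.0 (almost certainly harmful)


# Hypothetical thresholds - real systems tune these per harm category
# (hate speech, self-harm, spam...) and per appetite for risk.
AUTO_REMOVE_THRESHOLD = 0.90
AUTO_APPROVE_THRESHOLD = 0.20


def score_content(text: str) -> float:
    """Stand-in for an AI classifier or moderation API.

    A toy keyword heuristic so the sketch runs on its own; a real
    deployment would call a trained model here instead.
    """
    flagged_terms = {"scam", "kill", "hate"}
    hits = sum(term in text.lower() for term in flagged_terms)
    return min(1.0, hits / len(flagged_terms) + 0.1 * hits)


def moderate(text: str) -> ModerationResult:
    """Hybrid flow: AI handles the clear-cut cases, humans get the grey area."""
    score = score_content(text)
    if score >= AUTO_REMOVE_THRESHOLD:
        return ModerationResult(Decision.REMOVE, score)
    if score <= AUTO_APPROVE_THRESHOLD:
        return ModerationResult(Decision.APPROVE, score)
    return ModerationResult(Decision.HUMAN_REVIEW, score)


if __name__ == "__main__":
    for post in ["Lovely sunset tonight!", "This is a scam, I hate you, kill it"]:
        result = moderate(post)
        print(f"{result.decision.value:>12}  (score={result.score:.2f})  {post}")
```

The point of the split is that clear-cut cases are handled at machine speed, while the grey area – sarcasm, context-dependent abuse, newsworthy violence – still gets human judgment: the AI-plus-human combination referred to above.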

THQ:

Are enough companies currently doing that?

AP:

I think companies need to be doing more of it.

THQ:

As VP of trust and safety at a content moderation company, you astound us.

AP:

Quite! But the point is that we’ve seen egregious examples where harmful content flies under the radar and creates real-world harm. I believe we’ll see more accountability over time, with increased dialogue on the need for regulation.

The US right now is moving quite slowly on that front. But we’re seeing really important regulation coming out of the EU, for example, with the DSA (Digital Services Act), and that’s going to incentivize companies to enact appropriate content moderation practices, particularly in regard to UGC, because if they don’t, they’re going to be fined pretty significantly.


Content moderation – long, intensive work to make the internet better. Source: WebPurify

THQ:

Is there a level on which the US is struggling with that more because of its relatively absolute ideas of freedom of speech, whereas other countries and other blocs have “freedom of speech (unless you’re being a harmful idiot)”, essentially?

AP:

You might be onto something there. I think there’s this polarization or partisanship in the US that can make it really difficult for regulation and laws to pass, because you have people on different sides of the aisle taking very different stances on what needs to be done.

The harmful aspect of that is that in the meantime, we’re seeing a rise in harmful UGC, and the rise of new technologies like generative AI that need to be regulated. We need regulators and lawmakers to come together and move quickly enough to make sure we’re addressing these things as soon as possible.

THQ:

That’s the conundrum, isn’t it? Everybody seems to agree with the idea that generative AI needs to be regulated. But the truth is that right now, the technology moves a hell of a lot faster than the law. It’s almost like being Wile E. Coyote trying to catch the Road Runner.

AP:

That’s exactly right. Therein lies the challenge.

THQ:

Which leads us, in a suitably clunky journalistic fashion, to Section 230 and the Supreme Court’s ruling on it. Imagine we work in business procurement and haven’t heard a thing about Section 230. What is it, and why should we care?

AP:

Section 230 is a decades-old US internet law – part of the Communications Decency Act of 1996 – that provides legal protection for anyone who hosts or moderates other people’s content online.

Critics say it lets tech platforms off the hook for the harmful content hosted on their sites, whereas supporters say it prevents frivolous lawsuits and keeps the internet open and free.

I would say that this has also become quite a partisan issue, where conservatives say it leads to censorship of content, and liberals say it allows platforms to shirk responsibility on removing harmful content.

Recently, a couple of high-profile cases involving Google and Twitter were argued before the Supreme Court. Ultimately, the Supreme Court declined to hold the companies liable for the content posted by their users.

So they passed the buck on changing the breadth of Section 230, which I think was a victory for a lot of the tech companies, who very much feel like Section 230 is important for a free and vibrant internet.

THQ:

We won’t ask you to speak for WebPurify on this, but do you have a personal view on it, as someone with years of experience in content moderation to weed out harmful UGC?

AP:

I think some semblance of Section 230 is needed, because without it, the internet could be pushed to the worst extremes. Platforms would either be incentivized to censor content much more heavily, or incentivized to carry everything, harmful or not.

My view though is that we also need some semblance of accountability. That’s where I come back to the concept of regulation. Regulation is needed to hold online platforms accountable for the content that they are moderating, for removing illegal content, and for being transparent about their moderation practices.

That is certainly needed. But I don’t think it has to be an all-or-nothing thing with Section 230. Again, that’s just my own personal view.

THQ:

Coming back to regulation brings up another difficulty, doesn’t it? The internet is pan-global. It doesn’t exactly live anywhere. So by whose rules does internet regulation play?

AP:

Right. And I think that that’s going to be a huge challenge for tech companies, because you’re exactly right – with different countries providing different regulations, it’s going to be almost impossible to keep up.

In an ideal world there would be a global, standardized approach. But of course, in practice, it’s unlikely that we’ll see that play out.

Content moderation requires a particularly…level-headed mindset.

THQ:

Not to get political, but we just heard Republicans’ heads exploding at the idea of global, standardized regulation of the American internet.

But having partially defined the perfect world of content moderation against harmful UGC… how do we get to it? Any idea?

AP:

I think regulation is an important component, because without it, many large social media platforms aren’t incentivized to remove problematic content or to prioritize platform health and trust and safety measures.


Content moderators – the thin blue line between the internet and anarchy. Source: WebPurify

I’m not saying that they don’t view that as a priority. But ultimately, a lot of companies are incentivized by user growth and revenue growth. And sometimes that conflicts with trust and safety or content moderation measures.

Imposing regulation that holds platforms accountable and ultimately can impact their bottom line is, I think, an important forcing function.

And again, I don’t want to suggest that companies don’t prioritize this right now. But ultimately, if your salary is based on how well you grow revenue and users, rather than on these important measures around safety, therein lies an inherent conflict of interest that’s really complex.


Laws and regulations – they help when not breaking them, rather than paying the fines, is the cost of doing business.

THQ:

That rings a bell. It reminds us of the recurring situation with social media platforms and data use: Meta and co have their own ways of using their users’ data, and they’re fine in the US. Then they do the same thing in Europe, the GDPR kicks in, and suddenly they’re being fined – most recently to the tune of over a billion dollars.

That’s what we were wondering. We’ve said that some level of regulation is important – so there’s a necessary level of liability for platforms over the content posted on them?

AP:

Yes. That’s right.

THQ:

Have we arrived yet at a comprehensive definition of what a social medium is – whether it’s a news platform, an entertainment platform, an online social venue, or something else? Because different liability rules attach to each of those. So it’s a case of “First, define your problem.”

AP:

Yeah, and I am definitely not an expert in regulation. I believe that, for example, with the DSA, they are coming up with definitions around what constitutes a large platform versus not-a-large-platform, and there are associated fines based upon user base and company size.

When I think about UGC and content moderation, it spans industries. We have clients that are social media platforms, we have clients that are financial service companies, and we have lots of clients in between – it really runs the gamut.

It’s any platform that is hosting UGC in some capacity. And the harms that we see, even though these industries are very different, can be relatively consistent across the board.

But yes, I think that’s going to be a challenge for regulators as they seek to define who is subject to regulation – who is considered a platform that warrants rules like that.
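For a sense of how that kind of size-based tiering plays out in practice, here’s a back-of-the-envelope sketch. The 45-million-user threshold for “very large online platform” status and the fine ceiling of 6% of annual worldwide turnover are taken from the DSA itself; the function and variable names are our own, purely illustrative shorthand, not a compliance tool.

```python
# Rough, illustrative sketch of the DSA's size-based tiering - not legal advice.
VLOP_THRESHOLD_EU_USERS = 45_000_000  # avg monthly active recipients in the EU
MAX_FINE_RATE = 0.06                  # fines can reach 6% of worldwide annual turnover


def dsa_exposure(monthly_active_eu_users: int, global_turnover_eur: float) -> dict:
    """Classify a platform by the DSA's headline threshold and estimate its worst-case fine."""
    is_vlop = monthly_active_eu_users >= VLOP_THRESHOLD_EU_USERS
    return {
        "tier": "very large online platform" if is_vlop else "standard platform",
        "max_fine_eur": global_turnover_eur * MAX_FINE_RATE,
    }


# A hypothetical platform with 60m EU users and EUR 10bn turnover risks fines up to EUR 600m.
print(dsa_exposure(60_000_000, 10_000_000_000))
```

The underlying point is simply that once obligations and penalties scale with user base and revenue, fines stop being a rounding error – which is the “forcing function” Popken describes.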

 

In Part 3 of this article, we’ll get into the nitty-gritty of how content moderation for harmful UGC works in the modern, multimedia environment – and why content moderation may be one of the boom industries of the next few decades.