How Elon Musk could change Twitter’s content moderation

Elon Musk’s takeover of Twitter raises the issue of social media content moderation in a particularly urgent form. Despite the new regulations announced in the UK and the European Union, with which Musk’s Twitter must comply, no legal requirement will prevent Musk from operating Twitter according to whatever editorial policy he chooses to adopt. It’s his candy store.

How is this possible? Is it really true that the content moderation policies of such a powerful forum for public discourse must depend on the whims of its new billionaire owner? Evan Greer, a political activist with Fight for the Future, speaks for many of us when she says, “If we want to protect free speech online, we can’t live in a world where the wealthiest person on the planet can simply buy a platform that millions of people depend on and then change the rules to their liking.”

But that is how television, newspapers and radio operate in liberal democracies. Media owners determine the political line of the stories and commentary they broadcast. When NBC, CNN, ABC or the New York Post change owners, as they have often done in the past, their new owners dictate the operating rules and editorial policy. Social media is media, and the same ownership prerogatives apply. Content moderation is a platform’s editorial policy, and it is determined by its owners. No liberal democracy will dictate what owners can do or what their editorial policy should be.

Of course, some speech is illegal, and increasingly social media companies will need to keep their systems free of illegal material. The UK and EU are creating new liability regimes for illegal speech in their current legislation, and Musk has promised to comply with those legal requirements.

But most hate speech, misinformation and racist invective on social media is legal in both the US and Europe. Musk will have to comply with the new EU and UK rules on legal but harmful speech, which will mean more risk assessments, transparency reporting, audits, researcher access to data, published content moderation standards and due process requirements.

These new laws will enforce the vital public protection of transparency, and it would be desirable to adopt their most significant elements here in the United States. But they won’t dictate Elon Musk’s approach to content moderation on Twitter. They still allow him to let his system fill up with harmful material if he wants.

So what is Musk likely to do with Twitter? He presents himself as a philanthropic guardian of a public resource. In an onstage interview at TED2022, Musk said, “It’s not a way to make money. My strong intuitive feeling is that having a public platform that is maximally trusted and broadly inclusive is extremely important to the future of civilization. I don’t care about the economics at all.”

He appears to want to allow any legal speech on the platform, which has raised concerns that content moderation will be weakened in the name of free speech. But Wall Street Journal opinion columnist Holman W. Jenkins Jr. summarizes the current situation: “Twitter has crossed the river of no return by ‘moderating’ the content that appears on its service – it cannot allow free and unfettered expression.”

However, just because someone has to moderate content on Twitter doesn’t mean Twitter has to. Musk could outsource the work to Twitter users or third parties.

Influential neoreactionary blogger Curtis Yarvin has urged Musk to take a user curation approach to content moderation. The new Twitter under Musk, he says, must censor “all content prohibited by law in all jurisdictions that prohibit it.” For content moderation and algorithmic recommendation of legal speech, Yarvin urges Musk to identify hate speech and other speech that users might not want to see, then give users the tools to block it if they wish. The goal should be to organize content moderation and algorithmic recommendation to give users what they want, making their experiences “as rich and enjoyable as possible”.
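To make the user curation idea concrete, here is a minimal sketch, in Python, of what user-controlled filtering might look like. The category labels and the `label_tweet` classifier are hypothetical stand-ins for whatever labeling systems the platform would actually run; the point is only that the platform labels content while the blocking decision belongs to each user.

```python
from dataclasses import dataclass, field

# Hypothetical content categories the platform might attach to each tweet.
CATEGORIES = {"hate_speech", "graphic_violence", "spam", "adult_content"}

@dataclass
class UserPreferences:
    # Categories this particular user has chosen to hide.
    blocked: set = field(default_factory=set)

def label_tweet(text: str) -> set:
    """Stand-in for the platform's classifiers: returns the set of
    categories the tweet is judged to fall into."""
    labels = set()
    if "slur" in text.lower():  # toy heuristic, not a real classifier
        labels.add("hate_speech")
    return labels

def curate_feed(feed: list[str], prefs: UserPreferences) -> list[str]:
    """Keep every legal tweet except those in categories the user blocked."""
    return [t for t in feed if not (label_tweet(t) & prefs.blocked)]

# Example: one user opts out of hate speech, another chooses to see everything.
feed = ["a normal tweet", "a tweet containing a slur"]
print(curate_feed(feed, UserPreferences(blocked={"hate_speech"})))
print(curate_feed(feed, UserPreferences()))
```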

This idea still leaves Twitter in charge of identifying harmful material that users might not want to see. But there might be a way to outsource that too.

Musk says he wants to make Twitter’s algorithms “open source to increase trust”. The Twitter recommendation algorithm, he noted, “should be on GitHub”. This may mean more than allowing users to examine the algorithm to see how it works: users could modify Twitter’s open-source algorithm as they wish.

This raises an interesting possibility for the future of Twitter. Musk may consider adopting the content moderation approach recommended by political scientist Francis Fukuyama. This “middleware” approach would install an “editorial layer” between a social media company and its users. It would outsource “content curation” to other organizations that would receive the entire feed from the platform, filter it according to their own criteria, and then make that curated feed available to their own users.
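As a rough illustration of the architecture Fukuyama describes (not any actual Twitter or middleware API), one can imagine the platform handing its raw feed to independent curators, each of which returns its own filtered view to its own subscribers. The function names below are hypothetical.

```python
from typing import Callable

# A "middleware" curator is simply a function from the platform's raw feed
# to a filtered, re-ranked feed chosen by that curator's own editorial criteria.
Curator = Callable[[list[str]], list[str]]

def platform_raw_feed() -> list[str]:
    # Stand-in for the full, legally permissible feed the platform would
    # make available to third-party curation services.
    return ["breaking news", "conspiracy theory post", "cat photo"]

def family_friendly_curator(feed: list[str]) -> list[str]:
    # One third party's editorial layer; others would apply different rules.
    return [t for t in feed if "conspiracy" not in t]

def free_for_all_curator(feed: list[str]) -> list[str]:
    # Another curator passes everything through unchanged.
    return feed

# Each user subscribes to the curator whose editorial layer they prefer.
raw = platform_raw_feed()
print(family_friendly_curator(raw))
print(free_for_all_curator(raw))
```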

Musk’s talk of allowing all legal speech would then apply to the basic Twitter feed. Content moderation beyond that would be outsourced to users and third-party content curation services.

There’s no way to know at this point whether Musk intends to move toward this user-centric approach to content curation. The issue is fraught enough that outsourcing content moderation might be worth experimenting with. My own feeling, though, is that it seems far-fetched. It is not at all clear that it is technically feasible, and there is no discernible way to generate the revenue needed to cover the moderation costs involved. Each middleware content curation provider would have to duplicate a huge infrastructure of software and human moderators, which seems economically implausible.

Additionally, as Stanford University legal scholar Daphne Keller has noted, privacy issues would need to be resolved. Would the middleware provider have access to all the material posted by a user’s friends and followers? If so, it infringes on the privacy of those other users, who might want nothing to do with that middleware provider. If not, how can the middleware provider filter the feed effectively?

More importantly, this idea is not a way to foster a genuine exchange between citizens on matters of public importance. It’s more of a recipe for us to retreat to our corners, creating filter bubbles of like-minded people and excluding the rest of society.

Separating ourselves so that we don’t have to listen to people who differ from us is no cure for the information externalities that make hate speech and disinformation so dangerous, even for people who aren’t exposed to them. People cannot remain indifferent to what others in society believe, because what others believe affects them. If enough people reject vaccines and other public health measures, we are all at risk when the next pandemic hits. If enough people become racist or intolerant of the LGBTQ community, significant parts of our community will not be safe in their own society. And how are we going to agree on what to teach our children if there is no common public platform where we can exchange ideas?

The big benefit of Musk’s takeover of Twitter is that it will draw attention to new ways to improve content moderation. The disappointment, for many, is that other than offering advice and demanding transparency, there is little the public or policymakers can do to sway Musk’s decisions about what to do with his new candy store. He owns the platform and, as is generally the case in business, he is free to make whatever decisions he wants.