Free speech does not mean the right to shout "Fire!" in a crowded theatre. But what do you do with someone whose speech inspires people to set fire to a 5G mast?
The person in question is David Icke, whose official Facebook page was recently taken down because of his role in spreading the conspiracy theory that 5G is linked to coronavirus. The theory has led to arson attacks on 5G masts and assaults on telecoms workers.
While Icke is undeniably a controversial figure, the move has reignited debates about free speech and the growing power of social media companies to dictate what counts as acceptable expression online. That debate will only intensify as these companies move against coronavirus misinformation, leading to inevitable confusion, and even mistakes, in their attempts to be both judge and guardian.
An UnHerd interview on coronavirus with Dr Karol Sikora, outspoken yet hardly controversial, was also recently removed from YouTube, which is owned by Google. Sikora is a highly respected oncologist and former adviser to the WHO, and his daily, often optimistic commentary on the impact of the virus has been widely considered informative. An initial appeal to reinstate the interview was rejected; then the video was suddenly restored.
In many ways social media companies are in a no-win situation. If they do not remove content that inspires arson, or even genocide, as in Myanmar, they are accused of complicity. Critics argue they hand hatemongers megaphones in the form of algorithms that drive user engagement by peddling outrage.
However, if they do act, free speech advocates rightly question whether vast corporations, accountable to no one except their shareholders, should be allowed to dictate the limits of acceptable speech across large parts of the public sphere.
Now Facebook has decided to outsource these controversial decisions by creating a new Independent Oversight Board. While Facebook will handle day-to-day moderation, the board can hear appeals and will make binding decisions based on "respect for freedom of expression and human rights".
The new board appears to have some teeth. Facebook is, to an extent, bound to accept its decisions, its funding is provided by an independent trust, and its members cannot be removed by Facebook.
Facebook played a role in choosing the initial twenty board members, but from now on the board will select new members itself. The current membership includes fierce critics of Facebook, and there has been a serious attempt to balance views both ideologically and geographically. For example, the former Guardian editor Alan Rusbridger is now the unlikely colleague of John Samples, vice-president of the libertarian Cato Institute.
Yet, despite this attempt at independence, the new body fails to resolve the fundamental issues at stake, while throwing up new ones.
One question is the extent of the board's actual powers. While Facebook is bound to accept the board's decisions on individual cases, it can reject suggested changes to company policy. Given the vast number of content decisions Facebook makes every day, the board will inevitably focus on a few cases with broader implications. But using a single case to set a broad precedent might itself imply a policy change.
This ambiguity is crucial. As the policy-change caveat shows, Facebook is not willing to let a separate body unilaterally dictate major changes to its business model, which suggests the board's ability to set broad precedents will be limited. Yet if the board is confined to symbolic rulings on single pieces of content, it is toothless.
If the board does end up wielding significant power, the question becomes whether this independent, self-appointed body is any better suited than Facebook to wield vast censorial powers. Its decision-making process is designed to be transparent and its members are eminent. But none of this changes the fact that the board is arguably even less accountable than Facebook itself.
Indeed, its independence exacerbates the problem of unaccountability. When Rusbridger stated, in a post explaining his decision to join the board, that governments have a bad track record when it comes to regulating speech, he was right. But the former editor failed to explain why an unaccountable board of philosopher-kings would have a better one.
If we want accountability, regulation must be in the hands of democratic governments. While this carries risks of its own, it would at least provide some mechanism for popular input. In practice, we have long allowed governments to place limits on free expression through laws against defamation, misleading advertising and hate speech. These issues are controversial, but that is healthy: scrutiny helps push back against overreach.
Admittedly, applying these standards to social media, which spans the globe and blurs the line between public and private, would be difficult. Facebook's move to create its own board stems partly from frustration that the criticism it receives is rarely accompanied by legal guidance. The regulations the UK government is drawing up are also flawed: they have dangerously wide remits and vague definitions, and they rely on social media companies to do the heavy lifting of enforcement. Yet unless democratic governments step up with alternative systems of governance, the social media giants will continue to wield their censorship powers.