What Social Media Companies Must Do to Protect Democracy in the Wake of the Capitol Riots
Technology was supposed to lead us to nirvana. But here we are in 2021, with democracy in peril. How could it have gone so astray, and where do we go from here?
The attack on the Capitol is a watershed moment for tech because it brings a range of uncomfortable questions about social media’s role in society to the fore. It also dramatically raises the stakes of the conversation because the fate of American democracy truly hangs in the balance. What role has social media played in the spreading of falsehoods? If platforms can clearly be weaponized, what responsibility do tech companies have in preventing this outcome? How do platforms reconcile the fact that polarizing content is good for engagement but bad for democracy? These are just some of the questions we’re currently debating.
Tech companies have a responsibility to ensure that their products, services and applications do not cause undue harm to the public. Unfettered social media harms the public because the erosion of truth is directly related to the erosion of democracy. Platforms — whether by their own choice or not — are positioned such that their content moderation has outsized influence on the health of our overall information ecosystem. Conspiracy theories threaten democracy, and discord that is sown online is a danger to the republic. Americans’ approach to free expression has relied on our faith in the marketplace of ideas, which asserts that good ideas defeat bad ones when left to battle it out. This faith was embedded in the rise of social media, as its very creation was driven by Americans who took a light touch toward content moderation out of a belief that the marketplace of ideas works (see Mark Zuckerberg’s Georgetown University speech). That faith has now been shaken, and it’s clear that a light touch toward content moderation will no longer work in 2021.
A dramatic reckoning is now underway as Facebook and Twitter come under fire for their approach to content moderation. Why has the global rise of social media coincided with an ongoing crisis of democracy worldwide? In my opinion, it is because social media companies have focused too much on the individual experience and not enough on the public interest. Teams of people consider the UX, for example, but far fewer think about the cumulative, public effect of these platforms. This stems from the fact that companies clearly have a duty to the individual user, but it has never been clear what their duty is to the general public. Companies like Twitter and Facebook are often threading the needle with their approach to speech because they are unsure what role they should play and whether they have a mandate to operate with the public interest in mind. So let’s be clear: Social media companies have an obligation to protect and promote democracy. Social media should be pro-democracy, and platforms that take a hands-off approach to moderation while democracy is in peril are abdicating their responsibility as American citizens.
2021 will bring about a major change in how platforms consider speech online. Regulating speech is about balancing an individual’s right to speech with public concerns. This balance has often been overlooked in discussions of social media as conversations devolve into accusations of censorship. Ultimately, though, these companies must create an environment that promotes both safety and expression, which means that every individual’s speech on a platform must be weighed against its effect on the platform’s community and also the larger information ecosystem. The glaring threat to democracy from an information ecosystem run amok is too great for further inaction. Decisions change as circumstances change, and recent events will alter how we balance these concerns moving forward. The threat to democracy will force a greater consideration of community impact in 2021. Deciding what is appropriate speech is not just about understanding the intent of a speaker, but also the impact of the speech on the community.
In addition, social media companies have been hamstrung by conflicting messages from the public about what their role should even be with respect to moderating speech. I have often said that social media platforms are stuck between a rock and a hard place. Regardless of whether they choose to introduce new content guidelines or to stay the course, they’ll receive ample criticism. Further, platforms are well aware that to assert power over speech invites criticism about their lack of authority to do so. American politicians derive their power to determine appropriate speech from the will of the citizens, who also have the power to vote out politicians they disagree with. The bind that social media platforms are in is that they must assert power in a way that’s reminiscent of the government without actually having been elected by the platform’s users. The immediate danger to democracy that stems from misinformation and calls to violence, however, makes the current assertion of power from these platforms obviously necessary. The clear threat to democracy is a mandate for social media platforms to act in its defense. That means being more aggressive in moderating dangerous posts and misinformation that have the potential to tear this country apart.
Don’t Yell Fire in a Crowded Theater
A popular dictum about the limits of free speech, which paraphrases a Supreme Court opinion by Justice Oliver Wendell Holmes, is that we cannot falsely yell fire in a crowded theater. The action of the individual who yells “fire!” poses too grave a danger to the community, which would likely react chaotically and dangerously to the individual’s utterance. Unpacking this line of thinking illustrates the complexity of free expression in a democracy. Although the Supreme Court’s thinking has changed since Holmes first presented this analogy, the situation is quite illustrative of our present circumstances. The bottom line is that free expression is not and has never been an absolute right. This is why people can curse in the comfort of their home but not in the streets.
The delicate duality at play in this debate is that we are individuals with our own priorities, beliefs and desires, but we are also citizens enmeshed in a larger society. We should be able to express ourselves and our beliefs, but this expression must be weighed against its impact on the community. So in the “fire in a crowded theater” example, one’s desire to say “fire!” is constrained by the foreseeable impact that this statement would have on the public’s safety inside the theater.
Right now, our approach to social media endangers public safety. This threat results from the open exploitation of our hyper-focus on protecting individuals’ right to say whatever they want. Plus, metaphorically yelling “fire!” in the crowded theater of social media is great for engagement. Social media companies have profited from the chaos, but the current methods for mitigating harm are no longer sufficient.
Right now, we label these “fire!” tweets as potentially false in the hope that the public might be swayed through intellectual intervention. The problem is that a “fire!” tweet stokes powerful emotions, making an intellectual response almost certain to fail from the outset. Imagine standing at the doors of a theater, trying to offer a dispassionate lecture about the absence of fire as the people inside are panicking. How effective is such an intervention likely to be?
And guess what? There is now chaos in the proverbial theater, and people are getting hurt. Social media must change. The first step to making meaningful change requires these companies to accept the fact that their platforms impact the health and well-being of our democracy as opposed to merely reflecting society.
Shattering the Myth That Social Media Is a Mirror of Society
People like to say that social media is a mirror of society or a window into people’s true selves. This view is utterly off base and serves as a way for companies to downplay their impact on human behavior and society. Platforms are not merely curating speech; they are altering human behavior. Facebook knew this back in 2012, when a study of nearly 700,000 users showed that people who saw fewer negative posts in their News Feed were less likely to post negatively. The choices that companies make about speech moderation on their platforms shape the choices and thoughts of their users. Social media, in other words, is not a mirror of society — it is an environment that, like all environments, alters the behavior of its inhabitants.
It’s time to rethink the role that platforms play. The downfall of democracy must not be an unintended consequence of a private company seeking greater user engagement.