What's the first word that comes out of your mouth right after you painfully stub your toe?

I’m guessing it’s not appropriate enough for polite company. In the heat of the moment, you hurl your go-to curse. Luckily for your reputation, only your cat hears your choice words uttered in agony. Your reactive self, the part of you that is thoughtless and impulsive, is not made public.

Social media platforms are filled with similar reactive outbursts, along with the sharing of questionable content and the trollish behavior that thrives in an environment that rewards quick communication. Social media often plays to our reactive selves rather than to our more reflective, considered ones.

If we designed a platform, for example, that recorded your first utterance after stubbing your toe and then uploaded it (let's call it StubYoToe), we wouldn't be surprised if the platform quickly filled with objectionable, offensive content. With its emphasis on removing friction and grabbing immediate reactions, the platform would be full of falsehoods (e.g., wrongly blaming your nephew for leaving Legos on the floor) and f-bombs. And as a free, ad-based platform, we should also not be surprised that StubYoToe would put a premium on ease of use by automatically uploading your outbursts, since a greater level of engagement directly equates to more revenue. The site would be wildly profitable and wildly toxic at the same time.

The days of a friction-free social media ecosystem that maximizes the volume of communication while putting limited resources toward trust and safety are over. A confluence of factors has created a perfect storm of social media disruption, including:

Taken together, these movements add up to a changing landscape for social media platforms. Right now, there is growing concern around social media's impact on user well-being, civil discourse, and democracy writ large. As these downsides increasingly become part of social media's responsibility, they are bound to change the back-of-the-napkin cost/benefit analysis for how platforms operate. Think about how large civil lawsuits against companies often incentivize others to improve their products, since fewer lawsuits equal more profits. Similarly, the likely future for social media companies will include much higher penalties for their perceived societal harms. Today, we penalize a factory for polluting the air and water in order to deter future pollution; in the future, the damage done by social media will be treated like pollution. That means social media companies will have a much stronger incentive to reduce harm at the source, which is why they will need to incorporate more friction into their processes. More communication on a platform will no longer equal more profit if a substantial share of that platform's revenue is subtracted from its balance sheet as penalties for unmoderated content.

I write this from firsthand experience, as my career puts me at the intersection of big tech, policymakers, content moderation, media, startups, and our evolving relationship with technology. I currently sit as a founding advisor on TikTok's content advisory council and am the founder and president of a responsible tech organization called All Tech Is Human. Working with a diverse range of stakeholders has offered a window into the problems social media platforms currently face and how we might create a better tech future. For social media, a better tech future requires innovative user friction to reduce the spread of misinformation, cyberbullying, doxing, hate speech, and other bad behavior. Not only is that the right thing to do; it will also be the more profitable path for social media platforms. To get there, however, platforms will need to incorporate FrictionTech.


What Is FrictionTech?

FrictionTech involves methods and tools that tilt users toward more reflective forms of interaction. Similar to nudging in choice architecture, which has become a major design method for platforms, FrictionTech aims to put small hurdles into online interactions in order to prevent damaging reactive communication and behavior. Two recent examples of major platforms using FrictionTech include Instagram’s attempt to reduce cyberbullying and Twitter’s foray into slowing down the spread of misinformation.

For Adam Mosseri, the head of Instagram, reducing cyberbullying means stopping offensive comments before they're posted. Content moderation, which relies on a mix of human moderators and AI systems using natural language processing (NLP), is more successful when there is less offensive material on the platform overall. That's why Instagram announced last year that it has been testing an AI system that scans comments for potential cyberbullying and then presents the individual with a form of friction: "Are you sure you want to post this?" The individual is not prevented from sending the message, but the slight delay may encourage them to be more reflective and cognizant of how their words may affect the intended recipient and, ultimately, the overall information ecosystem.

“This intervention gives people a chance to reflect and undo their comment and prevents the recipient from receiving the harmful comment notification,” Mosseri said last year, explaining Instagram’s new tool for reducing cyberbullying. “From early tests of this feature, we have found that it encourages some people to undo their comment and share something less hurtful once they have had a chance to reflect.”
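
To make the mechanism concrete, here is a minimal sketch of what a pre-post friction check can look like. It is not Instagram's implementation: the toxicity scoring below is a toy keyword heuristic standing in for a trained NLP classifier, and every name in it (score_toxicity, submit_comment, the confirm callback) is hypothetical.

```python
# Minimal sketch of a pre-post friction check. The keyword list is a toy
# stand-in for a trained NLP classifier; a real system would combine model
# scores with human review.

FLAGGED_TERMS = {"idiot", "loser", "stupid"}  # hypothetical flagged vocabulary
FRICTION_THRESHOLD = 0.2  # hypothetical score above which the prompt fires


def score_toxicity(comment: str) -> float:
    """Naive heuristic: the fraction of words that appear in the flagged set."""
    words = comment.lower().split()
    if not words:
        return 0.0
    return sum(w.strip(".,!?") in FLAGGED_TERMS for w in words) / len(words)


def submit_comment(comment: str, confirm) -> bool:
    """Post a comment, inserting a friction prompt when the score is high.

    `confirm` stands in for the "Are you sure you want to post this?" dialog
    and returns True if the user decides to post anyway. Nothing is blocked
    outright; the user is only asked to pause and reconsider.
    """
    if score_toxicity(comment) >= FRICTION_THRESHOLD and not confirm(comment):
        print("Comment withdrawn by the user.")
        return False
    print(f"Posted: {comment}")
    return True


if __name__ == "__main__":
    submit_comment("you are such a loser", confirm=lambda c: False)  # prompt fires, user backs off
    submit_comment("great photo, congrats!", confirm=lambda c: True)  # no prompt, posts normally
```

The key design choice mirrors what Mosseri describes: the check never blocks the comment outright; it only inserts a pause in which the user can undo or soften it.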

In a similar vein, Twitter has been searching for ways to curb the spread of misinformation. Although we often think that misinformation stems from bad actors and nefarious bots, a major part of the problem on social media comes from regular users sharing misinformation uncritically. That's why Twitter is incorporating friction into the process, moving a user from reactive to reflective, by trying out a new feature that asks, "Want to read this before retweeting?" Just like Instagram's form of friction, a user isn't prevented from communicating, but they do face an additional step that may alter their behavior.

Speaking to Consumer Reports, which reported on Twitter’s new feature in June, a company spokesperson shared their rationale for this update: “In an effort to help start healthier conversations we want to help people know what they are sharing. So when someone is about to retweet an article but they haven’t clicked into the linked article, they’ll see a prompt asking if they’d like to open the article before sharing.”
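
Mechanically, this kind of prompt reduces to a simple state check at retweet time: has this user opened the link they are about to amplify? Below is a minimal sketch under that assumption; the names (UserSession, retweet_with_friction, the confirm callback) are hypothetical and make no claim about Twitter's actual implementation.

```python
# Minimal sketch of a "read before you retweet" check, assuming a hypothetical
# per-user record of which links have been opened. Names and structure are
# illustrative only.

from dataclasses import dataclass, field


@dataclass
class UserSession:
    user_id: str
    opened_links: set = field(default_factory=set)  # URLs the user has opened

    def open_link(self, url: str) -> None:
        self.opened_links.add(url)


def retweet_with_friction(session: UserSession, article_url: str, confirm) -> bool:
    """Retweet, but prompt first if the user hasn't opened the linked article.

    `confirm` stands in for the "Want to read this before retweeting?" prompt
    and returns True if the user chooses to retweet anyway.
    """
    if article_url not in session.opened_links and not confirm(article_url):
        print("Retweet paused; the user went back to read the article.")
        return False
    print(f"{session.user_id} retweeted {article_url}")
    return True


if __name__ == "__main__":
    session = UserSession("user_123")
    # The link hasn't been opened, so the prompt fires and the user backs off.
    retweet_with_friction(session, "https://example.com/article", confirm=lambda u: False)
    # After reading, the same retweet goes through without a prompt.
    session.open_link("https://example.com/article")
    retweet_with_friction(session, "https://example.com/article", confirm=lambda u: True)
```

As with the Instagram example, the user can still retweet with one extra tap; the friction is in the pause, not in a hard block.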

In the past, Twitter would simply have wanted more total retweets and Instagram more total comments. But the equation for social media has changed because frictionless communication has clearly created substantial problems with cyberbullying and misinformation. Social media companies know that inflicting societal harms will carry severe financial consequences. That's why the future of social media depends on friction: finding innovative ways to nudge people away from their impulses and toward a more intentional form of communication.
