How Do You Launch a Career in Trust and Safety?

Trust and Safety is a vital part of making a better technological ecosystem, but the field lacks a clear entry point and career path. Here’s some advice for getting into this growing area.

Written by David Ryan Polgar
Published on May 4, 2022

The amount of hate speech, misinformation and other harmful content on social media platforms is a huge problem, one that touches nearly every aspect of modern life. Former President Obama recently spoke at Stanford University about social media’s problems, bringing his message about the impact of technology to the heart of Silicon Valley. At the same time, an equally passionate group of people question the legitimacy of social media companies' power to determine appropriate speech. So, what does an ideal social media experience look like? And who is best suited to find this happy medium?

For most Americans, the concept of free expression is central to our national identity. According to a recent Knight-Ipsos survey that asked more than 4,000 American adults about their feelings toward free expression, the “findings show that most Americans — regardless of race, party or age — overwhelmingly support free speech and expression, recognizing its importance to a healthy democracy.” In that same survey, however, only 30 percent of respondents believed that spreading Covid-19 vaccine misinformation would be a legitimate exercise of First Amendment rights. Therefore, an ideal social media experience is an environment that supports meaningful conversation and honors our professed commitment to free expression while also limiting personal and societal harms such as bullying, doxxing, hate speech, fraud, misinformation and disinformation.

That, of course, is an extremely difficult balance to master. To make matters more complex, the survey authors noted that our commitment to free expression may be colored by our political leanings and that we are “in a polarized environment where partisans disagree on what speech is acceptable depending on the politics involved, and independents’ views differ widely as well.” So, choreographing this dance between freedom and harm in a time of heightened political polarization is increasingly important in the tech industry.

What Is Trust and Safety?

That’s where the expanding field of Trust and Safety comes into play. The field is tasked with striking the delicate balance between too much censorship and too much harm. As defined by the Digital Trust and Safety Partnership, a new initiative focused on promoting a safer and more trustworthy internet, the term “refers to the part of a digital service’s operations that focuses on understanding and addressing the harmful content or conduct associated with that service … Each company is guided by its own values, product aims, digital tools, and human-led processes to make tough decisions about how to enable a broad range of human expression and conduct, while working to identify and prevent harmful content or conduct.”

According to the Trust and Safety Professional Association, a global community of professionals who develop and enforce principles and policies that define acceptable behavior and content online, upwards of 100,000 Trust and Safety professionals work in areas such as content moderation, policy enforcement, minor safety, digital civility and more. 

For a bit of perspective on how Trust and Safety has grown as a profession, when Charlotte Willner joined Facebook in 2007 — three years after its founding — she was just one of a handful of individuals at the company focused on reviewing content. The fast-growing platform had yet to codify its rules, which were then just one page long. Today, however, Trust and Safety teams at major platforms discuss their principles and values, have lengthy terms of service and community guidelines, and often release transparency reports providing a glimpse into their content moderation actions.

Facebook was typical of startups at the time: Trust and Safety was not a widely known field, and the traditional startup founder with a computer science degree would not have encountered it in the classroom. As Stanford’s Cyber Policy Center points out, “One of the root causes of the failure of the technology industry was a lack of intellectual scaffolding and educational content aimed at the leaders of large companies and new startups alike.”

That has largely changed in recent years, as growing concerns about the impact of social media on mental health, civility and the future of democracy have coincided with an expansion of Trust and Safety roles and the “intellectual scaffolding” supporting the field. Today, Willner is the executive director of the Trust and Safety Professional Association, Stanford has a Trust and Safety Engineering course and a new Journal of Online Trust and Safety, and this September will bring the First Annual Trust and Safety Research Conference.

Similarly, you can find a wide range of tech companies hiring for roles under the umbrella of Trust and Safety.

Who Does Trust and Safety?

But who works in Trust and Safety, and how do you enter the field? 

I’m the founder and director of All Tech Is Human, a nonprofit that aims to grow the responsible tech ecosystem, and a member of TikTok’s Content Advisory Council. Both of these roles mean that I’m heavily involved in the often politically perilous world of Trust and Safety. I see new positions intended to create a better social media experience cropping up daily, and I’ve seen the types of people who thrive in the field.

People who tend to do well in Trust and Safety are often hybrids whose backgrounds meld multi-sector experience across policy, engineering and product management. They have successfully merged their technical understanding with an appreciation for the nuanced trade-offs involved in limiting harms in digital spaces. Success here involves merging the fuzzy (i.e., humanities and social sciences) with the techie, placing equal weight on both “tech” and “society.”

After interviewing dozens of people working in Trust and Safety for All Tech Is Human projects and reports, I’ve found that those in the field share an ability to work cross-functionally, traversing multiple departments. Many of these professionals pointed to a “non-linear career path” that led them into the field. I came to see that a non-linear career path, one that is multi-sector and often multidisciplinary, is less an outlier than a clear advantage, as these individuals are now being rewarded for their broad backgrounds. Reading over dozens of Trust and Safety job descriptions confirms the trend: tech companies are looking for people who are comfortable engaging with multiple departments.

You’ll find this comfort with working across multiple departments and stakeholders to be a key feature of many Trust and Safety roles. YouTube, for example, points out that working in Trust and Safety at the company is a “cross-functional role that’s at the heart of YouTube.” The job description goes on to state that the employee will “work across nearly every department at the company. That means partnering with members of our legal, operations, public policy, product management, and engineering teams to solve problems and develop innovative ways to combat harmful content.”


Getting a Start in Trust and Safety

Finding an initial pathway into the field of Trust and Safety tends to be the biggest hurdle. If you want to be a UX designer, for example, a clear path into that career exists: the necessary education is well defined, and the role has clear responsibilities. Trust and Safety, on the other hand, is still relatively nascent as a field; best practices, job descriptions, desired education, and typical career trajectories remain nebulous. Nonprofits such as the Integrity Institute, the Trust and Safety Professional Association (TSPA), and the Oasis Consortium have popped up in recent years to provide access to training, mentorship and a greater sense of a connected ecosystem.

My advice if you’re looking to get into Trust and Safety is to tap into the engaged and supportive communities cropping up in the field and start boosting your credentials. Take online classes, such as this recently launched LinkedIn course and the TSPA’s Trust and Safety Curriculum. Instead of viewing a non-linear background as a problem in need of explanation, recognize it as a valuable asset that widens your lens for seeing different perspectives. Go into the field clear-eyed: it is loaded with political landmines and carries the risk of serious stress from witnessing the kind of harmful content that others should never have to see. The reward, on the other hand, is helping to build a better online experience.
