Although a wide swath of society may adopt a piece of technology, such products are typically developed by a small number of people. This dynamic presents a major problem when tech companies make incorrect assumptions about the mood and needs of the general public. Virtual reality, for example, has been the “next big thing” since I was in high school. More recently, Sergey Brin laid out the case in a 2013 TED talk for why society would move away from staring at our smartphones and instead adopt Google Glass and other forms of augmented reality. Nearly a decade later, the masses are still staring at smartphones. In my opinion, a large disconnect exists between the tech future envisioned by a Silicon Valley founder and what someone on Main Street, USA, actually wants.
For example, will the general public get as excited about the metaverse as Mark Zuckerberg seems to be? Personally, I think the metaverse is massively overhyped and showcases the disjunction between the desires of the creator class and the people who will actually adopt the products. Facebook’s recent name change to Meta, however, suggests it’s confident that the currently niche metaverse will eventually go mainstream. But does Meta truly understand what people outside its offices want? After all, success in Silicon Valley depends on aligning your products with public demand.
As the founder and director of the nonprofit All Tech Is Human, which unites the public, government and industry to tackle thorny tech and social issues, I have developed a simple maxim that applies whenever a tech company aims to improve our tech future: You can’t align technology with the public interest unless you understand the public interest.
Aligning Social Media With Public Interest
Let’s take the example of Instagram, which has been trying to address growing concern about its effect on wellbeing, particularly for young people who must navigate feelings of inadequacy fueled by constant comparison with others. Parents and policymakers are upset with Instagram because they feel the company isn’t doing enough to keep children off the platform.
According to Instagram’s terms of service, individuals must be 13 years of age or older to join the platform. The age cut-off, which is standard across social media platforms, is designed to avoid the enhanced scrutiny that platforms with underage users face, as well as restrictions around collecting children’s personal information. The Children’s Online Privacy Protection Act, enforced by the FTC, looms large for platforms that might violate its rules, making them subject to substantial fines. This was even a major plot point in HBO’s Silicon Valley.
Instagram, then, faces a lot of pressure to ensure that children are not on its platform. As a solution, the company created a youth-focused platform called Instagram for Kids, designed to keep young people off the main app. Although Instagram’s “solution” would likely have dramatically reduced the number of children on its original platform, it did not satisfy the public. A successful answer requires truly understanding the public’s anger and what success would look like from its perspective. This gap between how Instagram and the public perceive the underlying problem is why, in my opinion, Instagram for Kids was met with such hostility.
What Does Success Look Like?
Let’s unpack the problem’s underlying details. As we’ll see, the issues plaguing social media are incredibly complex and nuanced, with competing values and viewpoints. Yet these debates tend to flatten everything into a one-dimensional issue, typically focused on pressuring a group to act rather than unpacking the complexity to determine how best to act. This framing rests on the assumption, which I believe is wrong, that most of the problems facing the tech industry (e.g., misinformation, hate speech) stem from a lack of concern on the industry’s part.
In my experience, the bigger issue is that the tech industry does not have a clear understanding of how to act. In this scenario, Instagram was compelled to act, and it did, but it never truly understood what would satisfy the public.
Right now, parents are upset that social media platforms aren’t adequately enforcing their 13-plus rules. Tech companies are themselves angry at some parents who knowingly allow their underage children on the platforms, thereby breaking the rules. Companies also struggle with the fact that children can easily lie about their ages. Policymakers are angry at tech companies and their lack of progress in reducing the number of children on major social media platforms, but the companies feel that they’re being asked to confront issues that have no clear technical fix and therefore fall outside their purview.
For instance, age gates aimed at preventing underage individuals from joining a platform are far from a panacea and are riddled with trade-offs, typically around privacy. In the real world, if minors try to buy alcohol, a physical gatekeeper prevents the purchase by verifying their age. But moderating digital spaces requires finding a scalable solution. Facial recognition, which universities have employed to reduce cheating on remote tests (itself leading to significant pushback from students), could work, but is that what the general public wants?
Because lying about their age isn’t difficult for kids, platforms like Roblox are testing out a new age verification method. To use the Roblox voice chat feature, new users must take a selfie with their government-issued ID card. The difficulty with Roblox’s plan, which is the trouble that every platform faces, is understanding the limitations of a technical fix for a problem that involves human behavior and politics. None of today’s widely discussed tech problems, such as reducing hate speech, misinformation, and the number of children on 13-plus platforms, can be solved through a technical solution alone.
The issues that social media platforms are struggling with are extremely nuanced and complex. They can’t be solved by the actions of a single stakeholder group, whether that’s better regulation from policymakers, safer environments from platforms, or digital citizenship curricula in schools. In the case of children under the age of 13 on Instagram, the dilemma has been reduced to this: There shouldn’t be children on Instagram, so Instagram needs to keep children off its platform. If only Instagram cared more, there wouldn’t be a problem.
Unfortunately, improving social media is not that easy.
Better Engineering Isn’t the Answer
When Instagram’s plans for Instagram for Kids became public in 2021, the new platform was immediately skewered, and the company was forced to pause the rollout. Although Instagram emphasized how easily kids can lie about their age and skirt age gates, and argued that the proposal would provide a safer online experience, the public was unimpressed.
Instagram had solved a problem, but the solution wasn’t satisfying because the public’s unease with Instagram goes far deeper than the proliferation of children on the main platform. Concerns about social media’s impact on wellbeing, heightened by the release of internal company documents by whistleblower Frances Haugen, eroded the public’s trust that the company would place the best interests of children over its need to expand its user base and profits.
In a vacuum, Instagram for Kids is an effective technical fix. In the real world, however, it misreads the public’s true problems with the platform. Instagram thinks the fix is about reducing the number of children on its 13-plus platform, whereas parents largely just want the company to better safeguard its main platform and stay out of their children’s lives altogether.
How Can Social Media Companies Improve?
Social media companies need to be more in tune with the public interest. When I approach thorny dilemmas involving tech and society, I use a simple three-part process that helps illuminate a better path forward.
1. Fully understand the complex problem
Fixing social media’s issues requires involving multiple sectors and disciplines that can illuminate the underlying, complex problem and point toward ways to improve the current situation. In the HX Report: Aligning Our Tech Future With Our Human Experience, we at All Tech Is Human examined the interplay of areas such as law, ethics, psychology, and design, with the underlying premise that “fixing tech” requires actively including perspectives far beyond the traditional technologist. Technologists tend to propose technical fixes, but the problems of social media encompass education, human agency, parenting, mental wellbeing, business models (most platforms rely on advertising), regulations, and many other areas. Society isn’t solvable through engineering fixes.
2. Recognize the Trade-Offs
No perfect solution exists because the general public massively disagrees about how to weigh competing interests. In order to better align our tech future with the public interest, we, as a society, need to understand that every solution to a problem is a value judgment on how we weigh these interests, especially around safety and security.
For example, when Apple announced in August 2021 that it was incorporating enhanced scanning for child sexual abuse material, advocates focused on reducing online abuse cheered, while privacy advocates worried about what this new level of scrutiny might bring. Apple has since removed any mention of the feature from its website; the company was caught flat-footed for not properly recognizing the trade-offs involved with its proposed solution.
3. Co-Create an Ideal Tech Future
In order for the public to be happy with the destination, it needs a roadmap and a vision of what an ideal tech future looks like. The process of developing that vision needs to involve a broad range of stakeholder groups. Social media platforms, for example, rely on wide adoption by young audiences but do not typically include youth viewpoints in their design processes or content moderation decisions. This, eventually, will have to change.
In terms of Instagram for Kids, the public pushed back so aggressively because the company’s vision of success (fewer children on Instagram because they have moved to a safer space for those under 13) is not what the public has in mind. In other words, Instagram thinks it is succeeding by moving children off its main platform, but the public is unsatisfied because children would still be on social media and affiliated with a company they may distrust.
Instagram, like other social media platforms and digital technologies, deeply impacts people’s lives. That’s why there are so many discussions about ways to improve the tech industry and lessen its harms on society. But in order to improve tech, tech needs to understand society.