Not All Tech Problems Are Engineering Problems
The tech industry is the target of no small amount of angst these days. The Social Dilemma, a documentary featuring former tech insiders now on a path to moral redemption, is one of the top films on Netflix. It seems like everyone right now is angry at Big Tech and worried we may be headed toward a dystopian future. But why?
People like to think that this “techlash” is about the general public being angry with the technology itself. This view suggests that the problem can be solved by simply improving the quality of the industry’s products. That if algorithms are biased, we need to remove that bias. And that if facial recognition systems are faulty and riddled with errors, we just need to improve their accuracy. Thus, many people in the industry define the problem as an engineering challenge to be solved. Upgrade the tech, alleviate the angst.
The problem with this view is it fundamentally misunderstands the issue. “Techlash” isn’t about customer dissatisfaction. It’s about power.
This came into clear focus last month when a media firestorm erupted around U.K. schools that relied on an algorithm to determine scores for the all-important A-level exams after in-person exams were canceled due to COVID-19. Students in the United Kingdom who want to attend university must take the A-level exams, so a low score could derail a student’s academic future. After the test results came in, critics charged that the algorithm was biased and perpetuated social inequalities. The headline for a Wired article captured the feeling succinctly: “An Algorithm Determined U.K. Students’ Grades. Chaos Ensued.”
If your takeaway from the public’s outrage at this is that we need to work on solving algorithmic bias, you are missing the larger reason why this situation is such a concern for the general public. The anger isn’t about the algorithm’s faultiness. It’s about the power given to the algorithm in the first place. It’s about the powerlessness that people feel in determining their own futures. In this scenario, power shifted from a relatively transparent system with built-in checks and balances, in which multiple teachers with expertise in the subject matter made the determinations, to a system that feels opaque. This decision took a massive amount of control away from individuals, and they felt vulnerable because of it.
In other words, public concern around the proliferation of digital technologies, the ubiquity of social networks, and our increasing reliance on artificial intelligence is less about the technology and more about the massive power imbalance between innovators and those affected by their work. For example, a social network has the power to determine what information we do or don’t see, while also serving as a major conduit for forming friendships and mediating communication. The modern social network has become less a tool that an individual uses and more the curator of that person’s reality. That’s a lot of power.
For the user, that dynamic means that the social network holds the power to alter their future, perhaps even reshaping their overall identity and human experience. As a contentious U.S. presidential election approaches, social networks also play a massive role in controlling misinformation, disinformation, and potential calls for violence. The reason people affected by social networks are so concerned right now is not the underlying technology, but that social media is dramatically affecting our future without our say. People feel vulnerable to the whims of unelected leaders. Social networks are becoming intertwined with the future of democracy, so it stands to reason that people want a voice.
That is a big deal.
The tech industry constantly overlooks just how big of a deal this shift in power is by lumping all these concerns under the vague umbrella term “tech issues.” In 2020, every time we talk about technology, we’re really talking about power. All tech is human, which means that we need to view it through a sociotechnical lens. We can’t divorce technology from how it both impacts and is impacted by human behavior. So, as the developer of any digital technology, your responsibility will increasingly entail not only considering unintended consequences, negative externalities, and downstream effects, but also alleviating any massive power imbalances between your product and its users.
The revolt against technological disempowerment is already happening, and users will increasingly demand to be treated as citizens. We’re already seeing calls for explainable AI and for appeals processes for decisions made by artificial intelligence. Similarly, social networks are developing more appeals processes, such as Facebook’s in-house “Supreme Court,” formally known as its Oversight Board. The tagline for the Oversight Board is, “Ensuring respect for free expression, through independent judgment.” The Oversight Board has the power to bind Facebook with its independent decisions about what content can or cannot be on the platform. The message is clear: Power needs to be less consolidated in the hands of any one company, because the decisions made at organizations like Facebook about what type of speech is allowable and protected have massive consequences for the lives of billions of people.
Technology Is Not Just a Tool and Individuals Are Not Just Users
The tech industry often says that technology is just a tool. It’s all about how an individual uses it. You have likely heard the analogy that compares technology to a hammer, and each user makes decisions regarding what they do with it. The gist is that technology, like a hammer, is value-neutral. The hammer can be used to build a house or bludgeon another person, but it’s not the hammer’s fault if it becomes a murder weapon. That action stemmed entirely from the behavior of the user.
This analogy is seductive in its rationale, since the idea that our work is ethically neutral spares us from grappling with many hard questions, but it is deeply flawed. Technology is not a hammer. Once someone buys a hammer, the individual assumes total responsibility for determining how best to use it. The technology I am referring to, on the other hand, is not a tool we are simply using. While a hammer doesn’t nudge a user toward a particular path, our technology does. And while a hammer doesn’t change its shape to conform to the person using it, our technology is hyper-personalized around the background and possible future of its user. Digital technology, artificial intelligence, and social networks are simultaneously influencing us and being influenced by us. Technology is us.
When we think about technology as a simple transaction between a product or platform and its users, we incorrectly frame the relationship as a one-way exchange. The case of selling someone a hammer is a clear one-way relationship. Once the hammer is sold, the interaction between the company and the buyer is severed. The individual uses the hammer as they see fit with no further interaction with the company necessary.
Technology, on the other hand, has become intertwined with our very existence. As a result, we will likely soon start treating tech companies much as we treat our political bodies. Tech companies, by their sheer importance in shaping our future, will take on the trappings of elected representatives, requiring greater oversight and expanded bureaucracy. We can already see this dynamic playing out in the way people speak of their First Amendment rights on social media. Of course, the actual language of the First Amendment states that “Congress shall make no law ... abridging the freedom of speech.” Has Facebook become Congress? People are relating to social media companies as citizens do to their government, with the companies taking on the power and regulatory role that traditionally resides in governmental bodies.
We have claimed for years that social media is the new public sphere without also noting that public spheres entail a significant amount of negotiation around individual rights and expression. For example, our governmental enforcement of speech is not supposed to be arbitrary and capricious. Laws should be applied evenly, no matter who the speaker is. Checks and balances are baked into the system to offer oversight on the fairness of speech moderation. That’s not the way social media platforms were set up, but it is the direction they’re headed. Terms of Service and Community Guidelines are the equivalent of laws for social media. The growing pains we are witnessing derive from citizens demanding their online public sphere be treated in a similar fashion as their offline one. That makes social media companies like the government, and their users are now the citizens. Technology is democracy.
No Application Without Representation
When we think about the events leading to the American Revolutionary War and the founding of American democracy, we often think of the Boston Tea Party and the colonists’ revolt against the British Parliament’s passage of the Tea Act of 1773. “No taxation without representation,” went the slogan. It felt offensive to the colonists to have their lives dramatically impacted without the recourse of self-government. They wanted more power and control over their lives.
Enter our current moment in tech. Similar to the colonists’ revolt, which was about power and recourse, today’s pushback stems from the feeling that important decisions are being made about us but without us. Every piece of technology developed and deployed today no longer operates within a product-to-user relationship, but within something closer to a government-citizen relationship.
The technology that we are developing and deploying is now intertwined with the future of our society. What the public is saying is, “No application without representation.”