2020 has come and gone, and for all the hardship and trauma it left in its wake, the silver lining for me was having the curtain pulled back on the underlying systems that determine much of our day-to-day lives. The George Floyd protests, coupled with the inadequate handling of the pandemic, illuminated how flawed our world is and how much room we have to improve. Although many people were aware of this before 2020, it became clearer than ever that human biases pervade everything around us. Even those “objective” disciplines that we’ve held as nearly sacred, like science and technology, clearly exhibit the same biases we see elsewhere. But how could it be otherwise, when these systems, like much of the world we interact with, were built by humans who are just as flawed as the rest of us?

How did we end up with platforms or products that interfere with global geopolitics? Who do we blame when a piece of software isn’t working for people of a certain gender or race? The tech world has been forced to reckon with the opening of this particular Pandora’s box; is it possible for us to now force it shut? And, if we do manage to get a grip on the chaos, who bears the responsibility for ensuring we don’t make the same mistakes again?

We could easily point the finger at individual engineers, designers or even CEOs and accuse them of malicious intent. Identifying a villain helps us simplify a complex problem and feel like we’ve done our part if and when that villain faces consequences. But the reality is much messier. As we saw in so many ways last year, the system itself is broken.

 

The Ethics of Technology

When you look at a website or an app, it’s easy to feel like it’s a polished product made by a team of professionals who carefully weighed every option and possible outcome. The reality, of course, is that it’s more likely to be the work of a single engineer working with a single UX designer (if they’re lucky) making a series of decisions they thought were right at the time. Sometimes, they may even have known they weren’t making the right decisions for the long-term life of the product. Nevertheless, they made them anyway, naively thinking, “We’ll go back and fix that in a few months.” Algorithms mirror and, in many cases, magnify the thought patterns of their creators. And given the demographics of software engineers (who are overwhelmingly straight, white, cisgender males), it shouldn’t be surprising that most modern technologies feel like they were made for that very group of people, with everyone else an afterthought.

Even beyond gender and racial biases, we have to consider the impact that business decisions have on our societal well-being. Movies like The Social Dilemma have exposed laypeople to issues underlying the tech sector that have been common knowledge within tech circles for years or even decades. Tech products and the companies that make them have goals that are often fundamentally misaligned with mental wellness. “Doom scrolling” on apps like Instagram is good for advertisers, but it’s almost certainly a contributing factor to increased rates of anxiety and depression in the past decade, especially among young people. “Doom swiping” on dating apps encourages similar behavior, counterintuitively making the swiping itself, rather than actually going on dates, the addiction. Users who swipe endlessly with zero intent of talking to, much less meeting, the people they match with are good for dating app companies: they present a low risk of churn (leaving the app) while simultaneously disrupting the experience for everyone else. It’s a self-perpetuating cycle.

So how do we approach the systems that have made the tech industry what it is today? I can’t pretend to have all the answers, but there are some small steps we can start to take to make technology work for all people instead of a select few.

 

Companies and Product Builders

Honesty Over Innovation

Look, innovation is great. I could nerd out over tech advances for days, and I’m not saying that breakthroughs aren’t important. I just think that making innovation the primary cultural pillar at a company allows morals, ethics and honesty to fall by the wayside. We’ve seen this happen time and time again, from Theranos to Volkswagen. People wonder how these types of companies can get so far on a lie, but it’s not hard to see how this could happen when innovation or even just the appearance of innovation takes precedence over everything else. Founders in particular have the ability (and, some might argue, moral obligation) to embed this ethos from the very beginning, which lays the foundation for everything that comes next.

Personal Responsibility

Once an organization establishes a culture of honesty, the engineers and designers who make the day-to-day decisions about a product must assume more personal responsibility. It can be hard to see the potential long-term effects of individual decisions from the ground level, but playing out these scenarios on a regular basis is a moral imperative when each choice can easily impact millions or billions of people. Managers and teammates alike must exercise empathy both for the users of a product and for the product builders who flag potential issues. Too often, those who try to sound the alarm are overlooked or even punished for their efforts because of someone else’s fixation on short-sighted personal outcomes.

Get Creative (Like, Actually Creative)

Even as we laud the tech industry for its seeming abundance of creativity, companies fixate on building businesses atop the same broken models. Ad-focused revenue models, though clearly lucrative, rarely align with the best interests of the people interacting with them. Subscription-based models make investors giddy but can also feel predatory in certain implementations. Limiting the scope of “investable businesses” to just these two frameworks is the result of a lack of imagination. At the risk of coming across as Pollyanna, I really do believe there are other untapped models out there that can align business incentives with customer incentives.

Employ Diversity at the Highest Levels of the Company

It might be hard to believe, but some people still recoil when they hear that maybe they should start hiring more diverse teams or promoting a diverse group of team members to higher-level management and executive roles. The biggest concern that gets repeated is that hiring for diversity lowers the overall quality of the workforce since, these people argue, you’re making exceptions for lower-quality candidates. The reality is that unraveling the myriad biases people bring to candidate screening is difficult, time-consuming work, and it’s work we need to start doing now. Plus, it’s good business. Employing the very people who belong to a community that a company is trying to reach means a brand will be perceived as more credible, genuine and trustworthy. All of those qualities will translate to better sales.

 

Users and Consumers

Educate Yourself and Understand Your Role

Businesses are, by their very nature, driven by profit. While this isn’t unique to tech companies, the profit incentives become a lot more personal when tied to the collection of human data. Preferences, ideas and even subconscious inclinations once kept private now fuel some of the biggest industries in the world through digital ad sales. Companies build addiction into products in an effort to increase time spent on a platform, which then translates to increased ad revenues. Consumers obviously shouldn’t shoulder the burden of company and product decision-making, but they can work harder to understand the business models and teams behind a product or service before using it. Small steps, when taken by enough people, can help shift these business models. Valuable practices include turning off notifications, unplugging at certain times and supporting products and businesses that align with your values in a genuine way.

 

A Better Future

I’ve heard for years now that the tech industry is facing its reckoning. I’ve likewise regularly heard that something has to be done not just to stop the spread of misinformation, but also to scale back humanity’s addiction to screens. I, like many others, have shrugged off these calls to action in the past because it never felt like anything could be done. But if there’s one thing we can collectively take from 2020, I hope it’s that we put away that defeatist attitude and work toward real, tangible solutions to our problems. I’m cautiously optimistic that 2021 could finally be the year we chart a new course.
