How AI Is Diversifying VC Funding

Venture capitalists are taking a step toward eliminating unconscious bias from their decision-making with AI-powered blind screening.

Written by Barb Hyman
Published on Mar. 08, 2024

California has adopted a new law requiring venture capital firms to collect diversity data from portfolio company founders. Many government agencies and countries, however, have already been measuring diversity for years — with little impact. Recent data highlights the stark disparities when it comes to underrepresented groups in venture capital funding.

Statistics on VC Funding for Minority Groups

In 2023, companies founded solely by women received just 2 percent of total capital invested in venture-backed startups in the U.S. In Australia, a staggering 97 percent of Series B investments during Q3 of 2023 went to all-male teams, with disparities evident even at earlier funding stages. And Black founders raised just 1 percent of all venture capital funds in 2022.


How Bias Limits Opportunities for Minority Founders

Investing in a startup is a risky decision made with little data. Usually there’s no company track record to take into account and, at seed stages, there isn’t even a product to test and evaluate. It’s a big, gut-driven gamble.

Venture capitalists must rely only on the information they do have: Did the founder work or go to school somewhere impressive? Have they successfully sold a product before? Do they match the profile of other successful founders? Answers to questions like these are helpful shortcuts in decision-making. But they're also the perfect example of how the road to bias is paved with thoughtful questions.

Human decisions are inherently biased, and so is the VC decision about whether to invest in a specific founder. Moreover, some criteria are biased without seeming so on their face. For example, many VCs won't invest in a first-time founder. They believe founders need to make mistakes before figuring things out. While that may not sound particularly biased, it severely limits the founder pool. How many women are two- or three-time founders? How many Black founders?

 

What Is Blind Screening?

Now, with AI, venture capitalists can use blind screening to help them make unbiased assessments of founders, eliminating subjective judgments that have historically disadvantaged entrepreneurs from underrepresented groups. Through this approach, venture capitalists can focus on identifying essential traits for success, such as resilience, strategic vision and adaptability.

By conducting AI chat interviews, VCs can focus on plain text, one of the fairest and most reliable ways to get information about a person's behavioral traits and overall ability, without being influenced by candidates' appearance, race, gender or anything else that could lead to bias.
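As a simplified illustration of what blind screening can look like in practice, the sketch below strips identifying fields from a founder's application before any review happens. The field names and data layout are assumptions made for this example, not a description of any particular platform.

```python
# A minimal sketch of blind screening: remove fields that could reveal
# identity or demographics so reviewers (human or machine) only see the
# founder's own words from the chat interview.
# The field names below are illustrative assumptions, not a real schema.

IDENTIFYING_FIELDS = {
    "name", "email", "photo_url", "gender", "age",
    "university", "previous_employers", "linkedin_url",
}

def blind_screen(application: dict) -> dict:
    """Keep only free-text interview answers, dropping identifying data."""
    return {
        field: answer
        for field, answer in application.items()
        if field not in IDENTIFYING_FIELDS and isinstance(answer, str)
    }

founder_application = {
    "name": "Jordan Example",
    "university": "Example University",
    "q_resilience": "When our first product failed, we interviewed 40 customers and relaunched within three months.",
    "q_vision": "Over five years we plan to expand from payroll into broader financial tooling.",
}

anonymized = blind_screen(founder_application)
print(list(anonymized))  # ['q_resilience', 'q_vision']
```

Only the interview text moves forward to trait assessment; everything that could trigger appearance-, school- or network-based bias is filtered out before anyone, or any model, sees it.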

AI-driven analysis can help ensure that funding decisions are based on merit and potential rather than preconceived biases, especially unintentional ones. AI vetting can also put an end to mirror investing, in which decision-makers copy the strategies of successful investors and potentially copy their biases along with them.

AI chat interviews can pave the way to a more equitable and inclusive funding landscape. And increasing diversity is ultimately good for the bottom line. A recent study of public companies found that diversity in the boardroom was linked with better financial results.


Challenges of Adopting AI in Investment Spaces

One hurdle to adopting this process is the belief that automation diminishes the human experience. As chief executive officer of an AI recruitment platform, however, I know firsthand that AI can overcome that skepticism, but only if it proves it is unbiased, valid, explainable and inclusive.

This is why, for example, we have a detailed technical manual that provides insights into how our machine learning works and have published some of our core research in peer-reviewed journals: to maximize transparency, explainability and fairness. It is also why we have formulated a framework for implementing fair AI in recruitment.

Additionally, AI development teams must ensure users' data privacy and security, diversity within their own teams and transparency. Bias testing must be broad and continuous, and models should be explainable and rule-based, with a human in the loop from the outset. This approach lets the user know, and share internally, exactly what the technology is looking for.

This is different from the standard approach of building a machine learning model from a historical dataset. Testing the rule set against a norm group can surface potential biases before launch, reducing the risk of going live with a biased model.
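To give a concrete sense of what such a bias test can involve, the sketch below applies one widely used check, the four-fifths rule, comparing the rate at which a rule set recommends candidates across demographic groups. The groups and counts are invented purely for illustration.

```python
# A simplified bias test: compare recommendation rates across groups and
# flag any group whose rate falls below four-fifths (0.8) of the highest
# group's rate. The sample numbers are invented for illustration only.

def impact_ratios(recommended: dict, totals: dict) -> dict:
    """Return each group's recommendation rate relative to the best-served group."""
    rates = {group: recommended[group] / totals[group] for group in totals}
    best_rate = max(rates.values())
    return {group: rate / best_rate for group, rate in rates.items()}

recommended = {"group_a": 48, "group_b": 41, "group_c": 22}
totals = {"group_a": 100, "group_b": 90, "group_c": 80}

for group, ratio in impact_ratios(recommended, totals).items():
    verdict = "OK" if ratio >= 0.8 else "review the rule set"
    print(f"{group}: impact ratio {ratio:.2f} -> {verdict}")
```

A check like this can run continuously as part of the broad, ongoing bias testing described above, and because the underlying model is rule-based, a flagged result points directly at the rules that need human review.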


Commit to Unbiased Screening

VCs will have to commit to a culture of data-driven decision-making, and AI can add more data points to the dataset. As investors of other people's money, VCs have a fiduciary duty to de-risk their decisions with quality, objective data.

I predict the first VC firm to mainstream anonymous screening of founders will see more limited partner money flowing in. AI can help make less biased decisions, and it can also help make more money. Everyone wins more.
