Emtrain Wants to Make Bias Training More Effective With Data
A white man in a gray hoodie is standing alone in an office conference room, eying the phrase written on a whiteboard: “Black Lives Matter.”
His face subtly shifts from a look of bemusement to a kind of irritated determination as he reaches for the red marker in the tray below. He pops the cap, draws a line through “Black” and scrawls “All” underneath it.
It’s a scripted scene — written and filmed by the in-house production team at Emtrain, a human-resources compliance company that makes videos and learning modules for harassment, bias, diversity and ethics training. But it’s a scene that’s all too easy to imagine playing out in the real world, and something that employers need to address head on.
“We try to create a safe space and framework to unpack the issues that are really tough to talk about.”
“We try to create a safe space and framework to unpack the issues that are really tough to talk about,” said Janine Yancey, a former employment lawyer and the founder of Emtrain.
The company prides itself on not shying away from ripped-from-the-headlines topicality, but it also aims to quantify employee sentiment and compliance — at both a micro, company level and a macro, industry level — as they relate to such sea changes.
In mid-July, Emtrain released the results of a survey that gathered 20,000-plus responses at 145 companies in the wake of this year’s twin watershed events: the protests for social justice and the beginning of the pandemic. Compared with results from a prior survey, Emtrain found a 7 percent increase in people reporting they feel compelled to minimize their heritage or personal identity in order to fit in, among other red-flag findings.
The shift should be “a wake-up call,” Yancey said at the time in a release. But the fact that Emtrain can quickly glean such a pulse check — and the generally high response rates that inform it — is, for Yancey, a sign of operational health.
“It’s exciting to see the level of engagement and the level of really qualitatively unique data that we’re getting,” she told Built In.
Bringing Analytics to Conversations About Bias
Emtrain brings a pronounced data analytics approach to an aspect of human resources — harassment and bias training — that traditionally hasn’t been as data-forward as other aspects of the function, like employee sentiment, employee retention and — in organizations that still consider it HR’s domain — recruiting.
Here’s how it works: Clients show employees a video that might draw an emotional response while also giving context to the topic being addressed. The videos aren’t always so explicitly of-the-moment; most focus on more commonplace transgressions. One revolves around a thoughtless coworker who deadnames and misgenders a trans colleague; another follows an employee who games company policy to expense her internet bill.
Viewers then gauge the nature of characters’ behavior on a color spectrum, ranging from green (respectful) to red (illegal).
“Your version of harassment might be different from my version of harassment.”
“Your version of harassment might be different from my version of harassment.... If you actually unpack that into individual indicators, then you can start to quantify employee sentiment around those indicators,” Yancey said.
Depending on how clients choose to administer Emtrain’s tools, the training can also include courses and micro-lessons completed on mobile devices. The full range of reactions is logged in a database, and that information is then aggregated into client and team reports — ones that, ideally, spot trends as they emerge. It’s also used to build benchmark industry reports like the one mentioned above. The company’s clients include BuzzFeed, Square, Yelp, Netflix and the New York Times.
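In code, the aggregation step described above might look something like the following minimal sketch. The four-color scale is inferred from the article’s description of a green-to-red spectrum, and every name here (`SPECTRUM`, `aggregate_responses`, the response format) is a hypothetical stand-in, not Emtrain’s actual implementation:

```python
from collections import Counter

# Hypothetical four-point scale inferred from the article's
# description: green (respectful) through red (illegal).
SPECTRUM = ["green", "yellow", "orange", "red"]

def aggregate_responses(responses):
    """Tally color-spectrum ratings and report each color's share per team.

    `responses` is a list of dicts like {"team": "sales", "rating": "yellow"},
    a stand-in for the logged reactions described in the article.
    """
    by_team = {}
    for r in responses:
        by_team.setdefault(r["team"], Counter())[r["rating"]] += 1

    report = {}
    for team, counts in by_team.items():
        total = sum(counts.values())
        report[team] = {
            color: round(counts.get(color, 0) / total, 2)
            for color in SPECTRUM
        }
    return report
```

Rolling the same per-team tallies up across many clients in one industry is, conceptually, how a benchmark report like the one above could be built.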
Yancey says Emtrain is careful not to show information to a client until responses reach a critical mass, both to keep the data from being skewed and to preserve respondents’ privacy.
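That kind of critical-mass gate is a common pattern in people analytics: withhold any aggregate until the sample is large enough that no individual can be singled out. A minimal sketch of the idea, with an assumed threshold of 25 responses (the article does not say what number Emtrain uses):

```python
MIN_RESPONSES = 25  # hypothetical critical-mass threshold

def safe_to_report(ratings, minimum=MIN_RESPONSES):
    """Return aggregated rating counts only once enough responses exist.

    Holding back small samples both avoids skewed statistics and keeps
    individual respondents from being identifiable in the report.
    """
    if len(ratings) < minimum:
        return None  # not enough data yet; show nothing to the client
    counts = {}
    for rating in ratings:
        counts[rating] = counts.get(rating, 0) + 1
    return counts
```

Until the threshold is met, the client sees nothing at all, which is what prevents a manager from inferring how one specific employee answered.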
A Brief Note About Production Value
One curiously overlooked value in getting good data? Production value. Anyone who’s sat through hopelessly dated, stiffly acted harassment-training scenes knows that nothing extinguishes serious engagement like a firehose of unintended irony.
“We’ve had 30 years of ridiculous harassment training where people are still in bell-bottom jeans,” Yancey lamented.
Emtrain invests proportionally more in video production than even large competitors, according to Yancey. Emtrain’s in-house crew emphasizes convincing performances, naturalistic dialogue and quick turnarounds. COVID-19-specific videos, for example, were written and in the can within three weeks. Yancey looks back at those decisions as seminal moments for the business, “because just like marketing, it’s all about the content.”
Predict, Don’t Just Report
Human resources has been around in some fashion at least since the Industrial Revolution, but the concept of HR analytics didn’t arise until the 1990s, alongside the emergence of PeopleSoft and Oracle. And even then, it was primarily a simple reporting function.
Things have gotten more advanced since then. In a 2019 Oracle survey of HR and finance professionals, 14 percent more HR respondents than finance respondents said they could use predictive or prescriptive analytics — notable, given that finance is often considered among the more analytically savvy functions out there.
But that shift traditionally hasn’t found its way to bias and harassment training. In the book Data-Driven HR, for example, futurist and strategy advisor Bernard Marr doesn’t mention either, coming closest only when discussing corporate culture analytics. Here, he pinpoints safe-but-not-always-reliable tools such as surveys, focus groups and employee interviews, plus ostensibly more accurate (but creepier) text and sentiment analysis of “internal intranet sites, social media and internal written communication.”
“Most of the metrics and measurements in all HR are lagging indicators.”
There’s also the general problem of after-the-fact reporting.
“Most of the metrics and measurements in all HR are lagging indicators,” said Mitch Zenger, an HR expert and founder of Synctrics, a startup focusing on HR systems integration and employee data ownership. The numbers might reflect who did or did not receive specific training or certification testing, but they often can’t add much in terms of trend-spotting. “It’s not really predictive of future value, future opportunity and [determining] what you need to be doing going forward,” he added.
Tom Penque, a lecturer in Northeastern University’s human resources management program, echoed the need to push further into predictive analytics. He also stressed that qualitative measurements — like harassment training — are harder to gauge than more quantitative ones.
These are the kinds of needles Emtrain is working to thread. In the wake of the wide-scale racial justice demonstrations, one of the company’s clients — a large Bay Area tech company that Yancey declined to name — terminated a team member who made inappropriate comments during a meeting. Previous Emtrain testing had also shown that the employee had limited experience with people from backgrounds different from their own — a potential influence on bias.
“If there’s a vulnerability, an area for opportunity, then we’ll tee that up through the data to the employer and they can see, ‘Oh, that’s an area that we want to watch,’” Yancey said. The client used Emtrain’s data “to help inform the decision, and they used our color spectrum to help communicate the reasons why.”
Of course, any time the conversation turns toward predictive analytics, the question of machine-learning predictive models follows closely behind. HR on the whole has been slow to adopt artificial intelligence: among company functions running AI projects, only distribution and legal rank lower, according to a recent O’Reilly survey.
That might not be the worst thing since, in certain corners of people analytics, some of those algorithms have proven, well, unfit for work.
A significant body of research has shown that, for example, facial and text analysis AI tools used for employee screening often reinforce bias. (Companies like Blendoor have emerged in recent years, dedicated to making these kinds of algorithms equitable.)
Emtrain uses machine learning only in one very narrow application. The platform includes an ask-the-expert feature in which clients can put questions to various subject-matter experts. As users type questions, a machine-learning text generation system populates other questions they might be interested in, based on their initial one. “In that sense, it’s relatively safe,” said Yancey, “and not going into any employer decision-making criteria.”
The focus must remain data-driven, but, ultimately, people-driven too. “We’re trying to partner with companies to optimize for good outcomes — good outcomes like ethics, respect and inclusion,” she said.