The adage that you can’t manage what you don’t measure has, understandably, also reverberated in diversity, equity and inclusion (DEI) initiatives. But it might need an addendum there: You can’t measure what you don’t protect.
Companies should gather and analyze employee demographic data for a variety of reasons, the most important being to improve DEI. Collecting some of it is also the law: Private employers with 100 or more employees are required to report gender and race and ethnicity data to the U.S. Equal Employment Opportunity Commission (EEOC). And improving DEI, which demographic data collection can help facilitate, is good for business in terms of both financial returns and talent acquisition.
“It’s a triple win, but it starts with the data you’re collecting,” said Aubrey Blanche, director of equitable design, product and people at employee engagement platform Culture Amp.
When Gathering DEI Data, Should You Stay Internal or Hire a Third Party?
- If you’re going with in-house outreach: Use encrypted survey tools with strong password protection, limit access to responses, allow respondents to edit replies and clearly communicate information about the process to employees.
- If you’re going with a third-party provider: Consider factors such as optimal threshold reporting settings, whether or not to include free text fields and whether employees are comfortable having survey software link with the company’s human resource information systems (HRIS).
Collecting that data responsibly, however, requires care. Employees who submit information through, say, Google Forms are trusting that the administrator has set a secure password and restricted access. Companies also have to be sensitive to the fact that employees’ comfort with disclosure may change over time, which means letting respondents edit their survey replies. Meanwhile, companies that merely gather the baseline EEOC-required demographic information at the time of hire, without following up with more expansive inclusion surveys, are left with little more than broad, general insights.
In short, collecting and safeguarding sensitive data is itself a sensitive task, whether a company attempts to do so in-house or enlists a third-party service.
Recognizing the Sensitive Nature
The first thing companies can do to handle diversity and demographic data responsibly is to acknowledge that it is indeed sensitive data.
Yes, some traditionally reported categories are often visually identifiable — but only to a degree. Last names and appearances don’t always suggest a person’s ethnicity or race, gender identity can’t be assumed and many disabilities are invisible. The degree to which someone is comfortable reporting any of that information varies by person.
So while employees who present as they identify might not consider such data sensitive, those who don’t demonstrate exactly why it is.
Shoshana Rosenberg, founder of SafePorter, which makes software for protecting diversity data, stressed the importance of being most careful with those who shoulder the most risk, even if that population is smaller. She explained by way of analogy: Most people moving between living spaces deal primarily with small furnishings and boxes, but the rare move that also includes a piano carries a greater risk of injury.
“That is where you take the greatest care,” she said.
Internal vs. Third-Party Subscription
After acknowledging the sensitive nature of DEI data, firms face a choice: handle surveys internally or sign up for a third-party service? Some smaller outfits, for instance, may decide that enlisting an outside party isn’t worth the added cost.
Companies that do opt to handle diversity data outreach in-house should remember a few key points when it comes to maintaining privacy, according to Blanche:
- Use survey administration tools that offer encryption and create strong password protection.
- Limit access and have strong guidelines about who can view both anonymized and aggregated data.
- Collect the minimum amounts of data needed to make decisions.
- Provide people the ability to amend or delete data.
- Reflect all this information clearly in communications to employees.
It’s possible to handle this all internally without signing up for a third-party service, “but the level of complexity required is high in order to do it in a responsible way,” said Blanche.
Considerations When Choosing and Implementing Outside Solutions
For companies that opt for a survey provider, there’s an abundance of options. The market specializing in engagement and DEI surveys has boomed in recent years, marked by high-profile acquisitions and big-dollar valuations. But would-be clients still have to be thoughtful not only about how they choose a platform, but also about how they implement it, in order to maintain data privacy.
Some of these considerations are more obvious than others. The importance of confidentiality, for example, is apparent. Tools that keep identities confidential by showing aggregates and percentages, unconnected to personally identifying information, put a premium on privacy while still delivering insights.
But other factors — such as threshold reporting settings, whether to include free text fields and decisions about integration with human resource information systems (HRIS) — are less obvious, but still important.
One important facet to understand is threshold reporting: Unless a certain number of respondents select a category, responses in that category are filtered out of any report that could identify an individual. It speaks to the central tension between the diversity and human resources side of the equation and the privacy and risk management side.
Culture Amp sets this number, what it calls the reporting group minimum, at five. Clients can adjust the figure, but that’s considered a good, broad standard for preventing identification while still limiting the potential for employee responses to be filtered out.
Similarly, to prevent indirect identification, Culture Amp heatmaps — which can break down feedback around inclusion categories like engagement, fairness and belonging along demographic lines — will hide the two smallest reporting groups if the smallest reporting group fails to hit the threshold.
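The mechanics described above can be sketched in a few lines. This is a hypothetical illustration of threshold reporting with secondary suppression, not Culture Amp’s actual implementation; the function name and the minimum of five are assumptions drawn from the description:

```python
from collections import Counter


def threshold_report(responses, minimum=5):
    """Aggregate categorical survey responses, suppressing small groups.

    Groups below the reporting minimum are hidden. If exactly one group
    falls below the minimum, the next-smallest group is also hidden;
    otherwise the hidden group's size could be recovered by subtracting
    the visible counts from the known total.
    """
    counts = Counter(responses)
    below = [group for group, n in counts.items() if n < minimum]
    suppressed = set(below)
    if len(below) == 1:
        # Secondary suppression: also hide the smallest group that met
        # the threshold, so the filtered group can't be inferred.
        visible = sorted((n, g) for g, n in counts.items() if g not in suppressed)
        if visible:
            suppressed.add(visible[0][1])
    return {g: n for g, n in counts.items() if g not in suppressed}
```

With ten responses in one group, six in another and two in a third, the two-person group is filtered out, and the six-person group is hidden along with it so the small group’s size can’t be back-calculated.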
Rosenberg also stressed the importance of keeping scarce categorical responses out of granular reporting, where a small sample size could leave an individual open to identification. The SafePorter system holds that information “in escrow” until enough similar responses clear a threshold, she said. The employer using the system would know that the employee’s group is underrepresented at the company, without any increased risk of inadvertently disclosing that employee’s identity.
At the same time, the lower a company’s headcount, the more weight, and the more identification risk, each individual response carries. So for companies with fewer than 100 people, SafePorter creates specialized survey tools outside of the threshold-supported system, “so we can get them the aggregate they need in a way that keeps people safe,” Rosenberg said.
Free Text vs. Multiple Choice
How surveys are built can have implications for diversity data privacy as well — something companies should be aware of when considering DEI engagement software and how they choose to set it up.
For instance, offering respondents the ability to self-identify in a text box helps people most fully describe their gender identity, but free-text fields can also risk reidentification, or matching anonymized personal data to a specific individual, as people could inadvertently include personally identifying information, according to Rosenberg and Blanche.
Blanche recommends that most companies use three gender options in surveys, with one encompassing all trans and gender-non-conforming individuals. (Culture Amp’s diversity survey template can list up to five gender identities and seven sexual orientations.) She noted that a free text space alongside three options can be an appropriate way to balance privacy, identification and data usefulness. If free-text responses are analyzed as their own group, though, as they are in Culture Amp, they likely won’t meet threshold minimums and will drop out of reporting, which is good for privacy, if not for actionability.
“What you’re trying to do from a utility perspective is break people into groups that are different enough that you’re learning about their experience, but large enough that you can get statistical power,” she said.
Rosenberg, on the other hand, eschewed free text altogether in the SafePorter design. Demographic survey respondents answer multiple-choice questions, with some sections supporting extensive subcategory options. For example, the race and ethnicity portion also includes a cultural heritage field that uses predictive search, populating from the array of options as the respondent types.
“If you’re Cape Verdean and Irish, whatever it is … you can select multiple things and really round out who you are,” said Rosenberg.
The aim is to keep the statistical significance of broader categories while still allowing fuller identity expression and privacy.
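The predictive-search field described above amounts to prefix matching over a fixed option list. A minimal sketch, assuming a case-insensitive match (the function name and the option list here are illustrative, not SafePorter’s):

```python
def heritage_suggestions(prefix, options, limit=10):
    """Suggest cultural-heritage options matching what the respondent
    has typed so far, via case-insensitive prefix search."""
    p = prefix.strip().lower()
    if not p:
        return []
    return [o for o in options if o.lower().startswith(p)][:limit]


# Illustrative option list; a real deployment would use a curated array.
OPTIONS = ["Cape Verdean", "Catalan", "Cherokee", "Irish", "Italian"]
```

Because respondents pick from a curated array rather than typing freely, every selection maps onto a known category, preserving aggregate statistics while avoiding the reidentification risk of free text.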
Integration with HRIS
Whether diversity data surveys touch other systems is also a consideration. Blanche and Rosenberg agreed that survey systems that bypass the HRIS can help create more buy-in.
According to Blanche, companies regularly miss up to 80 percent of disability data from employees, especially when surveys are tied to an HRIS. An HRIS is where employees go to complete administrative tasks like benefits enrollment and time-off requests, and, for many users, job-appraisal modules.
“Because the HRIS is more connected to compliance, performance reviews and compensation, it feels like a more visible place to put that data, and often feels like a bigger risk for people, especially from marginalized backgrounds, to put that data in that system,” said Blanche.
There are upsides to integrating with HRIS platforms, it should be noted. Integration can help avoid double-reporting requests, for instance. But companies should be aware of how employees perceive those systems before looping them into mechanisms for inclusion feedback.
After all, you can’t measure — or protect — what you haven’t been entrusted to collect.