This week, Vox called out a leading author for quoting a questionable statistic in The New York Times Magazine. The statistic in question concerned the effectiveness of “blind auditions” in technical recruiting. Created by a recruiting firm and widely repeated in media outlets, it claimed that removing identifying details from technical resumes improved the interview rate of women from 5 percent to 54 percent.
To most people with deep experience in D&I, those numbers sound unreliable. And indeed, they are. It turns out that no one can find the underlying study, and the company that conducted it has gone out of business. Yet the misleading statistic lives on across the Internet and will probably turn up in reports and slides for years to come.
Every industry has its share of “phantom statistics.” But dubious numbers can be especially dangerous in D&I initiatives, where the desire for a quick fix can lead decision-makers to invest enthusiastically, then back away when initiatives don’t show results quickly enough.
How to Keep Your D&I Program Evidence-Based
In the past five years, diversity and inclusion has leaped to the top of corporate agendas. That massive growth in demand has sparked an equal appetite for information, not to mention thousands of blogs and conferences, all of which are hungry for content. It can be difficult to know what research to trust. Here are some thoughts on how to make sure your D&I program is based on evidence:
Know the Source. The gold standard is academic research. Major universities like Harvard, Stanford, MIT, INSEAD and others frequently publish peer-reviewed research on questions of equity. Consulting firms like McKinsey, Mercer and Heidrick & Struggles are also doing credible work, particularly in designing and interpreting large-scale surveys of their client bases. Companies like Google and Airbnb, and major non-profit organizations like Catalyst, have the scale and resources to invest in meaningful research, much of which is designed by social science PhDs. With smaller companies, non-profits and news organizations, it is worth investing a few moments to look at their methodology (see below).
Evaluate the Underlying Methodology. Of course, there should be a published methodology; the source should disclose the structure of the research, including who was involved, the number and demographics of the participants, the core questions asked, any analytical methods used, etc. It’s also critical to consider how the structure of a piece of research affects the usefulness of the conclusions for your purposes. For example, some academic research measuring bias is conducted by putting undergraduate students or Mechanical Turk participants through hypothetical scenarios. While these experiments are valuable because they allow independent manipulation of variables in a controlled environment, the findings often can’t be translated 1:1 to workplace situations.
Consider Who Is Interpreting the Work. Obviously, a vendor—whether a recruiting firm, a software vendor, a speaker or a consultant—wants to sell their product and therefore has an interest in validating their view of what “the problem” is. It’s always reasonable to ask questions to determine how deeply that person has engaged with the underlying research. Less obviously, individuals often have more interest in, and knowledge of, one particular type of diversity than others, and will tend to see all problems through that lens. This can create its own kind of confirmation bias—accepting a lower standard of evidence for statistics that support one’s own point of view. When implementing (or approving) a D&I initiative, it’s important to seek perspective from a variety of stakeholders to ensure your own unconscious preference for one type of program isn’t influencing how you evaluate evidence.
We founded D&I In Practice to make sure that you have information you can trust to make business-critical decisions about diversity and inclusion. If you ever have a question about why we recommend an article or piece of research, ask us! You can always reach us at email@example.com.