How Startups Are Using Tech To Try And Fight Workplace Bias

Gary Waters / Ikon Images/Getty Images

We all harbor biases — subconsciously, at least. We may automatically associate men with law enforcement work, for example, or women with children and family. In the workplace, these biases can affect managers' hiring and promotion decisions.

So when Pete Sinclair, who's chief of operations at the cybersecurity firm RedSeal, realized that — like many other Silicon Valley companies — his company had very few female engineers and few employees who weren't white, Chinese or Indian, he wanted to do something about it.

"I was trying to figure out, 'How do I expand my employment base to include those under-represented groups?' Because if we do appeal to those, we'll have more candidates to hire from," he says.

Sinclair figured the company was either turning off or turning down these minorities, so he turned to another software startup, Unitive, which helps companies write job postings that attract a broader range of candidates and structure job interviews to focus on specific qualifications, mitigating the effect of interviewers' biases.

Companies often err by using phrases like "fast-paced" and "work hard, play hard," which telegraph "mainstream male," says Unitive CEO Laura Mather. Instead, she encourages firms to use terms like "support" and "teamwork," which tend to attract minorities, in job descriptions.

Such adjustments seem to have worked for RedSeal: Sinclair says job applications shot up 30 percent, and the percentage of women among the company's three dozen engineers has doubled.

"Our last hire was a Middle Eastern woman who would've frankly, in the past, never applied for the job much less gotten hired, just because she didn't fit the mold of people we hired," he says. "And she's turned out to be one of our top team members."

Sinclair says the motivation to diversify wasn't altruism. His company competes with Facebook and Google for talent, so it had to look off the beaten path and draw from a more diverse pool.

The idea that everyone makes automatic, subconscious associations about people is not new. But recently companies — especially tech firms — have been trying to reduce the impact of such biases in the workplace.

Unitive's Mather says companies realize groupthink is harmful to the bottom line.

And research shows that "getting in different perspectives into your company makes your company more innovative, more profitable, more productive," Mather says. "All kinds of really great things happen when you stop making decisions based on how much you like the person's personality."

Unitive's software is based on social science research, including work by Anthony Greenwald, a psychologist at the University of Washington who developed the seminal implicit-association test in the 1990s. It measures how easy — or difficult — it is for the test-takers to associate words like "good" and "bad" with images of Caucasians or African-Americans.

Greenwald has tested various words and race associations on himself. "I produced a result that could only be described as my having relatively strong association of white with pleasant and black with unpleasant," he says. "That was something I didn't know I had in my head, and that just grabbed me."

No matter how many times Greenwald took the test, or how he tried to game it, he couldn't get rid of that result. He was disturbed, and also fascinated. Research indicates that unconscious biases tend to stay constant, he says, making them very hard to address within organizations.

"People who are claiming that they can train away implicit biases," he adds, are "making those claims, I think, without evidence."

So rather than trying to eliminate bias, Greenwald and other experts advocate mitigating its effects. Companies can remove identifying information from resumes, for example, or conduct highly structured job interviews in which every candidate is asked the same questions and scored on the same criteria.

Some organizations are trying such methods.

Gap Jumpers, for example, is a startup that helps companies vet tech talent through blind auditions, which test for skills relevant to the job. That allows companies to avoid asking for a resume, which might include clues to a person's race or gender, says Heidi Walker, a spokeswoman.

Plus, Walker says, "That allows the company to actually see how a candidate will approach and develop solutions on the job." And, she adds, half their applicants are women.

Still, unconscious biases can affect all sorts of workplace behavior and decision-making, so addressing them can be a challenge.

A year and a half ago, cloud-computing company VMWare started training managers to identify their own unconscious biases, then began tracking the hiring, retention and promotion of women, who make up a fifth of its workforce. The company also analyzed whether biases had seeped into employee evaluations.

It's been an eye-opening process, says Betsy Sutter, VMWare's chief people officer. "We have more work to do. A lot more work to do."

Copyright 2021 NPR. To see more, visit https://www.npr.org.

Yuki Noguchi is a correspondent on the Science Desk based out of NPR's headquarters in Washington, D.C. She started covering consumer health in the midst of the pandemic, reporting on everything from vaccination and racial inequities in access to health, to cancer care, obesity and mental health.