AI tool helps detect racial, cultural bias in the workplace
A key factor that often determines whether employees get a raise or a promotion is the performance review. As diversity and inclusion emerge as core values, managers themselves are now being reviewed to see whether unconscious racial or cultural bias has led them to treat people unfairly.
It’s being done using artificial intelligence.
The data security company Text IQ is offering a free unconscious-bias detector that analyzes how managers evaluate their teams.
“The machine could say, hey, when this reviewer is reviewing this particular group, they tend to use language that they don’t use when they review a different group,” explained Omar Haroun, co-founder and COO of Text IQ.
Text IQ did research with academics, ethics leaders and consultants to detect unconscious bias along racial, cultural and gender lines.
Here’s an example:
“When you talk about women, you tend to refer to how they have a bubbly and energetic personality, or certain traits about their personality as opposed to their actual work product, and when you’re reviewing men, you always talk about their work product,” said Haroun.
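The pattern Haroun describes can be sketched with a few lines of code. This is a hypothetical illustration of the idea, not Text IQ's actual method: the word lists, group labels, and sample reviews below are invented for the example, which simply counts personality-trait words versus work-product words per reviewed group.

```python
from collections import Counter

# Hypothetical word lists for illustration only.
PERSONALITY_WORDS = {"bubbly", "energetic", "friendly", "pleasant"}
WORK_WORDS = {"delivered", "shipped", "results", "output", "project"}

# Invented sample data: (group label, review text) pairs.
reviews = [
    ("group_a", "She is bubbly and energetic, a pleasant presence."),
    ("group_b", "He delivered the project and shipped strong results."),
]

def category_counts(reviews):
    """For each group, count personality-trait vs. work-product words."""
    counts = {}
    for group, text in reviews:
        c = counts.setdefault(group, Counter())
        for word in text.lower().split():
            word = word.strip(".,;")
            if word in PERSONALITY_WORDS:
                c["personality"] += 1
            elif word in WORK_WORDS:
                c["work"] += 1
    return counts

counts = category_counts(reviews)
# A large skew between groups would be the kind of pattern flagged for review.
```

A real system would of course use far richer language models than word lists, but the underlying question is the same: does a reviewer's vocabulary shift depending on who is being reviewed?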
It’s like holding up a mirror to managers so they can see their flaws. Their biases could be holding staff back from advancing, and could lead to retention issues or potential lawsuits.
The artificial intelligence looks for these patterns in the performance-review data itself.
“Are you just giving more time to certain employees that you’re, you know, have biases toward? Are you more of an advocate for them? Are you using language that is more of, I like this person?” said Maurice Ducoing, principal at Ducoing Human Capital.
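One of the signals Ducoing mentions, language that reads as personal advocacy ("I like this person") rather than an assessment of work, can also be sketched simply. The phrase list and sample text below are invented for illustration and are not drawn from any actual product:

```python
# Hypothetical advocacy phrases for illustration only.
ADVOCACY_PHRASES = ("i like", "i love", "my favorite")

def flag_advocacy(review_text):
    """Return any advocacy-style phrases found in a review."""
    lowered = review_text.lower()
    return [p for p in ADVOCACY_PHRASES if p in lowered]

flags = flag_advocacy("I like this person; always a joy to have around.")
```

Flagged reviews would not prove bias on their own; they would simply be surfaced so a manager or HR team can take a second look.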
Ducoing believes awareness of these patterns is important, along with training to correct them.
“If they don’t pay attention to this, that’s really what’s going to move the needle, and we’re going to go from maybe taking these baby steps to just jumping,” he said.