Algorithms now impact nearly every facet of our lives, and that increasingly includes the workplace. From hiring and promotion to scheduling, algorithms play an essential role in analyzing data for decision-making and automating processes. They are even being used to advance diversity, equity, and inclusion (DEI) initiatives and to eliminate the human biases in hiring and promotion that restrict opportunities for women, people of color, and other historically excluded workers.
While these endeavors are admirable, algorithms aren’t foolproof. They can create problems of their own, discriminating against certain demographics of candidates during resume reviews or misidentifying people with facial recognition software. Even so, organizations recognize “the work that needs to be done to reach absolute fairness and eliminate bias,” says Wendy Rentschler, DEI lead at BMC Software.
Melissa Dobbins, who founded the software-as-a-service startup Career.Place to remove bias from the hiring process, says there’s more to it than just bad coding — the data sets can also be problematic, with “patterns built in based on our previous performance and decisions.” In many ways, algorithms merely pick up on those biases and replicate them.
While some workplace discrimination is explicit, much of it is embedded in decision-making that prioritizes factors such as education or skills in ways that, directly or indirectly, put white male candidates ahead of others. As a result, even today, for every U.S. company led by a woman, 13 are led by men, and women lead just 6 percent of companies in the S&P 500. Racial diversity is also lacking: just 18 percent of Fortune 500 corporate board members and 20 percent of S&P 500 board members are people of color.
Since companies with diverse and inclusive cultures experience lower turnover, the way employers use algorithms may also play a role in the so-called "great resignation" sweeping workplaces across the country. Employees are quitting because they’re fed up with employers that do not value them, and while business leaders often talk about employee engagement, in many companies it clearly isn’t happening.
Are algorithms a part of this problem, and can they be part of the solution? That depends, says Dobbins.
“Algorithms are tools that augment human processes, but they are not replacements,” she explains. “We can’t just throw a bunch of data at it, let the technology tell us what to do, and then blame the technology when we go too far in the wrong direction.”
Algorithms are a tool, not a solution
Over the past few years, and especially since last year’s wave of racial justice protests, companies of all sizes have doubled down on commitments to make their workforces and management more diverse and reflective of the communities where they do business. Yet, so far, the real-world results of these efforts have varied around the globe.
An increasing number of companies are turning to algorithms to remove bias from hiring and increase diversity among their ranks. “When it comes to talent acquisition, algorithms are using all types of different inputs: titles, keywords, patterns of usage within a system, then applying those patterns to predict or suggest an outcome,” Dobbins explains.
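To make that concrete, here is a minimal, purely illustrative sketch in Python of the kind of keyword-weighted screening score Dobbins describes. The keywords, weights, and candidate text are hypothetical; the point is that weights derived from past hiring decisions can quietly reward proxies that have little to do with the job.

```python
# Purely illustrative: a toy keyword-weighted screening score.
# Keywords, weights, and candidate text are hypothetical.

# Weights "learned" from past hiring outcomes. If past hires skewed toward one
# group, proxies for that group can pick up weight alongside real skills.
KEYWORD_WEIGHTS = {
    "python": 2.0,
    "project management": 1.5,
    "ivy league": 1.2,        # proxy for background, not a job-related skill
    "lacrosse captain": 0.8,  # pattern from past resumes, irrelevant to the role
}

def screening_score(resume_text: str) -> float:
    """Sum the weights of the keywords found in the resume text."""
    text = resume_text.lower()
    return sum(w for keyword, w in KEYWORD_WEIGHTS.items() if keyword in text)

candidate_a = "Python developer, project management experience, state university"
candidate_b = "Python developer, Ivy League graduate, lacrosse captain"

print(screening_score(candidate_a))  # 3.5
print(screening_score(candidate_b))  # 4.0 -- boosted by proxies, not skills
```

In this toy example, the candidate who matches the “pattern” of past hires outscores an equally qualified candidate, which is exactly the kind of replication of historical bias Dobbins warns about.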
Frustrated with the bias she witnessed in hiring, Dobbins founded Career.Place with the goal of reducing unconscious bias by leveraging anonymity. Described as “a hiring solution for the modern employer and the empowered workforce alike,” Career.Place allows employers to bypass resumes until after they’ve determined a candidate has the traits and capabilities they need most. “It’s a very simple concept,” Dobbins tells us. “If you have anonymity, you are now considering more objectively what makes a good hire and spending your time there. If it’s not necessary to know for evaluating the candidate for the job, then you don’t know it.”
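The anonymity concept can be sketched roughly as withholding identifying fields until after a capability screen. The field names and workflow below are assumptions for illustration, not Career.Place’s actual data model or API.

```python
# Rough sketch of anonymity-first screening (hypothetical fields and workflow).

IDENTIFYING_FIELDS = {"name", "photo_url", "address", "school", "graduation_year"}

def anonymize(candidate: dict) -> dict:
    """Return only the job-relevant fields for the evaluation stage."""
    return {k: v for k, v in candidate.items() if k not in IDENTIFYING_FIELDS}

candidate = {
    "name": "Jordan Smith",
    "school": "State University",
    "graduation_year": 2008,
    "skills_assessment": 87,          # score from a work-sample exercise
    "required_certifications": True,
}

# Evaluators see capabilities first; identity is revealed only after this gate.
print(anonymize(candidate))
# {'skills_assessment': 87, 'required_certifications': True}
```

Only candidates who clear the capability gate have their identifying details revealed, which keeps evaluators’ attention on what actually predicts a good hire.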
“Inequity is the result of decades and decades of decisions,” she continues. “So to say it’s simply the fault of the company, or an algorithm, is taking a small view of a much larger challenge.”
The bottom line: Algorithms in the workplace are here to stay, and it’s up to us to ensure they’re inclusive
Used properly, algorithms do have a role in the workplace: they can help a company identify patterns, monitor metrics, and, yes, automate tasks. But the human element, along with an understanding of algorithms’ limitations, is vital to making progress on DEI.
“Technology is a tool, an augmentation to existing processes. So first and foremost, you have to have decent processes,” Dobbins says. “Technology is going to find patterns you never knew existed, but humans need to make sure those patterns matter.”
Integrating algorithms can also shine a light on organizational and institutional blind spots. Rentschler of BMC Software acknowledges that digital transformation needs to have a deliberate impact on DEI initiatives. For example, “organizations can pre-check job postings using gender decoders to identify subtle linguistic gender-coding that turns away talented applicants,” she says.
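A gender decoder of the kind Rentschler describes can be sketched in a few lines. The word lists below are a tiny, hypothetical sample; real decoders draw on much longer, research-based vocabularies.

```python
# Illustrative gender-decoder sketch with a hypothetical sample of coded words.

MASCULINE_CODED = {"ninja", "rockstar", "dominant", "competitive", "fearless"}
FEMININE_CODED = {"supportive", "collaborative", "nurturing", "interpersonal"}

def decode_posting(text: str) -> dict:
    """Flag gender-coded words in a job posting."""
    words = {w.strip(".,!?").lower() for w in text.split()}
    return {
        "masculine_coded": sorted(words & MASCULINE_CODED),
        "feminine_coded": sorted(words & FEMININE_CODED),
    }

posting = "We need a competitive, fearless rockstar to join our collaborative team."
print(decode_posting(posting))
# {'masculine_coded': ['competitive', 'fearless', 'rockstar'],
#  'feminine_coded': ['collaborative']}
```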
In the end, what an algorithm does is nothing more than a reflection — or amplification — of corporate culture or society at large.
Algorithms in the workplace are undoubtedly here to stay. But the human component is integral to ensuring algorithms empower, rather than exclude, diverse workers. That human element will be the key as companies look to meet their ambitious DEI goals and finally begin to turn the tide on the social legacy of decades of discrimination.
This article series is sponsored by BMC Software and produced by the TriplePundit editorial team.
TriplePundit editors offer news and insights on sustainable business.