Perspective | Magazine

To land some jobs now, you have to impress an algorithm first

Software is screening resumes and analyzing video interviews. But what if you want to be treated like a person?


I’ve been pursuing a master’s degree for the last couple of years, and I’m worried about hitting the job market, mostly because job searches seem to have changed since I went back to school.

Companies are screening candidates by using automated video interviews, parsed by algorithms. One of my classmates had an interview with a beauty company last spring. She didn’t go to its offices or talk on the phone with someone — instead, she sat alone in a library study room. Questions flashed across her laptop screen, each giving her a few minutes to speak her answers to the webcam. The company recorded her responses, to be evaluated later. Another classmate had a similar experience with an educational institution. This trend is set to spread: Almost half of respondents to a 2019 Deloitte global survey of executives and workers expect technology’s role in interviewing to increase over the next three years.


Employers say technology lets them access a wider pool of candidates, speeds the hiring process, shrinks time employees spend interviewing, and saves money. Video interviews save time for job candidates, too, says Sean Rogerson, managing director of the Boston office for employment agency Michael Page.

But my classmates say they found these interactions awkward, devoid of the normal back-and-forth with the interviewer. Neither made it to the next round, perhaps no surprise, since recruitment software can now identify mood based on how your voice sounds. In fact, these programs can look at your video interview and identify your education level, whether you’re lying, and your cognitive abilities, according to a Deloitte report. HireVue, one such video interviewing platform (customers include the Red Sox, Ocean Spray, and Dunkin’ Brands), uses predictive analytics to assess and filter job candidates on their vocabulary, intonation, and body language, including facial expressions. It can also compare applicants with the “traits of top performers.”


I fear that after a stilted, one-sided chat with a camera, my résumé will be automatically relegated to the “do not hire” database.

I worry about what gets lost without human contact. The majority of the impact in an interview comes from nonverbal messages, says Patti Wood, an expert in nonverbal communication. She says nonverbal cues have 4.3 times the impact of words. In video interviews, some cues get lost, she explains, so I’m not wrong to fret about my camera presence. Wood says people tend to be more spontaneous, less in performance mode, and more in the moment in natural face-to-face interactions.

Of course, I’ll have to get my résumé past the screening algorithm to get to the video assessment algorithm. I could be the perfect candidate, but if I don’t use the keywords the system is programmed to find, Rogerson says, I could wind up in the reject pile. Even before that, though, I have to get past the algorithm deciding which job listings I see.

A 2015 study found that women received fewer ads for high-paying jobs than men did. Another study found that because younger women are a valuable demographic to online advertisers, it costs more to show them ads, so algorithms on major digital platforms — optimizing for cost effectiveness — showed women fewer job ads in science, technology, engineering, and math. Just last year, Amazon reportedly scrapped an AI recruitment tool that was biased against women despite efforts to fix it.


Maybe the algorithms will get better, but bias isn’t easy to fix, because algorithms don’t write themselves — humans do. Machine prejudices reflect human prejudices. Upturn, a nonprofit in Washington, says that without active measures to mitigate bias, these tools will be biased by default. It has called on companies and vendors to be transparent about their tools, to allow them to be independently audited, and to take steps to remove bias from them.

Automated interviews can perpetuate bias, too. Even if the algorithm works well, the playing field for job candidates isn’t always level, because not everyone has the same access to technology, audio-visual quality, and quiet conditions conducive to video interviews.

And then there’s the legitimate concern about what happens to applicant interviews once they are recorded. Are companies keeping this sensitive data confidential and protecting it securely? Do they delete candidate files when the job is filled? Or do the interview and assessment become just another piece of data in a person’s profile, at risk of being sold to marketers and data brokers, packaged and divulged, and perhaps even finding its way onto the Internet like other personal information?

These new approaches may backfire on companies, because we job candidates are also assessing them. And we generally prefer dealing with people. In 2017, 76 percent of Americans told one Pew Research Center survey that they wouldn’t want to apply for a job that used a computer program to select applicants. Another Pew survey found that two-thirds of respondents deem automated video analysis of interviews unacceptable, and 57 percent feel the same way about automated résumé screening.


I wonder if I’ll even have a choice. I know if I’m invited to do a video interview conducted by a program, I’m going to ask if a face-to-face chat is possible.

Cindy Govender is a freelance writer. Send comments to magazine@globe.com.