Using voice and face recognition software, HireVue lets employers compare a candidate’s word choice, tone, and facial movements with the body language and vocabularies of their best hires. The algorithm can analyze all of these candidates’ responses and rank them, so that recruiters can spend more time looking at the top-performing answers.
Each candidate answers the same questions, so it's a standardized interview, which is supposed to make things better because human beings have biases and make judgments and stuff.
[Image: "We're only hiring guys named Dave"]
The app is discussed briefly in this piece entitled "New App Scans Your Face and Tells Companies Whether You're Worth Hiring."
HireVue claims to have completed four million interviews already while working with over 600 companies (including Nike, Tiffany, and Honeywell). They also offer an assessments service ("to identify best-fit talent without the painful experience of traditional assessment"), and they can do "structured video coaching that reveals team readiness in real time." Just the thing for the harried "Talent Acquisition Leader" who finds it too stressful to exercise some professional judgment. And for applicants, HireVue even offers some YouTube interview tips.
This is several types of creepy, though it could certainly cut down on the wear and tear and travel of interviewing. But Monica Torres at Ladders cuts pretty quickly to the problem-- the notion that computer software is somehow free of human biases. Software is written by human beings-- and this software uses the hiring your institution has already done as its baseline. And once again-- this is not artificial intelligence-- it's just a complex algorithm.
In other words, the algorithm is only as objective as the human minds that guide it. So if the employer’s ideal candidate is already biased against certain characteristics, HireVue’s platform would only embed these biases further, potentially making discriminatory practices a part of the process. Human recruiters would need to recognize their own personal biases before they could stop feeding them into HireVue. It’s one more reminder that behind each robot lies a human who engineered it.
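To make that feedback loop concrete, here's a minimal sketch (with entirely made-up numbers and a hypothetical "communication style" feature -- not HireVue's actual data or model) of how an algorithm trained on past hiring decisions simply learns to replay them:

```python
# Hypothetical historical hiring data: each past candidate has a
# "communication style" feature (0 or 1) and a hired/not-hired label.
# Suppose past recruiters favored style 0 -- not because it predicts
# job performance, but out of habit or bias.
past_candidates = [(0, True)] * 80 + [(1, True)] * 20 \
                + [(0, False)] * 20 + [(1, False)] * 80

def hire_rate(style):
    """A naive 'algorithm': score each style by its historical hire rate."""
    outcomes = [hired for s, hired in past_candidates if s == style]
    return sum(outcomes) / len(outcomes)

# The model faithfully reproduces the bias baked into its training data:
# candidates who resemble past hires score four times higher.
print(hire_rate(0))  # 0.8
print(hire_rate(1))  # 0.2
```

A real system is far more elaborate, but the principle is the same: if the training labels came from biased decisions, the "objective" scores are just those decisions averaged and automated.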
None of this is directly linked to education-- yet. But in a world where test manufacturing companies are already promising they can kind of read test-takers' minds and other companies are promising that they can have your on-line course watch your every move and response, this is just one more indication of how far this trend of algorithmic displacement of human judgment can go. And never forget-- whenever the computer is watching it and measuring it, the computer is also storing it.
Could HireVue be tweaked so that it matches the facial movement and body language of students with those of students who were deemed "successful"? Sure. In fact, it seems entirely possible that an algorithm built to track body language and facial expression could also easily, and quietly, count skin color or gender characteristics. But it's a computer, so of course it's all facts and data and science-- not just a quick and efficient way to legitimize the bad and biased judgment of the individuals behind the screen. Remember to keep your eyes peeled for this kind of tech, because it already has its eyes peeled for you.