Advice for Employers and Recruiters
How do we know that a resume that matches a job ad well means the candidate is well-qualified?
There’s an expression that applies to so many situations: garbage in, garbage out. It applies well to an outdated HR practice used by almost all employers: assuming that candidates are more likely to be highly productive if their resumes closely match the job posting to which they applied. What employers have been discovering over the past couple of years, and the problem is only getting worse, is that this simply isn’t the case.
The rise of AI-assisted job applications has greatly reduced the cost to the candidate of applying to jobs. Decades ago, applicants had to pay to photocopy their resumes and then mail or fax them. There were real, monetary costs to applying, so few candidates applied to more than a dozen or two jobs. Then job boards and other online applications made applying far cheaper, so candidates started to apply to dozens and, sometimes, a couple of hundred. Now, many candidates apply to hundreds and, sometimes, thousands. Employers not only have to plow through more applications, which the larger ones have automated through ranking / matching / scoring software; they’re also finding it harder and harder to identify the best-qualified applicants because the applications look more and more alike. The ranking software is struggling partly for that reason, and partly because it was never very accurate in the first place.
Employers and their HR vendors don’t want to admit it, but there’s little to no data to support the widespread and largely accepted belief that a resume that ranks higher than another in matching the job ad means the candidate is likely to perform better. And here’s the problem: no one ever hires the candidates whose resumes don’t rank well, so there’s no way of determining whether that belief is actually correct. It’s a little like taking a vitamin because it seems to make sense for your health: if there’s no scientifically validated study measuring the vitamin’s effectiveness, how do you know it actually does what it claims? It could be ineffective or even harmful. And if the vitamin does as much good as the marketers say, why isn’t there a study proving it? Wouldn’t they want that to make their sales process easier?