ARTICLES, BLOGS & VIDEOS

The latest news, trends and information to help you with your recruiting efforts.

Posted October 17, 2016

The hidden problem with big data

HR has long measured recruitment success. Now, in the age of “big data”, we are generating far more to measure. One assumed benefit of analyzing big data is that more information will lead to better decision-making and reduce the stubborn subjectivity that comes with using human brains.

Right?

We should be cautious about assuming that human bias will disappear just because we have more analytical tools at hand. In fact, big data can expose our bias and force us to walk the walk. Once you track all those numbers, unconscious bias and unintended discrimination may emerge into plain sight. Ultimately, this accountability is a great step forward for recruitment; you'll just want to make sure your company is ready to respond. Here are three areas where it's wise to examine your data practices.

Scraping personal data from online sources. It wouldn't be too hard to discover a candidate's race or sexual orientation, given how many personal traces we all leave on the Internet. We'd love to assume those factors make no difference, but too many studies have shown otherwise. Some minority job applicants have even resorted to “whitening” their resumes. A study published this year showed that minority applicants were more successful when they deleted information from their resumes that hinted at their race, for example, that they attended a Historically Black College or belonged to a Hispanic professional association.

Keyword searching. Keyword searching can be a great way to sort quality candidates out of thousands of real or potential applicants. However, employers must “apply the same rigor that they would use when creating job advertisements. For example, avoid any terms that could be considered directly or indirectly discriminatory (e.g., ‘recent graduate,’ ‘highly experienced’).”
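One way to apply that rigor consistently is to run your search terms through the same review you'd give a job ad. Here's a minimal sketch of that idea; the flagged-phrase list is purely illustrative (a real one should come from your legal and HR teams), and the function names are hypothetical:

```python
# Hypothetical, illustrative list of phrases that could be directly or
# indirectly discriminatory in a candidate search (e.g. age-coded terms).
FLAGGED_PHRASES = {
    "recent graduate": "may indirectly screen out older applicants",
    "highly experienced": "may indirectly screen out younger applicants",
}

def review_search_terms(terms):
    """Return {term: reason} for each search term matching a flagged phrase."""
    findings = {}
    for term in terms:
        lowered = term.lower()
        for phrase, reason in FLAGGED_PHRASES.items():
            if phrase in lowered:
                findings[term] = reason
    return findings
```

For example, `review_search_terms(["Python developer", "recent graduate"])` would pass the first term and flag the second. The point is not the code itself but the habit: screen your search queries with the same checklist you use for ad copy.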

Hiring tests. Many companies give candidates a test at the interview stage to help them make decisions based on quantitative data. It sounds great, and it can be, if the test is administered fairly. If you use these tests, you must give them to all applicants. And you must—gasp!—actually pay attention to the data. For example, it wouldn't be fair to give the test only to minority candidates (this has happened), or to ignore White candidates' bad test results (this has happened too).
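“Paying attention to the data” can be as simple as comparing pass rates across groups. A common rule of thumb for this is the four-fifths rule: if one group's selection rate falls below 80% of the highest group's rate, that's a signal of possible adverse impact worth investigating. Here's a minimal sketch, with hypothetical group labels and numbers:

```python
def selection_rates(results):
    """results: {group: (passed, tested)} -> {group: pass rate}."""
    return {g: passed / tested for g, (passed, tested) in results.items()}

def four_fifths_check(results, threshold=0.8):
    """Flag groups whose pass rate is below `threshold` of the top group's rate.

    Returns {group: True} if the group clears the threshold, False if it
    falls below it (a signal to investigate, not proof of discrimination).
    """
    rates = selection_rates(results)
    top = max(rates.values())
    return {g: rate / top >= threshold for g, rate in rates.items()}
```

With hypothetical results like `{"group_a": (40, 100), "group_b": (25, 100)}`, group_b's 25% pass rate is only 62.5% of group_a's 40%, so it would be flagged for a closer look.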

Big data can be used to make good hires. Just don't forget to be honest with yourself. If you analyze a big pool of data to select qualified candidates, and they all end up being of one race and one gender, that is a sign you may have accidentally inserted your own bias. Go back through the steps in your process and ask yourself, “Are my words or actions appealing only to certain demographics?” (This recruiting tech company uses its own big data to help you examine the wording in your job postings, for example.) As Stephen Dubner, co-author of Freakonomics, puts it:
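A quick way to do that honesty check is to compare the demographic mix of the candidates your process selected against the mix of the applicant pool it started from. Here's a minimal sketch of that comparison; all labels, data, and function names are hypothetical:

```python
from collections import Counter

def mix(labels):
    """Each label's share of the list, e.g. ['A','A','B'] -> {'A': 2/3, 'B': 1/3}."""
    counts = Counter(labels)
    total = len(labels)
    return {label: count / total for label, count in counts.items()}

def mix_gap(pool, selected):
    """Per-group gap: selected share minus applicant-pool share.

    Large positive or negative gaps are the kind of skew the article
    describes, and a prompt to re-examine the steps in your process.
    """
    pool_mix, selected_mix = mix(pool), mix(selected)
    return {g: round(selected_mix.get(g, 0.0) - pool_mix.get(g, 0.0), 3)
            for g in pool_mix}
```

If a group makes up 40% of your applicant pool but only 10% of your selected candidates, that 30-point gap doesn't prove bias on its own, but it is exactly the signal that should send you back through your process.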

We believe that if you get a pile of data representing a million decisions, that that’s better than asking three people what decisions they made. While I very much believe that to be true, and I very much applaud all the instincts for all of us to work with data in aggregate to distill the biggest truths, I also know that we’re humans and that … we’re biased in a lot of ways.