Oct. 14 — The Equal Employment Opportunity Commission someday may issue guidance on how employers can best use “big data” to build and sustain a productive workforce, but for now, the agency’s in a learning phase.
At an Oct. 13 commission meeting, EEOC Chair Jenny Yang (D) said the agency needs to know more about the computer-generated algorithms, data “scraping” and other online tools that an increasing number of employers are using to recruit, select and evaluate applicants and employees.
Some witnesses, including a management lawyer, said use of big data expands the job applicant pool and can reduce or even eliminate unlawful bias in employment decisions. But others warned the EEOC that big data techniques being imported from marketing into the employment arena could have discriminatory effects and infringe on individuals’ privacy interests.
Big data is “here to stay,” so the EEOC has a responsibility to understand it, Commissioner Chai Feldblum (D) said.
The agency should do what it can to enhance big data’s potential beneficial effects and to minimize its negative ones, she said.
Determining how the federal anti-discrimination laws “apply in our increasingly technology-driven workplaces” is at the “core” of the EEOC’s responsibilities, said Commissioner Victoria Lipnic (R), who played a leading role in organizing the meeting.
It’s “critical” that big data tools “are designed to promote fairness and opportunity” so employers’ reliance on expanding sources of data “doesn’t create new barriers to opportunity,” Yang said.
Employers are turning to big data to assist in recruitment, hiring and other employment decisions because it’s proven successful in other parts of their businesses, said Marko Mrkonich, a management lawyer with Littler Mendelson PC in Minneapolis.
A recent survey of 279 Society for Human Resource Management members found that about one-third of their organizations are using big data in employment, with the proportion higher among larger employers, said Eric Dunleavy, director of personnel selection and litigation support services at DCI Consulting in Washington, which advises federal contractors and other employers.
Dunleavy testified on SHRM’s behalf.
Algorithms built on "data points" that correlate the characteristics of successful current employees with those of actual applicants, as well as "passive" candidates found online, can improve diversity by surfacing minority, female or disabled individuals who would otherwise be overlooked, Mrkonich said.
If the computer model is properly designed and implemented, it also can reduce or eliminate a human decision maker’s potential bias, said Michal Kosinski, an organizational behavior professor at Stanford University’s Graduate School of Business.
The promise is that a neutral model, applying objective criteria to scan the internet for individuals whose characteristics correlate with job success, will produce a broader candidate pool untainted by bias.
But one rub is that the models measure “correlation,” not “causation,” said Kathleen Lundquist, an organizational psychologist who is president and chief executive officer of APT Metrics Inc., which has its headquarters in Darien, Conn.
The algorithm therefore can't be validated as "job-related" because it doesn't actually measure the knowledge, skills and abilities necessary to do the job, Lundquist told the commission.
The models also combine thousands of data points, meaning potential job candidates aren’t necessarily being evaluated based on the same criteria, Lundquist said.
It’s also possible that using job performance reviews to identify successful current employees and put their common characteristics into the formula could perpetuate past discrimination, she said.
Commissioner Constance Barker (R) expressed concern that increased reliance on big data could disfavor individuals or groups that have less of a “digital footprint.” It could disadvantage those who have purposely kept a low profile on the internet or who lack access, she said.
That gets to the heart of the matter, said Ifeoma Ajunwa, a law professor at the University of the District of Columbia.
Employers’ use of big data can “privilege people who already have access” and leave less-privileged others “further in the margins,” Ajunwa told the EEOC.
But digital footprints aren’t confined to people active on the internet, Mrkonich said. Rather, anyone who banks, uses a credit card, drives a car or engages in almost any other transaction leaves a footprint, he said.
Employers that use big data still use traditional methods as well, such as job fairs, which allow them to reach individuals who might lack a robust online presence, Mrkonich said.
The third-party vendors of big data solutions in employment are pitching competitive success, reduced employee turnover and more productivity, he said.
They also are promoting “a more effective and fair model for employment decisions” less tainted by the biases of human decision makers, Mrkonich said.
It’s important that employers using big data models audit their results to ensure the techniques aren’t disproportionately excluding members of protected groups, Lundquist said.
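One common yardstick for the audit Lundquist describes is the "four-fifths rule" of thumb from the EEOC's Uniform Guidelines on Employee Selection Procedures: a selection rate for any group below 80 percent of the highest group's rate may indicate adverse impact. The sketch below is purely illustrative (the group names and numbers are hypothetical), not an EEOC tool:

```python
# Illustrative audit of a screening tool's outcomes using the
# "four-fifths rule" of thumb. All figures below are hypothetical.

def selection_rate(selected, applicants):
    """Fraction of a group's applicants who were selected."""
    return selected / applicants

def impact_ratio(focal_rate, highest_rate):
    """Focal group's rate relative to the highest group's rate;
    values below 0.8 flag potential adverse impact for review."""
    return focal_rate / highest_rate

# Hypothetical outcomes from an algorithmic screening step.
groups = {
    "group_a": {"applicants": 200, "selected": 60},   # 30% selected
    "group_b": {"applicants": 150, "selected": 27},   # 18% selected
}

rates = {g: selection_rate(d["selected"], d["applicants"])
         for g, d in groups.items()}
highest = max(rates.values())

for group, rate in rates.items():
    ratio = impact_ratio(rate, highest)
    flag = "REVIEW" if ratio < 0.8 else "ok"
    print(f"{group}: selection rate {rate:.0%}, impact ratio {ratio:.2f} -> {flag}")
```

Here group_b's 18 percent rate is only 0.60 of group_a's 30 percent rate, so it falls under the 0.8 threshold and would warrant review; a ratio below 0.8 is a screening signal, not by itself proof of unlawful discrimination.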
She also outlined a set of best practices for employers seeking to use big data tools responsibly.
Big data isn’t confined to algorithms used for finding, recruiting and potentially hiring job candidates, Ajunwa said.
Rather, it extends to current employees who are subject to workplace monitoring and even wearable tracking devices, she said.
Employers justify the use of such methods as a way to promote workplace efficiency and productivity, but they can raise privacy issues, Ajunwa said.
This category can include personal information collected by a third-party vendor as part of an employer-sponsored wellness program, she said.
The employer should have safeguards to ensure the vendor isn’t packaging and potentially selling the aggregate health-care data, Ajunwa said.
There’s potential for violations of the Americans with Disabilities Act and the Genetic Information Nondiscrimination Act as well as federal health privacy laws, she said.
“We shouldn’t envision a workplace where workers must surrender all privacy rights in exchange for a job,” Ajunwa said.
The computer scientists who develop the models and the employers who use them need to be educated about how federal anti-discrimination laws might apply, said Kelly Trindel, chief analyst in the EEOC’s Office of Research, Information and Planning.
Developers also need to be trained regarding the relevant demographics if the employer’s goals include outreach to previously underrepresented groups, Lundquist said.
The EEOC shouldn’t jump in before it better understands the potential and actual effects of big data in employment, Mrkonich said.
Given their comments, the commissioners appeared to agree that questions outpace the answers at this stage.
To contact the reporter on this story: Kevin McGowan in Washington at email@example.com
To contact the editor responsible for this story: Peggy Aulino at firstname.lastname@example.org
The witnesses’ written statements and biographies are available at https://www.eeoc.gov/eeoc/meetings/10-13-16/index.cfm.
Copyright © 2016 The Bureau of National Affairs, Inc. All Rights Reserved.