Ten U.S. senators have asked the U.S. Equal Employment Opportunity Commission (EEOC) to investigate employers’ use of artificial intelligence (AI), machine learning, and other hiring technologies that may result in discrimination. In a joint letter, they asked the EEOC to proactively analyze AI-based candidate evaluation tools to determine whether their use introduces discriminatory biases. These tools include:

  • Software used in the employee selection process to manage and screen candidates after they apply for a job
  • New modes of assessment, such as gamified assessments or video interviews that use machine-learning models to evaluate candidates
  • General intelligence or personality tests
  • AI-based applicant tracking systems

The senators asked the EEOC to indicate its authority to proactively investigate potentially discriminatory practices, and opined that the risk of bias in these technologies, combined with the significant unemployment gap between Black and Latino workers and their White counterparts, requires the EEOC to “conduct robust research and oversight of the industry and provide appropriate guidance.” Proactive analysis is requested because individual candidates typically cannot determine, let alone prove, that biased technology evaluated them unfairly.

A similar request was sent by members of the Senate to the EEOC in 2018 addressing the use of facial recognition technologies in the workplace. The senators expressed doubts about whether the use of such technologies would reduce human biases or “actually amplify those biases.” In addition to evaluation, the letter requests increased enforcement. nextSource encourages employers to evaluate their current and planned use of AI-based hiring and employment tools for potentially disparate impact on protected classes of workers. When purchasing AI-based applications, employers should require that the vendor articulate the rules being applied and indemnify the buyer against any future adverse judgments by the EEOC or another agency, or against litigation.
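As a rough illustration of the kind of disparate-impact check an employer might run on a hiring tool’s outcomes, the EEOC’s longstanding “four-fifths” rule of thumb flags any group whose selection rate falls below 80% of the highest group’s rate. The sketch below assumes hypothetical group names and applicant/hire counts; it is not a substitute for a legal or statistical adverse-impact analysis.

```python
# Minimal sketch of the EEOC "four-fifths" (80%) adverse-impact rule of thumb,
# applied to hypothetical applicant/hire counts per group.

def selection_rates(counts):
    """counts: {group: (applicants, hires)} -> {group: selection rate}"""
    return {g: hires / applicants for g, (applicants, hires) in counts.items()}

def adverse_impact_flags(counts, threshold=0.8):
    """Flag each group whose selection rate is below `threshold` times
    the highest group's rate (the four-fifths rule of thumb)."""
    rates = selection_rates(counts)
    top = max(rates.values())
    return {g: rate / top < threshold for g, rate in rates.items()}

# Hypothetical numbers, for illustration only.
counts = {
    "group_a": (200, 60),  # 30% selected
    "group_b": (150, 30),  # 20% selected; 0.20/0.30 is below the 0.8 threshold
}
flags = adverse_impact_flags(counts)
```

A flagged group does not by itself establish discrimination, but it is the kind of signal the senators’ letter suggests vendors and employers should be able to surface and explain.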