DOJ warns that misuse of algorithmic hiring tools could violate accessibility laws

AI tools for recruitment have become a popular category, but the Department of Justice warns that careless use of these tools could violate US laws protecting equal access for people with disabilities. If your company uses algorithmic ranking, face tracking, or other automated methods to sort and evaluate candidates, it may be worth taking a closer look at what those tools actually do.

The Equal Employment Opportunity Commission, which monitors and advises on industry trends and practices relating to its namesake issue, has published guidance explaining how companies can safely use algorithm-based tools without risking the systematic exclusion of people with disabilities.

“New technologies should not become new ways to discriminate. If employers are aware of the ways AI and other technologies can discriminate against persons with disabilities, they can take steps to prevent it,” said EEOC Chair Charlotte A. Burrows in a press release announcing the guidance.

The general thrust of the guidance is to think hard about whether these filters, tests, and metrics actually measure qualities or quantities relevant to performing the job, and to seek input from the groups they affect. Here are some examples:

  • A gamified assessment requires applicants to complete a test or task with visual elements in order to qualify. Unless the job itself involves a visual component, this unfairly screens out blind and visually impaired applicants.
  • A chatbot screener asks a poorly phrased or designed question, such as whether a person can stand for hours at a time, and disqualifies anyone who answers “no.” A person in a wheelchair could do many jobs that others do standing, just from a seated position.
  • An AI-powered resume analysis service ranks applications lower because of employment gaps, but those gaps may exist for reasons related to a disability or a condition.
  • An automated voice-based screener requires applicants to answer questions or work through test problems vocally. Naturally, this excludes people who are deaf or hard of hearing, as well as those with speech impairments. Unless the job involves a great deal of speech, this is improper.
  • A facial recognition algorithm evaluates someone’s emotions during a video interview. But the person is neurodivergent, or has facial paralysis from a stroke; their scores will be outliers.
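The chatbot example above boils down to a design choice in the screening logic. Here is a minimal, purely hypothetical sketch (the function names, question key, and routing labels are illustrative assumptions, not any real vendor’s code) contrasting a screener that hard-rejects on an ability question with one that only asks questions relevant to the job and routes edge cases to a human:

```python
# Hypothetical sketch of the chatbot-screener failure mode described above.
# All names and logic here are illustrative assumptions, not a real product.

def naive_screener(answers: dict) -> bool:
    """Auto-rejects anyone who says they cannot stand for long periods,
    regardless of whether standing matters for the job."""
    return answers.get("can_stand_for_hours", False)

def accommodating_screener(answers: dict, job_requires_standing: bool) -> str:
    """Only weighs abilities the job actually requires, and routes possible
    accommodation cases to human review instead of auto-rejecting."""
    if not job_requires_standing:
        return "advance"        # question is irrelevant to this job; ignore it
    if answers.get("can_stand_for_hours", False):
        return "advance"
    return "human_review"       # possible accommodation case; no auto-reject

applicant = {"can_stand_for_hours": False}  # e.g. a wheelchair user

print(naive_screener(applicant))                                        # False
print(accommodating_screener(applicant, job_requires_standing=False))   # advance
print(accommodating_screener(applicant, job_requires_standing=True))    # human_review
```

The point of the sketch is the routing decision: an automated process that cannot accommodate a candidate should defer to a person rather than silently disqualify.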

None of this means these tools or methods are inherently wrong or discriminatory in a way that violates the law. But companies that use them should be aware of their limitations and offer reasonable accommodations when an algorithm, machine learning model, or other automated process is unsuitable for a given candidate.

Part of that is having accessible alternatives, but another part is being transparent about the hiring process and declaring up front which skills will be tested and how. People with disabilities are the best judges of their own needs and of which accommodations, if any, to request.

If a company does not or cannot provide reasonable accommodations for these processes (including, for example, processes built and operated by third parties), it can be sued or held liable for that failure.

As usual, the earlier these kinds of issues are considered, the better. If your company has not already consulted an accessibility expert on matters like recruitment, website and app access, and internal tools and policies, now is a good time.

Meanwhile, you can read the DOJ’s full guidance here, a concise version aimed at workers who feel they may have been discriminated against here, and, for some reason, another truncated version of the guidance here.
