The manufacturing industry is accustomed to strategically integrating automation tools and technologies to address labor shortages and promote workplace safety and productivity. Manufacturers exploring potential uses of artificial intelligence (AI) in recruiting as a natural extension of their automation efforts should consider legal issues, including potential discrimination and bias inherent in such use.
AI in manufacturing may be able to measure cognitive impairment to help ensure safe and appropriate use of equipment, such as through fatigue monitoring and similar assessments, potentially avoiding the implicit bias that can arise when a manager performs the same evaluation. That said, not all AI systems are created equal, and AI itself is not inherently unbiased.
With or without AI, hiring decisions based on race, color, religion, sex, national origin, age, sexual orientation or gender identity, physical or mental disability, military status, genetic information, or any other protected status are prohibited by law. Executive Order 14110, issued October 30, 2023, recognizes the inherent potential for unlawful discrimination when AI is used in the hiring process. Office of Federal Contract Compliance Programs (OFCCP) tips for federal contractors highlight that AI is only as unbiased as the data it learns from and can therefore perpetuate inequalities already present in the hiring process and workplace. Equal Employment Opportunity Commission (EEOC) tips for employers on the use of AI in job selection procedures note that employers can assess an adverse impact on a particular protected group by verifying whether use of the procedure results in a selection rate for individuals in that group that is “significantly” lower than the selection rate for individuals in another group.
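To illustrate the kind of selection-rate comparison the EEOC guidance describes, the sketch below uses hypothetical applicant counts and the commonly cited four-fifths (80%) rule of thumb from the Uniform Guidelines on Employee Selection Procedures. The numbers and the 0.80 threshold are illustrative only; whether a difference is legally “significant” depends on the facts and applicable law.

```python
# Minimal sketch of a selection-rate comparison, using hypothetical applicant counts.
# The four-fifths (80%) rule shown here is a common screening benchmark, not a legal threshold.

def selection_rate(selected: int, applicants: int) -> float:
    """Share of applicants in a group who were selected."""
    return selected / applicants

def impact_ratio(group_rate: float, comparison_rate: float) -> float:
    """Ratio of one group's selection rate to the comparison group's rate."""
    return group_rate / comparison_rate

# Hypothetical results from an AI resume-screening tool.
group_a_rate = selection_rate(selected=48, applicants=100)  # 48%
group_b_rate = selection_rate(selected=30, applicants=100)  # 30%

ratio = impact_ratio(group_b_rate, group_a_rate)  # 0.625

print(f"Group A selection rate: {group_a_rate:.0%}")
print(f"Group B selection rate: {group_b_rate:.0%}")
print(f"Impact ratio: {ratio:.2f}")

# A ratio below 0.80 is a common flag that the procedure warrants closer review.
if ratio < 0.80:
    print("Flag: Group B's selection rate is significantly lower; review the selection procedure.")
```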
Using the seemingly innocuous parameter of significant gaps in an applicant’s work history to screen out resumes, for example, can result in the automatic disqualification of applicants who took leave for the birth of a child, which disproportionately affects female candidates. The result (the elimination of otherwise qualified female applicants) may be directly attributable to underdeveloped or poorly implemented AI systems, but manufacturers using these systems may be held liable for any discrimination or bias that results.
Likewise, because manufacturing operations often require a 24/7 work environment and flexible schedules, part of the hiring process may consider a candidate’s availability to work shifts, including nights and weekends. To avoid potential discrimination claims, AI screening should be calibrated so that it does not automatically disqualify applicants whose weekend unavailability is attributable to religious observance, including attendance at religious services.
Although the OFCCP guidance directly concerns federal contractors and the EEOC guidance is considered technical assistance, all manufacturers should take note of the concerns raised regarding the use and integrity of AI in the hiring process. Manufacturers should carefully review and evaluate the AI used. They should also:
- Find out what steps the vendor is taking (such as assessments and bias studies of the system) to determine whether use of the AI may result in a significantly lower selection rate for applicants in a legally protected category.
- Analyze and articulate the job-relatedness of the hiring selection procedure.
- Periodically review the use of AI to assess any inherent bias or discrimination.
Finally, manufacturers should consider state and local laws governing the use of AI in the hiring process, including in Illinois, Maryland, and New York City. Illinois has expanded its AI-related requirements by amending the Illinois Human Rights Act to prohibit the discriminatory use of AI (intentional or unintentional) in recruiting, hiring, and all other aspects of the employment relationship, effective January 1, 2026.
(Law graduate Madelyn Foster contributed to this article.)