AI & Its Intersection with Employment Law

Many of us are adapting to artificial intelligence taking on a more prevalent role in business and work life, and most of us are doing so at varying paces depending on industry, organization size, the types of tasks involved, and the potential for risk. Where AI intersects with employment law, here are some best practices and common pitfalls that can inform the path you chart.

Hiring and Disability Considerations

Employers might use AI to review and filter job applications or resumes. On the plus side, applications that do not meet the minimum qualifications for a position can be quickly filtered out, and candidates whose skills and experience best fit the role can be prioritized. On the downside, AI can introduce unknown and unintended biases into the screening process when it models the ideal candidate on personal characteristics shared by previously successful candidates rather than on the skills the position actually requires.

Employers might also use AI to aid in assessing interview performance. On the plus side, AI-powered interview platforms can assess candidate responses in video interviews, evaluating facial expressions, tone, and language to predict skills and likelihood of success. On the downside, AI may be more or less accurate when assessing facial expressions or voice tone across candidates of different races, or among candidates with speech differences, facial paralysis, autism, or anxiety disorders. By design, these tools generalize "the norm," and in doing so may penalize people with differences, even where those differences present no advantage or disadvantage in the role.

In addition to generalizing "the norm," AI systems might filter out candidates with disabilities before they have the opportunity to request a reasonable accommodation for the application process. Further, employers may not realize where AI tools intersect and conflict with disability-related obligations. An employer might appropriately make a reasonable accommodation for an applicant, such as by affording someone a longer response time for an assessment exam; but using an AI tool that does not recognize or account for the accommodation could then negatively interpret the candidate's slower performance on the exam.

Some of these risks can be mitigated by thoroughly vetting third-party software and services and understanding how they audit for bias, evaluate fairness, and describe the model’s logic and key features. Organizations can also develop their own internal bias audit systems. Users should be trained and supported in how to create system prompts that are effective in avoiding systemic biases. And processes can be established where employees involved in hiring and screening clearly track and document the data inputs, the system prompts, and the checks and balances used, in case the organization ever needs to demonstrate that it made best efforts to avoid and mitigate bias.
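One common starting point for an internal bias audit is the "four-fifths rule" from the EEOC's Uniform Guidelines, which flags a group whose selection rate falls below 80% of the highest group's rate. A minimal sketch of such a check (the group labels and counts here are hypothetical):

```python
def selection_rates(outcomes):
    """Selection rate per group: selected / total applicants."""
    return {g: selected / total for g, (selected, total) in outcomes.items()}

def adverse_impact_flags(outcomes, threshold=0.8):
    """Flag groups whose selection rate is below the four-fifths
    (80%) threshold relative to the highest-rate group."""
    rates = selection_rates(outcomes)
    top = max(rates.values())
    return {g: rate / top < threshold for g, rate in rates.items()}

# Hypothetical screening outcomes: group -> (selected, total applicants)
outcomes = {"group_a": (48, 100), "group_b": (30, 100)}
print(adverse_impact_flags(outcomes))
# → {'group_a': False, 'group_b': True}
```

A flag from a check like this is not itself a legal conclusion; it is a signal that the screening pipeline warrants closer human review and documentation.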

Performance Evaluations

In systematizing performance evaluations, AI can identify data-driven insights about employees, synthesizing a broader range of measures of productivity and success into the evaluation of each person's contribution. Where a non-AI-supported performance review process might consider how many customer calls or tickets an employee addressed, an AI-supported process might also synthesize task completion, the size and scope of those tasks, timelines for completion, customer satisfaction scores, and measures of collaboration with internal colleagues into a weighted scorecard. Incorporating data that might otherwise be impractical for an individual supervisor to gather can improve objectivity.
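The weighted scorecard described above can be sketched simply; the metric names and weights here are hypothetical, and in practice each metric would need to be normalized (here, to a 0-1 scale) before weighting:

```python
# Hypothetical metrics and weights; each metric is assumed
# to be pre-normalized to a 0-1 scale.
WEIGHTS = {
    "tickets_resolved": 0.30,
    "task_scope": 0.20,
    "on_time_rate": 0.20,
    "customer_satisfaction": 0.20,
    "collaboration": 0.10,
}

def weighted_score(metrics, weights=WEIGHTS):
    """Combine normalized metrics (0-1) into a single weighted score."""
    missing = set(weights) - set(metrics)
    if missing:
        raise ValueError(f"missing metrics: {missing}")
    return sum(metrics[m] * w for m, w in weights.items())

employee = {
    "tickets_resolved": 0.80,
    "task_scope": 0.60,
    "on_time_rate": 0.90,
    "customer_satisfaction": 0.75,
    "collaboration": 0.70,
}
print(round(weighted_score(employee), 3))
# → 0.76
```

The choice of metrics and weights is itself a judgment call with bias implications, which is one reason human review of the scorecard design matters as much as the arithmetic.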

Further, in evaluating how a manager is issuing their performance reviews, AI could be used to flag a pattern of potential bias in evaluations, such as certain demographic groups getting systematically lower ratings or certain managers showing consistent positive or negative scoring tendencies across certain positions or teams. This can give HR or senior management a tool with which to evaluate the managers and the efficacy of their review process.
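A simple version of this kind of flagging, assuming review data in the form of (manager, group, rating) records (all names and thresholds here are hypothetical):

```python
from collections import defaultdict
from statistics import mean

def rating_gaps(reviews, threshold=0.5):
    """Flag managers whose average rating differs across demographic
    groups by more than `threshold` points on the rating scale.
    `reviews` is a list of (manager, group, rating) tuples."""
    by_manager = defaultdict(lambda: defaultdict(list))
    for manager, group, rating in reviews:
        by_manager[manager][group].append(rating)
    flagged = {}
    for manager, groups in by_manager.items():
        averages = {g: mean(rs) for g, rs in groups.items()}
        gap = max(averages.values()) - min(averages.values())
        if gap > threshold:
            flagged[manager] = round(gap, 2)
    return flagged

# Hypothetical review data: (manager, group, rating out of 5)
reviews = [
    ("alice", "group_a", 4.5), ("alice", "group_a", 4.0),
    ("alice", "group_b", 3.0), ("alice", "group_b", 3.5),
    ("bob", "group_a", 4.0), ("bob", "group_b", 3.9),
]
print(rating_gaps(reviews))
# → {'alice': 1.0}
```

As with the hiring audit, a flagged gap is a prompt for HR to investigate context and sample size, not a verdict on any individual manager.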

AI can also be used to turn a completed performance review into a personalized development plan for each individual employee, recommending targeted training programs or specific career development skills or skill gaps that can be addressed to help get employees ready for future advancement.

At the same time, managers should use their judgment to confirm that such data is representative of the traits they want to emphasize, to verify that an individualized development plan is fair and balanced, and to ensure they have not abdicated their own responsibility to manage and understand their teams.

Workplace Investigations

AI might also be used to streamline workplace investigations, such as by pulling emails, summarizing information, or categorizing evidence for review. The investigator should still personally review the underlying data that bears on the fact-finding outcome, but can use AI to create helpful summaries or cluster certain documents together, so that the manual task of preparing a chronology of events or summary of an issue can be streamlined. These approaches make the information easier to review; they do not replace the human review itself.

Lower-risk uses of AI in the employment law setting include summarizing application information, collating data across multiple applicants, streamlining and templating email communications, improving accessibility for disabled applicants, and generally creating aids for human decision-makers rather than replacing human analysis and decision-making. Employers should also ensure that employees who use AI to aid employment-related decisions understand that they remain responsible for scrutinizing and understanding the information on which those decisions rest.

Shayda Le is a Partner at Barran Liebman LLP where she advises and represents employers on a wide range of employment issues. For questions, contact Shayda at 503-276-2193 or sle@barran.com.
