Executives who use AI for hiring must 'be vigilant' about reducing bias — or risk landing in hot water, says a labor attorney (2024)

  • New laws require executives to take responsibility for biases in the AI systems they use for hiring.
  • Employers need to stay up to date with the latest legal developments.
  • This article is part of "CXOAIPlaybook" — straight talk from business leaders on how they're testing and using AI.


Employment issues have long been a focus for Amanda Blair, an associate at the large national labor and employment law firm Fisher Phillips in New York City. Recently, she's had to also become an expert on artificial intelligence, data, and analytics.

The move was inspired by a New York City law, enacted in 2021 and in effect since 2023, that requires employers to perform an independent bias audit before using automated employment-decision tools, which incorporate AI, algorithms, and other automation technology to screen and evaluate applicants, and to notify job candidates and employees that the technology is being used.

While AI isn't new, it's becoming more consumer-facing, Blair said.

"It's new for workers. It's new for employers," she added. "That interests me — a new area to explore where the law is going and how it's going to be used in the workplace."

More companies are using AI and automation in hiring and recruitment, which may lead to bias and discrimination.

Business Insider spoke with Blair about the technology and emerging antibias laws and regulations.

The following has been edited for clarity and length.

What should executives know about bias as they incorporate AI hiring tools?

We have laws on the books that address discrimination. Using an algorithm or automated tool doesn't exempt you from those laws. Companies need to know where their data is coming from, what their tool does, and why they're using it.

Do you have data that will actually get you the outcome that you're seeking? Is the tool assisting you in getting the best person for the position without discrimination? What's the source of the tool's training data? For example, if your tool was created using a population of 100 white men, it's not going to be the right one to hire in a city with a majority-minority population.
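Blair's point about skewed training data is what an independent bias audit is meant to surface. As a loose illustration only (not legal guidance, and not the exact methodology any law prescribes), here is a minimal sketch of the "four-fifths rule," a common screen for adverse impact that compares selection rates across demographic groups; the group names and numbers below are invented:

```python
# Hypothetical sketch of one check a bias audit might run: compare
# each group's selection rate to the highest group's rate. Under the
# four-fifths rule, a ratio below 0.8 is a common red flag.
# All group names and counts here are made up for illustration.

def selection_rate(selected: int, applicants: int) -> float:
    """Fraction of applicants the screening tool advanced."""
    return selected / applicants

def impact_ratios(groups: dict[str, tuple[int, int]]) -> dict[str, float]:
    """Ratio of each group's selection rate to the highest rate."""
    rates = {g: selection_rate(s, a) for g, (s, a) in groups.items()}
    top = max(rates.values())
    return {g: r / top for g, r in rates.items()}

# Illustrative applicant pools: (selected, total applicants)
groups = {"group_a": (48, 80), "group_b": (24, 60)}
for g, ratio in impact_ratios(groups).items():
    flag = "review" if ratio < 0.8 else "ok"
    print(f"{g}: impact ratio {ratio:.2f} ({flag})")
```

Here group_a advances 60% of applicants and group_b only 40%, so group_b's impact ratio is about 0.67, below the 0.8 threshold, which is the kind of disparity an auditor would flag for review.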

You also need oversight. We're not at that point where people are just running processes with AI and not looking at them. Hopefully, it stays that way. As people become more comfortable with these tools, that's a concern. You must be vigilant about how you use the technology.

What would you like to see future AI-related antibias laws include?

We have to get everyone up to speed. Not everyone is a mathematician or engineer or knows how large language models work. The individuals using these tools need to know what they're doing so they're not violating any laws.

There need to be clear definitions; some are still too technical. What is an automated employment-decision tool? What is artificial intelligence? Clarity in any rules, guidance, and FAQs is key, because I think that's going to be one of the biggest barriers to enforcement. Ignorance of the law is not an excuse, but here, a lot of people are ignorant.

How can companies stay on top of the evolving legal landscape of AI in hiring?

AI is not one-size-fits-all. The biggest challenge is relying on a tool that doesn't fit your business. You should address any gaps in understanding of what your tool is doing and why you're using it. Start having conversations to make sure your team understands the role of AI and its implications.

Stay up to date at the city level and with your state legislature. You don't want to be caught off guard if a new law passes. Talk to legal counsel. Develop relationships with vendors that can serve as independent auditors, which New York City's law requires; requirements may vary by state.

AI is having an impact. The ability to assess the amount of data we have in society is already revolutionary for some companies and workers, and that will affect your business. Start to prepare and have those conversations so you're ahead of the game.
