HR Technology

Case Study: Colorado Passes Law to Regulate AI Use in Consequential Decision Making

In May 2024, the Colorado Legislature passed Senate Bill (SB) 24-205 to protect employees and consumers by prohibiting developers of high-risk artificial intelligence (AI) systems from engaging in “algorithmic discrimination” in consequential decision-making. This prohibition also applies to organizations that deploy these high-risk AI systems. Governor Jared Polis signed the bill into law on May 17, 2024.

Defining Terms

Algorithmic discrimination occurs when AI treats individuals differently in an unlawful manner based on their membership in a protected class, such as religion, sex, or race. A high-risk AI system is defined as any system that makes or plays a substantial role in making a consequential decision, which is defined as “a decision that has a material legal or similarly significant effect on the provision or denial to any consumer, or the cost or terms of” the following services:

  • Educational enrollment or opportunity;
  • Employment or employment opportunities;
  • Financial or lending service;
  • Essential government services;
  • Health care services;
  • Housing;
  • Insurance; and
  • Legal services.

However, an AI system is not considered high-risk if it is intended to perform narrow procedural tasks or track decision-making patterns or changes in those patterns. The statute also lists several technologies that do not qualify as high-risk AI systems—so long as those technologies don’t make or play a substantial role in making a consequential decision—including anti-malware, anti-virus, cybersecurity, networking, and firewall technologies.

Developers—including any organization that substantially modifies an AI system—must use reasonable care to prevent known or reasonably foreseeable risks of algorithmic discrimination arising from the use of high-risk AI systems. There’s a rebuttable presumption of compliance if a developer provides deployers with certain documentation, including:

  • A general statement of foreseeable risks and known harmful or inappropriate uses of the high-risk AI system;
  • Documentation disclosing a summary of the data used to train the AI and the known or reasonably foreseeable risks, limitations, and intended benefits of the high-risk AI system;
  • The purpose of the high-risk AI system;
  • Measures taken to mitigate the risk of algorithmic discrimination; and
  • Any other documentation necessary to assist deployers in complying with the statute.

Deployers of these technologies may take advantage of a similar rebuttable presumption if they demonstrate compliance with the following requirements, among others:

  • Implement a risk management policy and program to assess and regularly review the possibility of algorithmic discrimination;
  • Complete an impact assessment for the use of the high-risk AI system; and
  • Provide notice to all applicants, clients, or consumers that a high-risk AI system is being used to make, or play a substantial role in making, a consequential decision, along with the purpose of the system, the deployer’s contact information, and information regarding the consumer’s ability to opt out of the processing of personal data concerning the consumer for purposes of profiling in furtherance of decisions that produce legal or similarly significant effects.

Both developers and deployers must report the actual occurrence or likelihood of the occurrence of algorithmic discrimination to the Colorado Attorney General within 90 days of discovery. Only the Colorado Attorney General can file legal action against a developer or deployer under the statute. There is no private right of action under the statute.

Not all deployers are subject to the statute’s requirements. For example, organizations with fewer than 50 full-time employees are exempt if they disclose the intended use of AI to consumers, among other requirements that apply to both the deployer and the high-risk AI system.

Takeaway

The law will take effect on February 1, 2026. The Colorado Attorney General may promulgate rules under the statute, which may change or expand the obligations of developers and deployers imposed by the law.

We will continue to monitor for developments in this space. Please contact your regular lawyer or one of the listed authors for questions specific to your organization.

Kelley Rowan, Cody M. Barela, and Gregory J. Ramos are attorneys with Armstrong Teasdale, and can be reached at krowan@atllp.com, cbarela@atllp.com, and gramos@atllp.com.
