DENVER, CO — The House today passed legislation to establish Colorado’s regulatory framework on automated decision-making technology (ADMT) when it is used to make consequential decisions about an individual.
“AI is everywhere, and Colorado needs appropriate guardrails to prevent discrimination when AI is used to make consequential decisions,” said Assistant Majority Leader Jennifer Bacon, D-Denver. “If someone is denied housing, a job or health care at the hands of automated technology, they deserve to know what criteria went into that decision and to have an opportunity to correct mistakes. The AI task force helped us establish the right policy framework for Colorado that protects consumers, prioritizes transparency and does not stifle business growth.”
“Colorado is leading the way nationally with AI guardrails to ensure transparency, accountability and fairness,” said Majority Leader Monica Duran, D-Wheat Ridge. “We did not do this work alone; the AI task force brought together a group of stakeholders that laid the groundwork for this legislation. Together, we’re protecting Coloradans from harm and discrimination while fostering a strong business environment. Coloradans deserve to know when automated technology is being used, and they deserve a chance to correct inaccurate information. I’m immensely proud of the work that’s gone into this legislation.”
SB26-189 would update the regulatory framework on ADMT when such technology is used to make consequential decisions. SB26-189 passed the House today by a vote of 57-6.
ADMT is a technology that automatically processes personal data and generates an output used to make, guide, or assist a decision concerning an individual. “Consequential decisions” are defined in the bill as decisions that relate to an individual's access to, eligibility for, or compensation related to education, employment, housing, financial or lending services, insurance, healthcare services, or essential government services.
Protecting Coloradans
To protect consumers, SB26-189 would require deployers (entities that use an ADMT) to provide clear notice to consumers when they are interacting with an ADMT covered by the bill. If an ADMT makes a consequential decision that results in an adverse outcome for a consumer, the deployer would be required to provide the consumer with a plain-language description of the technology’s role in the decision and a process to request additional information about the decision within 30 days. In the case of an adverse outcome, consumers would have the right to request correction of factually inaccurate personal data and the right to request meaningful human review and reconsideration of the decision.
Implementing and enforcing the new framework
Beginning January 1, 2027, the bill would require ADMT developers to provide a deployer with a description of the technology’s intended uses, categories of data used to train the ADMT, known limitations and risks, and instructions for appropriate use and human review, as well as updates or modifications to the ADMT as they are made.
The legislation requires the Attorney General (AG) to adopt rules that clarify disclosure requirements after an adverse outcome by January 1, 2027. The AG would have exclusive authority to enforce the bill through the "Colorado Consumer Protection Act," and a violation of the bill would be deemed a deceptive trade practice. In the case of an alleged violation, the AG would be required to provide the developer or deployer with a 60-day notice and an opportunity to cure the violation within that timeframe, if a cure is deemed possible. The bill does not create a new private right of action for an individual.
Ensuring balanced responsibility and liability
Under the bill, both developers and deployers of an ADMT may be held liable for a violation of existing anti-discrimination law, including the Colorado Anti-Discrimination Act (CADA). It further specifies that fault for a violation of anti-discrimination law should be allocated based on the relative fault of the developer and the deployer.
The liability section of SB26-189 establishes that provisions in a contract between a developer and a deployer that indemnify against liability under CADA arising from either party’s actions are void. The bill is structured to ensure that developers and deployers bear responsibility and liability only for the intended use of an ADMT in a consequential decision.
In 2024, lawmakers passed first-of-its-kind legislation to implement consumer protections in interactions with high-risk artificial intelligence systems. Over the past six months, a task force convened by the governor met to develop and publish a new policy framework to strike a balance between the interests of businesses and consumers. SB26-189 would repeal the 2024 legislation and enact many of the recommendations developed by the task force.