Connecticut lawmakers have taken decisive action and proposed a new bill in the General Law Committee that would establish an Office of Artificial Intelligence and create a government task force to develop an AI Bill of Rights. These efforts are focused on regulating the use of AI by State Agencies.
Introduced by State Senator James Maroney, the Bill, SB 1103, would provide for government oversight, mandate the inventory and testing of state-used algorithms, close existing data privacy loopholes, and enumerate citizen protections through an AI Bill of Rights. The widespread use of algorithms in government operations is not unique to Connecticut, making it all the more important to develop safeguards that protect people from potential harm and help them understand the consequences of automated decision-making.
This article briefly describes the Bill and its key provisions, and highlights why governments must keep pace with AI regulation to ensure that automated systems do not adversely affect individuals.
From assigning students to schools, to allocating state resources, to setting bail, government authorities are increasingly using algorithms to automate key processes. In an attempt to regulate the use of AI by state agencies, Connecticut lawmakers have introduced SB 1103, An Act Concerning Artificial Intelligence, Automated Decision-Making and Personal Data Privacy. The Bill seeks to establish an Office of Artificial Intelligence, create a task force to develop an AI Bill of Rights, mandate the inventory and testing of automated systems used by state agencies, and close existing data privacy loopholes.
It also establishes the Automated Systems Procedures, which outline the protocols and processes for developing, procuring, and implementing automated decision systems or automated final decision systems. Finally, it sets out requirements for critical decisions, including notifying the affected individual of the algorithm used and providing a right to appeal. Critical decisions are those relating to education, employment, essential utilities, family planning, financial services, credit and mortgage services, healthcare, housing, legal services, government benefits, and public services.
Effective 1 July 2023, any "State agency" (department, board, commission, council, institution, office, constituent unit of the state system of higher education, technical education and career school, or other agency in the executive, legislative or judicial branch of state government) would be required to follow these procedures when developing, procuring, or implementing any automated decision system or automated final decision system.
The procedures must ensure that the automated systems comply with all applicable laws prohibiting discrimination and addressing privacy, civil rights, and civil liberties; do not disproportionately impact any individual or group based on any differentiating characteristic; and are safe, secure, and resilient.
SB 1103 requires the Office of Artificial Intelligence to review and inventory all automated systems developed, used, or procured by state agencies during the calendar year beginning 1 January 2023. The inventory will include the name of each automated system, a description of its capabilities, the data it uses, its purpose and intended use, how data is processed and stored, and the system's financial impact. The inventory must also disclose whether the system has discriminated against any individual or group of individuals in violation of state or federal law. The review will further determine whether the system infringed any legal rights of Connecticut residents or posed any risk to the state.
In addition, the Bill requires state agencies developing, utilizing, or procuring any AI system after 1 January 2024 to give the Office of Artificial Intelligence at least sixty days' advance written notice. Upon receiving notice, the Office will review the automated system to determine whether it would result in any discrimination or disproportionate impact prohibited by state or federal law and share the outcome with the relevant state agency. On or after 1 July 2025, the Office may also periodically re-evaluate automated systems to ensure compliance with the automated systems procedures.
If passed, SB 1103 would require the Artificial Intelligence Officer to prepare and submit a report to the joint standing committee of the General Assembly responsible for consumer protection. The report would include details on the automated systems procedures and any updates to them, recommendations for legislation, information on automated systems used by state agencies, and any other relevant information determined by the Artificial Intelligence Officer. The Bill would take effect 1 July 2023, with the first report due by 15 February 2025 and annually thereafter.
Connecticut's government has seen a rapid, largely unchecked spread of algorithm use. According to a white paper from Yale Law School, the state has struggled to hold these algorithms accountable. The Department of Children and Families (DCF) refused to provide the source code for an algorithm intended to reduce the risk of children experiencing a life-threatening episode, citing trade secret protection. Although the system had been in use for three years, the DCF had not evaluated it for efficacy or bias. The Department of Education (DOE) likewise refused to release the source code of an algorithm used to assign students to schools, and indicated that no effort had been made to assess it for efficacy or bias.
Despite support for SB 1103, critiques were raised at the public hearing and in written testimony, including doubts about whether the Bill goes beyond what is currently feasible, calls for standards to be established before legislation is implemented, and questions about whether a task force is necessary.
This challenge is not unique to Connecticut: without proper testing and ongoing evaluation, algorithms can malfunction or perpetuate historic biases reflected in their code or, in the case of machine-learning algorithms, embedded in the data used to train them. To promote transparency and accountability, state agencies must develop and procure automated systems that have been tested and validated to ensure they function as intended.
Taking steps early is the best way to get ahead of AI regulation. At Holistic AI, we have a team of experts who, informed by relevant policies, can help you keep track of and manage the risks of your AI. Reach out to us at we@holisticai.com to learn more about how we can help.
DISCLAIMER: This blog article is for informational purposes only. This blog article is not intended to, and does not, provide legal advice or a legal opinion. It is not a do-it-yourself guide to resolving legal issues or handling litigation. This blog article is not a substitute for experienced legal counsel and does not provide legal advice regarding any situation or employer.