
NYC's DCWP Adopts Final Rules on Local Law 144, Effective 5 July 2023

Published on
Apr 6, 2023

After much anticipation, New York City's Department of Consumer and Worker Protection (DCWP) has adopted its final rules on Local Law 144 (the Bias Audit Law) and will begin enforcement on 5 July 2023. Enforcement was originally scheduled for 1 January 2023 and was then postponed to 15 April 2023. The adopted rules respond to issues raised in the public hearing on the second version of the proposed rules: they clarify definitions and make the revised calculation of scores explicit. In this article, we outline each of the key changes.

Definition of machine learning, statistical modelling, data analytics, or artificial intelligence

In the original statute, an automated employment decision tool (AEDT) was defined as a computational process derived from machine learning, statistical modelling, data analytics, or artificial intelligence that produces a simplified output (such as a score, classification, or recommendation) used to aid or automate decision-making for employment decisions (for example, screening candidates for employment or promotion). In the adopted rules, however, the Department has modified the definition of “machine learning, statistical modelling, data analytics, or artificial intelligence” to expand its scope.

Under the adopted rules, the term is defined as mathematical, computer-based techniques that generate a prediction or classification based on skill sets or aptitudes, and for which a computer at least in part identifies the inputs, the relative importance of the inputs, and other parameters of the model in order to improve accuracy or predictions.

The requirement, added in the second version of the rules, that inputs and parameters be refined through cross-validation or by using training and testing data has been removed.

Missing data

The adopted rules add a requirement for independent auditors to indicate in the summary of results the number of individuals assessed by the AEDT that are not included in the impact ratio calculations due to missing sex/gender or race/ethnicity data.

Excluding small sample sizes (<2%)

The adopted rules also allow independent auditors to exclude categories that comprise less than 2% of the data, in line with the EEOC’s clarifications on the Uniform Guidelines. However, if a category is excluded due to a small sample size, the summary of results must include the independent auditor’s justification for the exclusion, the number of applicants in that category, and the scoring rate or selection rate of that category. We raised concerns about this exclusion in our public comment.

Specifying the number of applicants in each category

Another requirement added in the adopted rules is that the summary of results must include the number of applicants in a category for both regression systems and classification systems. In previous versions of the rules, the number of applicants was only shown for classification systems.
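To illustrate how these reporting requirements fit together, the sketch below computes selection rates and impact ratios per category while (i) counting individuals excluded for missing sex/gender or race/ethnicity data, (ii) excluding categories below the 2% threshold while still reporting their applicant counts and selection rates, and (iii) reporting the number of applicants in every category. This is a minimal illustration, not the rules' prescribed methodology; the data structure and function names are our own assumptions.

```python
from collections import Counter

def impact_ratios(records, min_share=0.02):
    """Summarise a selection-rate bias audit for one demographic attribute.

    records: list of (category, selected) pairs, where category is None
    when the demographic data is missing and selected is a bool.
    min_share: categories below this share of the categorised data may
    be excluded from impact ratio calculations (the rules' 2% threshold),
    but their size and selection rate must still be reported.
    """
    # Individuals with missing demographic data are counted but left out
    # of the impact ratio calculations, per the adopted rules.
    missing = sum(1 for cat, _ in records if cat is None)
    known = [(cat, sel) for cat, sel in records if cat is not None]

    totals = Counter(cat for cat, _ in known)
    selected = Counter(cat for cat, sel in known if sel)
    n = len(known)

    included, excluded = {}, {}
    for cat, count in totals.items():
        rate = selected[cat] / count
        entry = {"applicants": count, "selection_rate": rate}
        if count / n < min_share:
            # Excluded from impact ratios, but still reported in full.
            excluded[cat] = entry
        else:
            included[cat] = entry

    # Impact ratio: each category's selection rate divided by the rate
    # of the most-selected included category (assumed nonzero here).
    best = max(e["selection_rate"] for e in included.values())
    for entry in included.values():
        entry["impact_ratio"] = entry["selection_rate"] / best

    return {"missing_data": missing, "included": included, "excluded": excluded}
```

For example, with 50 applicants in one category selected at 50%, 50 in another selected at 40%, one applicant in a third category, and three individuals with missing data, the summary would report an impact ratio of 0.8 for the second category, exclude the third (under 2% of the data) with its count and rate stated, and note the three individuals omitted for missing data.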

Using historical or test data

The final rules specify the circumstances under which an employer or employment agency may utilise a bias audit of an AEDT that is based on either historical or test data. The updated guidelines mandate that historical data may only be employed if the employer or employment agency provides the independent auditor with historical data that was derived from their own use of the AEDT, or if the AEDT has never been used before. Moreover, test data may only be used if no pertinent historical data is available, and the bias audit's summary of results must explain why historical data was not utilised and provide details on how the test data was generated and obtained.

Clarifying the examples of bias audits

Given that small sample sizes can be excluded and there is now a requirement to specify the number of applicants in each category, the adopted rules provide an updated example of the bias audit to reflect these changes.

To find out more about how Holistic AI can help you with a bias audit, get in touch at we@holisticai.com.

Download our comments here

DISCLAIMER: This blog article is for informational purposes only. This blog article is not intended to, and does not, provide legal advice or a legal opinion. It is not a do-it-yourself guide to resolving legal issues or handling litigation. This blog article is not a substitute for experienced legal counsel and does not provide legal advice regarding any situation or employer.
