The New York City Council took decisive action to mandate bias audits of automated employment decision tools (AEDTs) used to evaluate employees for promotion or candidates for employment in New York City, signaling that the risks of Artificial Intelligence (AI) are an increasing regulatory concern. Local Law 144, also known as the NYC Bias Audit Law, is the first of its kind to codify independent, impartial bias audits in law.
As part of this regulation, employers (and employment agencies) are required to make a summary of the results of the bias audit publicly available on their website, increasing the transparency of these systems and allowing applicants to make more informed decisions about their interactions with them. Employers are also required to notify candidates and employees about the use of an AEDT, the characteristics it will consider, and the instructions for requesting an accommodation or an alternative selection procedure. In this blog post, we outline the protected characteristics that must be analyzed for the bias audit and what to do if you do not have this data.
Key takeaways:
The legislation states that bias audits must, at a minimum, test for disparate impact against the component 1 categories that employers are required to report under subsection (c) of section 2000e-8 of title 42 of the United States Code, as specified in part 1602.7 of title 29 of the Code of Federal Regulations. Rules proposed by the Department of Consumer and Worker Protection (DCWP) clarify this further, specifying that the audits must, at a minimum, cover each race/ethnicity and sex category that is required to be reported to the Equal Employment Opportunity Commission (EEOC); an updated version of the rules adds that intersectional analysis must be carried out in addition to standalone analysis.
Under the DCWP’s updated rules, race/ethnicity data must be divided into seven categories: Hispanic or Latino, White, Black or African American, Native Hawaiian or Other Pacific Islander, Asian, American Indian or Alaska Native, and Two or More Races. The Instruction Booklet on Component 1 Data Collection recently published by the EEOC defines each of these categories.
Under the DCWP’s proposed rules, sex data should be based on male/female classifications. However, the EEOC has recently expanded its classification of gender, adding two new categories: unspecified and another gender identity. This makes gender reporting more inclusive of, for example, non-binary individuals, who no longer have to restrict their self-reported gender to one of two categories. Thus, the sex/gender categories used for bias audits are male, female, and other.
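To make the required analysis concrete: the DCWP’s rules measure disparate impact using impact ratios, i.e. the selection rate for each category divided by the selection rate of the most-selected category. Below is a minimal sketch of what the standalone and intersectional calculations might look like. The column names (`race_ethnicity`, `sex`, `selected`) and the data itself are hypothetical stand-ins, and a real bias audit must be conducted by an independent auditor on the employer's actual data.

```python
import pandas as pd

# Hypothetical applicant-level data: one row per candidate assessed by the AEDT.
# Column names and values are illustrative only; real audit data comes from the
# employer's historical use of the tool (or permitted test data).
df = pd.DataFrame({
    "race_ethnicity": ["White", "White", "Black or African American",
                       "Hispanic or Latino", "Asian", "Asian"],
    "sex": ["Male", "Female", "Female", "Male", "Female", "Male"],
    "selected": [1, 0, 1, 0, 1, 1],  # 1 = advanced by the AEDT, 0 = not
})

def impact_ratios(data: pd.DataFrame, group_cols: list[str]) -> pd.Series:
    """Selection rate per category, divided by the most-selected category's rate."""
    rates = data.groupby(group_cols)["selected"].mean()
    return rates / rates.max()

# Standalone analyses, one per protected characteristic.
print(impact_ratios(df, ["race_ethnicity"]))
print(impact_ratios(df, ["sex"]))

# Intersectional analysis: every race/ethnicity x sex combination.
print(impact_ratios(df, ["race_ethnicity", "sex"]))
```

An impact ratio well below 1.0 for a category is precisely what the published summary of results must surface. Note that the rules also cover AEDTs that output scores rather than simple selections, where the calculation is based on scoring rates instead; that case is omitted from this sketch.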
Due to stringent data protection laws, such as the EU’s General Data Protection Regulation (GDPR) or France’s laws that prohibit employers from asking applicants for information related to protected attributes, some employers, employment agencies, and vendors may not have the data on protected characteristics required for the bias audit. The delayed enforcement date provides an opportunity to find a means of collecting this data.
For those unable to do so, under the revised rules, bias audits are now permitted to be conducted using test or synthetic data. The typical way to collect such data is to recruit participants who are assessed by the AEDT and provide their demographic information, with online panel sites offering a rapid way to gather responses. If test data is used for the bias audit, however, the summary of results must explain why historical, or real-life, data was not used and how the test data was collected.
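As an illustration of the test-data route, the sketch below assembles panel participants’ self-reported demographics and runs each participant through the tool being audited. The `assess_with_aedt` function and the `panel_responses.csv` layout are hypothetical stand-ins for whatever tool and collection process are actually used; as noted above, how the data was gathered must be documented in the published summary.

```python
import csv
import random

def assess_with_aedt(response: dict) -> int:
    """Stand-in for the AEDT being audited; a real audit calls the actual tool.
    Returns 1 if the candidate is selected/advanced, 0 otherwise."""
    return random.randint(0, 1)  # placeholder outcome for illustration

# panel_responses.csv: demographic self-reports collected from recruited
# participants (e.g. via an online panel site), one row per participant.
rows = []
with open("panel_responses.csv", newline="") as f:
    for response in csv.DictReader(f):
        rows.append({
            "race_ethnicity": response["race_ethnicity"],
            "sex": response["sex"],
            "selected": assess_with_aedt(response),
        })

# The resulting rows feed the same impact-ratio calculation shown earlier.
```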
Originally due to come into effect on 1 January 2023, the enforcement date for this legislation was pushed back to 15 April 2023 due to the large number of public comments received during the first public hearing on the DCWP’s proposed rules, particularly concerning the metrics that should be used to determine bias. Consequently, the DCWP revised the proposed rules and held a second hearing on 23 January 2023. Following this hearing, the final rules were adopted in April and the enforcement date was postponed to 5 July 2023.
This gives employers that have already had their systems audited time to collect additional data and carry out a further audit if they have made changes in light of the results of their initial audit, although the first audit will still be valid (for one year after the audit date) when the law comes into effect. For those who have not yet taken steps to procure an audit, the delayed enforcement date offers an opportunity to collect the data necessary to conduct one.
The delayed enforcement date and updates to the rules signal that the DCWP is taking public concerns about the implementation of this law seriously and will be vigilant once the law comes into effect. Preparing early is important to ensure that you have the data necessary to conduct a bias audit.
Unsure of where to start? Schedule a free consult with one of our experts.
DISCLAIMER: This blog article is for informational purposes only. This blog article is not intended to, and does not, provide legal advice or a legal opinion. It is not a do-it-yourself guide to resolving legal issues or handling litigation. This blog article is not a substitute for experienced legal counsel and does not provide legal advice regarding any situation or employer.