10 Things You Need to Know About the NYC Mandatory Bias Audits [July 2023]

July 10, 2023
Authored by
Airlie Hilliard
Senior Researcher at Holistic AI

To address concerns about the use of automated employment decision tools (AEDTs) in making employment decisions, particularly the risk of discriminatory outcomes, the New York City Council has taken decisive action and passed legislation that mandates bias audits of these tools. With the original legislation due to take effect on 1 January 2023, enforcement was initially delayed to 15 April 2023 due to the high volume of comments received during the first public hearing on the Department of Consumer and Worker Protection’s (DCWP’s) proposed rules. Following the proposal of a second version of the rules and a second public hearing, the DCWP released the final version of its adopted rules and announced a final enforcement date of 5 July 2023. At the end of June, a few business days before final enforcement of the rules, the DCWP published FAQs to provide additional clarification on the law, along with slides from roundtables it held at the end of May with business advocates/employers, civil/worker rights advocates, and industry stakeholders.

NYC Bias Audit FAQs - What you need to know

In this blog post, we outline the 10 key things you need to know about the NYC bias audit law and the proposed, updated, and adopted rules, as well as clarifications from the FAQs.


1. What is a bias audit?

A bias audit is an impartial evaluation of an automated employment decision tool, carried out by an independent auditor, that should include (but is not limited to) testing for disparate impact against protected characteristics (race/ethnicity and sex/gender at minimum). Employers must publish a summary of this audit on their website if they use automated employment decision tools to assess candidates residing in New York City and must inform those candidates of the key features of the tool before using it.

The first version of the proposed rules specifies that bias should be determined using impact ratios based on subgroup selection rate (the percentage of individuals in the subgroup who are hired or selected), subgroup average score, or both. Ratios are calculated by dividing the subgroup’s selection rate or average score by the selection rate or average score of the group with the highest rate or score:


Impact ratio = Selection rate for a category ÷ Selection rate of the most selected category

or

Impact ratio = Average score of individuals in a category ÷ Average score of individuals in the highest scoring category

However, the updated rules provide a revised calculation of impact ratios for AEDTs that produce a continuous score: scores are first converted to pass/fail depending on whether they fall above or below the median score of the sample, and the rate at which individuals in a category score above the median is termed the scoring rate:


Impact ratio = Scoring rate for a category ÷ Scoring rate for the highest scoring category
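
To make these calculations concrete, below is a minimal Python sketch of how the impact ratios might be computed, first from selection rates and then from scoring rates after binarizing continuous scores at the sample median. The data, category labels, and function names are illustrative assumptions only and are not prescribed by the DCWP rules.

# Illustrative sketch only: categories, numbers, and helper names are hypothetical.
from statistics import median

def impact_ratios(rates):
    """Divide each category's rate by the rate of the highest category."""
    top = max(rates.values())
    return {category: rate / top for category, rate in rates.items()}

# Selection-rate version: share of applicants in each category who were selected.
selection_rates = {"Male": 0.40, "Female": 0.30}
print(impact_ratios(selection_rates))  # {'Male': 1.0, 'Female': 0.75}

# Scoring-rate version for continuous scores: a candidate "passes" if their
# score is above the median score of the full sample.
scores_by_category = {"Male": [55, 62, 71, 48], "Female": [50, 66, 45, 58]}
all_scores = [s for group in scores_by_category.values() for s in group]
cutoff = median(all_scores)
scoring_rates = {
    category: sum(score > cutoff for score in group) / len(group)
    for category, group in scores_by_category.items()
}
print(impact_ratios(scoring_rates))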

The updated rules also clarify that when historical data (real-life hiring data) is deemed by the auditor to be insufficient to conduct a statistically significant bias audit, the audit can be conducted using test data, provided that the reason for doing so is included in the summary of results. Where historical data is available, it can be sourced from any employers or employment agencies that use the AEDT, if it is indeed used by multiple entities. However, an employer or employment agency may only rely on a bias audit based on historical data from other entities if it has provided its own historical data to the independent auditor or has never used the AEDT itself. This data does not need to be for the same type of position as the position for which the AEDT is used, and if such historical data is limited to a specific region or time period, the audit should explain why.

Further, the proposed rules specify that bias audits must be conducted in relation to the following categories:

  • Sex categories (male/female)
  • Race/Ethnicity categories (Hispanic or Latino/White/Black or African American/Native Hawaiian or Pacific Islander/Asian/Native American or Alaska Native/Two or more races)
  • Intersectional categories of sex and ethnicity or race

This demographic data cannot be inferred or computed for the purpose of the bias audit; it must be self-reported.
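
As a rough illustration of the intersectional analysis, the snippet below groups self-reported sex and race/ethnicity into combined categories and computes a selection rate for each. The applicant records shown are hypothetical and for demonstration only.

# Hypothetical example of forming intersectional categories from self-reported data.
applicants = [
    {"sex": "Female", "race_ethnicity": "Hispanic or Latino", "selected": True},
    {"sex": "Male", "race_ethnicity": "Asian", "selected": False},
    {"sex": "Female", "race_ethnicity": "Asian", "selected": True},
    {"sex": "Male", "race_ethnicity": "Asian", "selected": True},
]

by_intersection = {}
for applicant in applicants:
    key = (applicant["sex"], applicant["race_ethnicity"])  # e.g. ("Female", "Asian")
    by_intersection.setdefault(key, []).append(applicant["selected"])

# Selection rate for each intersectional category
intersectional_rates = {
    category: sum(outcomes) / len(outcomes)
    for category, outcomes in by_intersection.items()
}
print(intersectional_rates)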

The bias audit requirements outlined in the adopted rules are consistent with the second version of the proposed rules. However, a new addition to the adopted rules states that auditors can exclude groups that represent less than 2% of the sample from the analysis, provided they justify the exclusion and include the number of applicants in that category and the scoring rate or selection rate of that category in the results, as sketched below.
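
As a sketch of how that exclusion might be applied in practice, the hypothetical helper below separates categories that make up less than 2% of the sample from the main impact ratio comparison while still retaining their applicant counts and rates for the summary of results. The data and function name are illustrative assumptions, not part of the adopted rules.

# Hypothetical illustration of the small-group exclusion in the adopted rules.
def split_small_categories(counts, rates, threshold=0.02):
    """Separate categories below the 2% sample threshold from the main analysis.

    Excluded categories are still reported with their applicant count and
    selection/scoring rate, as the adopted rules require.
    """
    total = sum(counts.values())
    included, excluded = {}, {}
    for category, n in counts.items():
        bucket = included if n / total >= threshold else excluded
        bucket[category] = {"applicants": n, "rate": rates[category]}
    return included, excluded

counts = {"White": 500, "Asian": 120, "Native Hawaiian or Pacific Islander": 6}
rates = {"White": 0.42, "Asian": 0.38, "Native Hawaiian or Pacific Islander": 0.50}
included, excluded = split_small_categories(counts, rates)
# 6 of 626 applicants (about 1%) fall below the 2% threshold, so that category is
# excluded from the ratio comparison but its count and rate still appear in the summary.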

While Local Law 144 sets out bias audit requirements, it does not require any action to be taken if the audit indicates bias. However, it is important to note that federal, state, and New York City laws prohibiting discrimination based on protected characteristics still apply, and employers/employment agencies must comply with them. Therefore, the bias audit may have implications under other laws.

2. Who qualifies as an independent auditor?

While the initial legislation did not specify who is considered an independent auditor, the first version of the proposed rules clarified that an independent auditor is a person or group that was not involved in using or developing the AEDT. However, amid concerns that this could lead to employers or employment agencies conducting internal audits of their tools, with teams not involved in the use or development of the tool conducting the audit, the updated rules make it clear that audits should be conducted by a third party.

Giving a more comprehensive definition, the second and adopted versions of the rules state that an independent auditor should exercise objective and impartial judgement and that they are not considered to be independent if:

  • They were involved in using, developing, or distributing the AEDT
  • They have, during the audit, an employment relationship with the employer or employment agency that seeks to (continue to) use the AEDT or with a vendor that develops or distributes the AEDT
  • They have, during the audit, a direct or material indirect financial interest in an employer or employment agency that seeks to (continue to) use the AEDT or in a vendor that developed or distributed the AEDT.

3. What is an automated employment decision tool?

A computational process, derived from machine learning, statistical modelling, data analytics, or artificial intelligence, that produces a simplified output (a score, classification, or recommendation) used to aid or automate decision making for employment decisions (screening for promotion or employment).

The proposed rules clarify that machine learning, statistical modelling, data analytics, or artificial intelligence are a group of computer-based mathematical techniques that generate a prediction of a candidate’s fit, likelihood of success or classification based on skills/aptitude. The inputs, predictor importance, and parameters of the model are identified by a computer to improve model accuracy or performance and are refined through cross-validation or by using a train/test split. They also clarify that a simplified output includes ranking systems.

Mostly consistent with the second version of the rules, the adopted rules define machine learning, statistical modelling, data analytics, and artificial intelligence as mathematical, computer-based techniques that generate a prediction or classification based on skill sets or aptitudes, and for which a computer at least in part identifies the inputs, the relative importance of the inputs, and other parameters of the model in order to improve the accuracy of the prediction or classification. The requirement for these systems to be refined through cross-validation or using a train/test split has therefore been removed from the adopted rules.

The FAQs further clarify that AEDTs are computer-based tools that i) use machine learning, statistical modelling, data analytics, or artificial intelligence; ii) help employers and employment agencies make decisions at any point in the hiring or promotion process; and iii) substantially assist or replace discretionary decision making. Specifically, tools that use machine learning, statistical modelling, data analytics, or artificial intelligence generate a prediction or classification and identify the inputs, the relative importance of those inputs, and other parameters in order to improve the accuracy of the generated prediction or classification. Here, a prediction is an assessment of a candidate’s fit for the job or likelihood of success, and a classification is an assignment of an observation to a group, including categorisations based on skills or aptitude.

4. What are some examples of an automated employment decision tool?

Video interviews, game-based or image-based assessments, resume screening tools, and similar tools that are scored or evaluated by an algorithm. Systems that rank candidates on their suitability for a position or on how well they meet certain criteria are also considered automated employment decision tools.

5. What documentation do employers have to provide?

Employers using an automated employment decision tool must provide a summary of the results of a current bias audit (less than one year old) on their website, or on the website of the employment agency, before using the tool.

The first version of the proposed rules clarifies that this summary should appear in the careers or jobs section of the website in a clear and conspicuous manner and should include the date of the most recent bias audit of the AEDT, the distribution date of the AEDT to which the bias audit applies, and a summary of the results (including selection rates and impact ratios for all categories).

The updated rules further clarify that the summary of results should include the source and an explanation of the data used for the bias audit. The adopted rules additionally state that the summary of results must include the number of applicants in each category, for both regression and classification systems, as well as the number of individuals assessed by the AEDT who are not included in the impact ratio calculations because of missing sex/gender or race/ethnicity data.

6. What are the notification requirements of the legislation?

Candidates must be informed, at least ten business days before the tool is used, that an automated employment decision tool will be used to assess them and must be allowed to request an accommodation or alternative selection process. Candidates must also be informed of the characteristics the tool uses to make its judgments and, if this information is not available on the website of the employer or employment agency, of the source and type of data being used, within 30 days of a written request.

The first version of the proposed bias audit rules clarifies that the notice can be given by including it in a job posting or by sending it through U.S. mail or e-mail. For employees specifically, notice can also be given in a written policy or procedure provided to employees, while for candidates the notice can be included in the careers or jobs section of the employer’s website.

The updated rules specify that the notice must also include instructions for how candidates or employees can request accommodations or alternative selection processes under other laws, if available. They also specify that employers and employment agencies must provide information about their AEDT data retention policy in the employment section of their website, along with information about the type and source of data collected by the AEDT. Instructions on how to request such information should also be posted, and responses should be issued within 30 days, including an explanation if providing the information would violate local, state, or federal law or interfere with a law enforcement investigation.

The FAQs clarify that employers and employment agencies can begin using an AEDT 10 business days after posting notice on their website regardless of when the specific job listing was posted.


7. Who does the legislation apply to?

Employers using automated employment decision tools to evaluate candidates or employees who reside in New York City for a position or promotion.

The FAQs clarify that Local Law 144 applies only to employers and employment agencies using an AEDT in the city, meaning that the job is located in a NYC office (at least part time) or is fully remote but associated with a NYC office. The law also applies to employment agencies using AEDTs if they are located in NYC, or if they are located outside of NYC but one of the above conditions is met. It does not apply to those using an AEDT to scan a resume bank, reach out to potential candidates, or invite applications.

If the law applies to an employer or employment agency, they must conduct the bias audit before using the AEDT, and any job candidate who is a New York City resident must be given notice of the use of the tool, where a candidate is defined as a person who has applied for a specific position by submitting the necessary information or items in the required format.

It is important to note that while vendors can procure bias audits for their tools on behalf of their clients, it is ultimately the employer or employment agency using the AEDT that is responsible for compliance.

8. Are there penalties for noncompliance?

Up to $500 for the first violation and for each additional violation occurring on the same day; subsequent violations incur penalties of $500 to $1,500 each.

Complaints about violations can be made by calling 311 or visiting the DCWP website. Such complaints must include the details of the job posting or position, the name and type of AEDT, any notice provided, and an explanation of the suspected violations. While the DCWP will enforce complaints about bias audits or notification, claims of discrimination will be referred to the NYC Commission on Human Rights, which enforces the NYC Human Rights Law.

9. Does this affect the civil rights of candidates?

The subchapter should not be construed to limit the rights of any candidate or employee to bring a civil action in relation to an employment decision. Therefore, candidates’ civil rights are not affected, and other relevant equal employment laws must still be followed by the employer.

The proposed rules clarify that nothing in the legislation requires employers to comply with requests for alternative procedures or accommodations, but these practices may be covered by other legislation (e.g., Americans with Disabilities Act; ADA).

10. When does the legislation come into effect?

The law was initially due to come into effect on 1 January 2023 but was then postponed to 15 April 2023 before being postponed a final time to 5 July 2023. From then, the NYC Bias Audit law will make it unlawful for employers to use an automated employment decision tool without a bias audit to screen candidates or employees residing in New York City.

To find out more about how Holistic AI can help you prepare for this and other upcoming legislation, schedule a call with our expert team.

Last updated 10 July 2023


DISCLAIMER: This blog article is for informational purposes only. This blog article is not intended to, and does not, provide legal advice or a legal opinion. It is not a do-it-yourself guide to resolving legal issues or handling litigation. This blog article is not a substitute for experienced legal counsel and does not provide legal advice regarding any situation or employer.

