On 19 September, the New York City (NYC) Department of Consumer and Worker Protection (DCWP) published proposed amendments to Local Law 144, which mandates independent bias audits of ‘automated employment decision tools’ used by employers or employment agencies.
The law was enacted by the NYC Council in November 2021. It applies to any U.S. company that uses AI tools to screen, hire or promote candidates or employees residing in NYC. Read our FAQs on the law to learn more.
The DCWP’s proposed amendments have not yet been enacted and are open for consultation until 24 October, when there will be a public hearing on the matter. Information about the amendments and how to submit comments and participate in the hearing can be found here.
The proposed amendments and consultation demonstrate that the NYC authorities are prioritising the implementation and enforcement of this legislation.
The law still takes effect on 1 January 2023. (Update: enforcement of Local Law 144 has since been pushed back to 5 July 2023.) After this date, the use of an automated employment decision tool is prohibited unless the tool has been independently audited for bias.
The DCWP’s amendments provide practical guidance for businesses and introduce new compliance requirements. The key proposed changes are outlined below.
The legislation requires that bias audits must be impartial evaluations conducted by an independent auditor.
The proposed amendments further clarify that an ‘independent auditor’ must be a person or group that is not involved in using or developing the automated employment decision tool being audited.
This means that a tech vendor that develops and supplies an automated tool to an employer cannot conduct the bias audit themselves.
Although the employer or employment agency is ultimately liable, vendors that supply them will likely benefit from commissioning bias audits of their automated tools. Given their new legal obligations, employers may also insist that vendors commission bias audits before they purchase a tool.
The legislation defines ‘automated employment decision tools’ as those which “substantially assist or replace discretionary decision making” about hiring candidates or promoting employees.
The proposed amendments further clarify the limited set of situations in which a tool is considered to do this.
An automated tool’s output is defined as a prediction or classification, which can be expressed as a score, tag, categorisation or ranking.
The legislation already specifies that a bias audit must include the testing of an automated tool to determine whether it adversely affects one category (i.e., race/ethnicity or sex) over another.
The proposed amendments provide additional information on exactly how this should be done.
The bias audit must include a calculation of the ‘selection rate’ and ‘impact ratio’ for different races/ethnicities and sexes.
The ‘selection rate’ is the proportion of individuals in a category who are selected to move forward in the hiring or promotion process.
To determine whether a system results in disparate impact, auditors must also calculate its impact ratio, and the proposed amendments set out two ways this can be measured.
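To make the arithmetic concrete, here is a minimal sketch of these calculations over a handful of hypothetical applicant records. It assumes the selection-based formulation of the impact ratio, in which each category’s selection rate is divided by the selection rate of the most-selected category; the data, category labels and function names are illustrative only and are not taken from the proposed rules.

```python
from collections import defaultdict

# Illustrative applicant records: (race/ethnicity, sex, selected to advance).
# The categories and figures are hypothetical and purely for illustration;
# they are not prescribed by Local Law 144 or the proposed rules.
applicants = [
    ("Hispanic or Latino", "Female", True),
    ("Hispanic or Latino", "Male", False),
    ("White", "Female", True),
    ("White", "Male", True),
    ("Black or African American", "Female", False),
    ("Black or African American", "Male", True),
]

def selection_rates(records, key_index):
    """Selection rate per category: number selected divided by number assessed."""
    totals, selected = defaultdict(int), defaultdict(int)
    for record in records:
        category = record[key_index]
        totals[category] += 1
        selected[category] += int(record[2])
    return {category: selected[category] / totals[category] for category in totals}

def impact_ratios(rates):
    """Impact ratio per category: its selection rate divided by the selection
    rate of the most-selected category (the selection-based formulation)."""
    highest = max(rates.values())
    return {category: rate / highest for category, rate in rates.items()}

sex_rates = selection_rates(applicants, key_index=1)    # rates by sex
race_rates = selection_rates(applicants, key_index=0)   # rates by race/ethnicity
print(impact_ratios(sex_rates))
print(impact_ratios(race_rates))
```

In practice, an auditor would run these calculations over the full set of candidates or employees assessed by the tool during the audit period.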
In the bias audit examples described by the DCWP in the proposed amendments, the calculations are also made for intersectional groupings (i.e., selection rates and impact ratios are provided for different races/ethnicities according to their sex). This intersectionality enables deeper insight into the outcomes of different sub-groups.
Despite this, the proposed amendments do not appear to mandate calculations on intersectional groupings.
However, as these rules are explicitly labelled as the minimum requirements for a bias audit, businesses that go above and beyond and take a more granular, in-depth approach are likely to be viewed more favourably by enforcement authorities, courts and other stakeholders.
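For organisations that do choose to report intersectional results, the hypothetical sketch above extends naturally by grouping on race/ethnicity and sex together; it reuses the illustrative applicants records and impact_ratios helper defined there.

```python
from collections import defaultdict

# Intersectional extension of the sketch above: selection rates and impact
# ratios for each race/ethnicity-by-sex sub-group, mirroring the intersectional
# tables in the DCWP's example audits. Reuses the hypothetical `applicants`
# records and the `impact_ratios` helper defined earlier.
totals, selected = defaultdict(int), defaultdict(int)
for race, sex, was_selected in applicants:
    group = (race, sex)                      # e.g. ("White", "Female")
    totals[group] += 1
    selected[group] += int(was_selected)

intersectional_rates = {group: selected[group] / totals[group] for group in totals}
print(impact_ratios(intersectional_rates))
```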
Under the new proposals, after a bias audit is conducted, employers and employment agencies will have to publish a summary of its results.
This information must be visibly posted and easily accessible on the organisation’s website for at least 6 months after the relevant tool was last used. If the tool continues to be used, the summary of results must continue to be publicly available.
The requirement to publish such granular detail about the employment outcomes for different demographic categories exposes employers to damaging reputational and legal risks.
The legislation also requires employers or employment agencies to give at least ten business days’ notice to candidates or employees prior to using an automated tool to make decisions about hiring or promotion. No exceptions or derogations are given to this requirement.
The legislation already stipulates that the notice must include the job qualifications and characteristics being assessed by the automated tool.
The proposed amendments further clarify how this notice should be given to candidates or employees, setting out three options for employers.
During this notice period, individuals can request the use of an alternative selection method or another accommodation. The proposed amendments clarify that the notice must include information about how to make such a request.
However, the amendments also explicitly state that employers and employment agencies are not required to provide an alternative selection process or accommodation.
It is therefore up to employers whether to change their processes for candidates and employees who do not wish to be assessed by an automated tool.
DISCLAIMER: This blog article is for informational purposes only. This blog article is not intended to, and does not, provide legal advice or a legal opinion. It is not a do-it-yourself guide to resolving legal issues or handling litigation. This blog article is not a substitute for experienced legal counsel and does not provide legal advice regarding any situation or employer.