
Massachusetts HD 3051: An Act Preventing a Dystopian Work Environment

Authored by Ayesha Gulley, Policy Product Manager at Holistic AI, and Ashyana-Jasmine Kachra, Policy Associate at Holistic AI
Published on Apr 26, 2023

Recent years have seen multiple harms resulting from the mismanagement of artificial intelligence (AI) systems in human resources (HR) practices. Indeed, there are several examples of these systems going wrong, from Amazon’s scrapped resume screening tool that was biased against female applicants, to Workday being sued for alleged discrimination based on race, age, and disability.

In light of these risks, several policymakers in the US have proposed legislation targeting automated decisions, with some of the latest action being taken by the state of Massachusetts with House Docket 3051, an “Act Preventing a Dystopian Work Environment” (HD 3051). With almost identical text to California’s Workplace Technology Accountability Act, HD 3051 seeks to limit workplace monitoring to only essential job functions, give employees more transparency about how their data is used, and reduce actual and potential harm by requiring data protection and algorithmic impact assessments.

In this blog post, we outline what you need to know about this proposed law, focusing on the requirements for automated decision systems and the rise of transparency in the HR sector.

Key definitions

The Act seeks to regulate four distinct systems:  

  1. An Automated Decision System (ADS), or algorithm, is defined as a computational process, derived from machine learning, statistics, or other data processing or AI techniques, that is used to make or assist an employment-related decision. The output of such a system is any information, data, assumption, prediction, score, recommendation, decision, or conclusion generated by the ADS.
  2. A Worker Information System (WIS) is a process, automated or not, that involves worker data, including its collection, recording, organization, structuring, storage, alteration, retrieval, consultation, use, sharing, disclosure, dissemination, combination, restriction, erasure, or destruction. A WIS does not include an ADS.
  3. A Productivity System is a management system that monitors, evaluates, or sets the amount and quality of work done by workers in a set time period.
  4. Electronic Monitoring is defined as the collection of information concerning worker activities or communications by any means other than direct observation, including the use of a computer, telephone, wire, radio, camera, or electromagnetic, photoelectronic, or photo-optical system.

Who does Massachusetts HD 3051 apply to?

The Bill applies to workplaces in Massachusetts, defined as a “location within Massachusetts at which or from which a worker performs work for an employer”. Employers that operate from a Massachusetts workplace and collect data about their workers, use electronic monitoring, or use ADS tools to make employment-related decisions are within the scope of the legislation, along with vendors acting on their behalf. Under the proposed law, an employer is any person who directly or indirectly employs, or has control over the wages, benefits, compensation, hours, working conditions, or access to work of, any worker, including contractors.

What is required?

Providing Notice – Data Collection

HD 3051 outlines several requirements for when worker data is collected, stored, and used by an employer, the first of which is to provide notice. The notice must state the type of data being collected, the purpose for collecting it, and how it will be used to make decisions. It must also be “clear and conspicuous”; it cannot simply state that an ADS has been used.

Within 10 business days of notifying workers, an employer or a vendor acting on its behalf must notify the Department of Labor and Workforce Development that an ADS has been used.
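
The bill does not say how employers should track this deadline, but the business-day arithmetic is easy to make concrete. Below is a minimal Python sketch, assuming a simple Monday-to-Friday definition of business days and ignoring public holidays (which the bill does not address):

```python
from datetime import date, timedelta

def agency_notice_deadline(worker_notice_date: date) -> date:
    """Last day to notify the Department of Labor and Workforce
    Development: 10 business days after workers were notified.
    Business days are counted as Monday-Friday; public holidays
    are ignored for simplicity."""
    deadline = worker_notice_date
    business_days = 0
    while business_days < 10:
        deadline += timedelta(days=1)
        if deadline.weekday() < 5:  # Monday=0 ... Friday=4
            business_days += 1
    return deadline

# Example: workers notified on Friday, Apr 28, 2023
print(agency_notice_deadline(date(2023, 4, 28)))  # 2023-05-12
```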

Providing Notice – Electronic Monitoring

Under HD 3051, employers, or vendors acting on their behalf, that plan to electronically monitor workers must give notice of the planned activity. As above, the notice must be clear and conspicuous. Notification requirements concerning electronic monitoring include the following (a sketch of how these disclosures might be recorded follows the list):

  • A description of the purpose and the specific activities (locations, communications, and job roles) that will be electronically monitored;
  • A description of the technologies used to conduct the specific form of electronic monitoring and the worker data that will be collected as part of the monitoring;
  • Whether the data gathered through electronic monitoring will be used to make or inform an employment-related decision, and if so, the nature of that decision, including any associated benchmarks;
  • Whether the data gathered through electronic monitoring will be used to assess workers’ productivity or to set productivity standards, and if so, how;
  • The names of any vendors conducting electronic monitoring on the employer’s behalf;
  • A description of any vendor or third party to whom information collected through electronic monitoring will be disclosed or transferred, including the vendor’s name and the purpose of the transfer;
  • A description of the dates, times, and frequency of electronic monitoring;
  • A description of where the data will be stored and how long it will be retained.
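
To make this checklist concrete, the record below sketches how a compliance team might capture the required disclosures. The field names are our own invention; HD 3051 specifies the content of the notice, not any particular data format.

```python
from dataclasses import dataclass

@dataclass
class MonitoringNotice:
    """Illustrative record of the disclosures HD 3051 would require
    before electronic monitoring begins (hypothetical field names)."""
    purpose: str                        # why monitoring is conducted
    monitored_activities: list[str]     # locations, communications, job roles
    technologies: list[str]             # tools used to monitor
    data_collected: list[str]           # worker data gathered
    informs_employment_decision: bool   # used to make/inform a decision?
    decision_description: str | None    # nature of decision, incl. benchmarks
    assesses_productivity: bool         # used to set/assess productivity?
    productivity_description: str | None
    vendors: list[str]                  # vendors monitoring on employer's behalf
    data_recipients: list[str]          # third parties and transfer purposes
    schedule: str                       # dates, times, and frequency
    storage_location: str               # where the data is stored
    retention_period: str               # how long the data is retained
```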

Impact Assessments

Employers that develop, procure, use, or implement an ADS or a WIS are required to complete an algorithmic impact assessment (AIA) or a data protection impact assessment (DPIA), respectively. Under HD 3051, an impact assessment must be completed before the system is used, or retroactively for systems already in use when the legislation comes into effect, and must be conducted by an independent assessor with relevant experience and an understanding of the system.

AIAs aim to evaluate the potential risks posed by an ADS. These include discrimination against protected classes, violations of legal rights, direct or indirect physical or mental harms (for algorithmic systems), and privacy harms (for worker information and algorithmic systems). Assessors should also identify whether a system could have a chilling effect on workers exercising their legal rights, or a negative economic or material impact on workers. They must also assess whether a system produces errors (false positives and false negatives) and whether it has the potential to infringe on workers’ dignity and autonomy.
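
HD 3051 leaves the assessment methodology to the assessor, but the error analysis it calls for can be illustrated directly. The following sketch, a pure-Python illustration rather than any prescribed method, tallies false positive and false negative rates of an ADS per worker group so an assessor can compare them:

```python
from collections import defaultdict

def error_rates_by_group(records):
    """Tally per-group false positive and false negative rates of an
    ADS. `records` is an iterable of (group, actual, predicted)
    tuples, where actual and predicted are 0/1 outcomes."""
    counts = defaultdict(lambda: {"fp": 0, "fn": 0, "pos": 0, "neg": 0})
    for group, actual, predicted in records:
        c = counts[group]
        if actual == 1:
            c["pos"] += 1
            c["fn"] += predicted == 0  # system missed a true positive
        else:
            c["neg"] += 1
            c["fp"] += predicted == 1  # system flagged a true negative
    return {
        group: {
            "false_positive_rate": c["fp"] / c["neg"] if c["neg"] else None,
            "false_negative_rate": c["fn"] / c["pos"] if c["pos"] else None,
        }
        for group, c in counts.items()
    }

# Example: compare two groups on a handful of decisions
sample = [("A", 1, 1), ("A", 0, 1), ("B", 1, 0), ("B", 0, 0)]
print(error_rates_by_group(sample))
```

Large gaps between groups on either rate are exactly the kind of finding an AIA is meant to surface.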

DPIAs evaluate the potential risks of a WIS. As with an AIA, these include discrimination against protected classes, privacy harms such as invasive or offensive surveillance, infringement upon the dignity and autonomy of workers, and negative economic impacts. Employers are required to describe the methodology used to evaluate the identified risks, along with recommended mitigation measures.

Both AIAs and DPIAs must be conducted by an independent assessor with relevant experience.

[Table: required contents of an Algorithmic Impact Assessment (AIA) and a Data Protection Impact Assessment (DPIA)]

What rights do workers have concerning their data?

Requests for Information

Ensuring that workers can access the data held about them, and can request that it be updated, is important to upholding transparency. Employers that collect, store, analyze, interpret, or use worker data must provide information to workers, upon request, in an accessible format and at no additional cost. Under the proposed legislation, workers can request information about the types of data an employer holds about them, the source of that data, whether it is used as an input to or output of an ADS, and how it relates to the job function in question. This information should be accurate and kept up to date.

Right to correct

Workers have the right to correct any inaccurate worker data that an employer maintains.

Right to access

HD 3051 would require employers to provide, upon receipt of a verifiable request: the categories and pieces of worker data retained; the purpose and sources of data collection; whether the data is related to the worker’s essential job functions or employment decisions; whether the data is used in an automated decision system; and the names of any third parties from whom the data is obtained or to whom it is disclosed.
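
One way to picture this obligation is as a projection over an employer’s data inventory. The sketch below is purely illustrative, assuming a hypothetical inventory of per-item records keyed by worker; the field names are ours, not the bill’s:

```python
def access_request_response(inventory: list[dict], worker_id: str) -> list[dict]:
    """Assemble the disclosures HD 3051 would require in response to a
    verified access request, from a hypothetical per-item inventory."""
    required_fields = (
        "category",            # category of worker data retained
        "purpose",             # purpose of collection
        "source",              # source of the data
        "essential_function",  # relation to essential job functions/decisions
        "used_in_ads",         # whether it feeds an automated decision system
        "third_parties",       # parties data is obtained from or disclosed to
    )
    return [
        {field: item.get(field) for field in required_fields}
        for item in inventory
        if item.get("worker_id") == worker_id
    ]
```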

Notice

As noted above, within 10 business days of notifying workers, an employer or a vendor acting on its behalf must notify the Department of Labor and Workforce Development that an ADS has been used. The notice to workers must state the type of data being collected, the purpose for collecting it, and how it will be used to make decisions. In the case of electronic monitoring, workers must be given clear and conspicuous notice of the planned activity.

The rise of transparency against “dystopian environments”

The use of AI in hiring and firing decisions is becoming more prevalent. A recent survey of 300 HR leaders at US companies found that 98% of respondents planned to use algorithms to make layoff decisions in 2023, a trend that raises concerns about the potential for opaque metrics to inadvertently harm minority groups. As AI becomes increasingly pervasive in the workplace, the need for transparency has become more pronounced. Without transparency, there is a risk of creating dystopian environments where AI systems make decisions that do not account for individual circumstances. As such, there is a growing push for greater transparency in AI to ensure that it is used ethically, aligned with organizational goals, and does not cause unintended harm to individuals or groups.

AI regulation is ramping up. To prevent liability, businesses must manage AI risks and implement safeguards such as impact assessments. Early action is key to complying with legal requirements and ensuring the responsible use of algorithms. To find out more about how Holistic AI can help you with this, get in touch at we@holisticai.com.

DISCLAIMER: This blog article is for informational purposes only. This blog article is not intended to, and does not, provide legal advice or a legal opinion. It is not a do-it-yourself guide to resolving legal issues or handling litigation. This blog article is not a substitute for experienced legal counsel and does not provide legal advice regarding any situation or employer.
