
Access Board’s Preliminary Findings on AI and People with Disabilities

January 21, 2025

This month, the United States Access Board presented its preliminary findings on the risks and benefits of AI for people with disabilities (PWD). The overall goal of these recommendations is to make AI more inclusive, transparent, and responsible, ensuring that people with disabilities are both protected and empowered in AI-driven environments.

The Access Board is an independent federal agency that promotes equality for people with disabilities, including through the development of accessibility guidelines and standards. Created in 1973 to ensure access to federally funded facilities, the Access Board is now a leading source of information on accessible design. In 2023, a presidential executive order tasked the Access Board with examining the safe, secure, and trustworthy development and use of AI.


CREATE leadership and other experts consulted

The Access Board consulted with CREATE leadership on best practices and worked with the Center for Democracy and Technology and the American Association of People with Disabilities to create recommendations for federal agencies. They held information sessions and public hearings, and gathered written public comments via the docket on Regulations.gov.

Summary of Access Board recommendations

To minimize bias, include PWD in all stages of AI development

AI systems must be trained on diverse datasets that include people with disabilities. This includes hiring people with disabilities in technology roles and continuously monitoring and assessing AI's impact on PWD.

Federal agencies administering benefits programs (like Social Security and Medicare/Medicaid) should engage with PWD to gather input on AI tools, conduct audits, and ensure that AI systems work properly before being fully implemented. A phased approach should be used for deploying algorithmic tools to ensure they are tested and meet the needs of people with disabilities.

Ensure that AI employment tools do not violate the rights of PWD

The Equal Employment Opportunity Commission (EEOC) should update its guidance to protect employees, consistent with the Americans with Disabilities Act (ADA). Employers must disclose when AI tools are used so that workers can request accommodations or opt out if necessary. Employers should also provide reasonable accommodations for interactions with AI.

Related and supporting research from CREATE

Bias in AI-Enhanced Hiring Systems – a project of the CREATE RERC on real-world impacts on disabled job seekers.

Tracking How People with Disabilities Use GAI Over Time – a CREATE RERC longitudinal study on the most significant ableism, privacy, and security risks created by GAI use.

Not just a wheelchair: Disability representation in AI – an investigation of how AI represents people with disabilities, examining whether AI-produced images and image descriptions perpetuated bias or showed positive portrayals of disability.

CREATE researchers find ChatGPT biased against resumes that imply disability, models improvement – research that studies how GAI can replicate and amplify real-world biases, such as those against disabled people.

More CREATE research on AI and machine learning

Privacy and transparency are essential

People need to know when AI is being used so they can assess its impact and decide whether to continue using it.

Federally funded hospitals should avoid using AI tools that replace human healthcare providers

In critical areas, like health monitoring, AI systems can miss vital information. In particular, AI systems used in healthcare need to be monitored for gender and disability bias.

There is a strong call for federal oversight of AI's use in benefits administration, especially for programs like Medicaid, which affect millions of people with disabilities. State agencies must also follow federal guidelines and ensure responsible AI procurement to prevent adverse impacts.

Of particular concern: “Bossware”

The Access Board heard particular concerns about employee surveillance tools and how they may not be calibrated for people with disabilities. "Bossware" technologies include devices that measure driver fatigue, posture and limb trackers, and wearables such as rings worn by employees to monitor their movement and other biodata.