January 21, 2025
This month, the United States Access Board presented its preliminary findings and recommendations on the risks and benefits of AI for people with disabilities (PWD). The overall goal of the recommendations is to make AI more inclusive, transparent, and responsible, ensuring that people with disabilities are both protected and empowered in AI-driven environments.
The Access Board is an independent federal agency that promotes equality for people with disabilities, in part by developing accessibility guidelines and standards. Created in 1973 to ensure access to federally funded facilities, the Access Board is now a leading source of information on accessible design. In 2023, a presidential executive order tasked the Board with examining the safe, secure, and trustworthy development and use of AI.
CREATE leadership and other experts consulted
The Access Board consulted with CREATE leadership on best practices and worked with the Center for Democracy and Technology and the American Association of People with Disabilities to create recommendations for federal agencies. They held information sessions and public hearings, and gathered written public comments via the docket on Regulations.gov.
Summary of Access Board recommendations
To minimize bias, include PWD in all stages of AI development
AI systems must be trained on diverse datasets that include people with disabilities. Inclusion also means hiring people with disabilities in technology roles and continuously monitoring and assessing AI's impact on PWD.
Federal agencies administering benefits programs (such as Social Security and Medicare/Medicaid) should engage with PWD to gather input on AI tools, conduct audits, and verify that AI systems work properly before full implementation. Algorithmic tools should be deployed in phases so they can be tested and shown to meet the needs of people with disabilities.
Ensure that AI employment tools do not violate the rights of PWD
The Equal Employment Opportunity Commission (EEOC) should update its guidance to protect employees under the Americans with Disabilities Act (ADA). Employers must disclose when AI tools are used, so workers can request accommodations and opt out if necessary. Employers should also provide reasonable accommodations for interactions with AI.
Related and supporting research from CREATE
Bias in AI-Enhanced Hiring Systems – a CREATE RERC project on the real-world impacts of AI hiring tools on disabled job seekers.
Tracking How People with Disabilities Use GAI Over Time – a CREATE RERC longitudinal study of the most significant ableism, privacy, and security risks created by generative AI (GAI) use.
Not just a wheelchair: Disability representation in AI – an investigation of how AI represents people with disabilities, examining whether AI-produced images and image descriptions perpetuated bias or showed positive portrayals of disability.
CREATE researchers find ChatGPT biased against resumes that imply disability, models improvement – research that studies how GAI can replicate and amplify real-world biases, such as those against disabled people.