CREATE Submits RFI Response on Disability Bias in Biometrics

CREATE, in collaboration with representatives of the Trace Center (people with backgrounds in computer science, policy, and disability studies), submitted a response to the Office of Science and Technology Policy’s request for “Information on Public and Private Sector Uses of Biometric Technologies.” See our full response on arXiv. We summarize the request for information, and our response, below.

What are biometric technologies?

Biometric technologies are computer programs that try to guess something about you. Sensors today can capture your heart rate, fingerprint, and face. They can watch you walk, and even find identifying information in your typing style and speed. Initially, biometrics were designed to identify people; their use has since expanded to guessing emotion, intent, disability status, and much more.

What does the Office of Science and Technology Policy (OSTP) want to know?

The OSTP wants to hear from many different types of people who may have very different wishes for, and worries about, biometric technology. For example, their Request for Information mentions industry, advocacy groups, tribal governments, academic researchers, and the general public.

They are interested in how biometrics may be used and what their benefits are. But they also want to know how to use them well. For example, they ask how biometrics can be made more likely to correctly identify a person, or a person’s emotion, intent, and so on. They also ask about security concerns, for example, whether one person using a biometric system could pretend to be someone they are not. They ask about other possible harms, such as whether biometrics work equally well for all people, or whether they might be used for surveillance, harassment, or other worrisome activities.

Finally, they ask about appropriate governance. Governance refers to rules that might increase the value and safety of biometrics. One example is deciding who should be included in ethical decisions about biometric use. Rules about acceptable uses of biometrics, and ways of preventing problematic uses, are also on this list. Transparency and whether biometrics can be used in court are also mentioned.

What did CREATE have to say about this?

CREATE led a discussion of disability bias and risk in the use of biometric technology.

Ableist assumptions

The benefits of such technologies are similar for people with and without impairments; however, access to them is essential for equitable use. Ableist assumptions built into an application can make it inaccessible even if it meets legal standards. For example, an automatic door may close too fast for some users, or a voice menu may time out. These barriers are avoidable if systems are designed with disabled users in mind.

Biased data

Biometric systems require data: many examples of whatever information they use to make guesses about people. If that data is biased (for example, if it lacks examples from people with disabilities), biometrics are likely to be far less accurate in their guesses for those populations. A person might have atypical or missing limbs and no fingerprint, or walk or speak differently than the system expects, and thus be unable to access services tied to recognition of fingerprints, gait, or voice. This, too, can make biometrics inaccessible. The CREATE response discusses several examples of how these biases can creep into data sets.
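To make this concrete, here is a minimal, hypothetical sketch in Python (our illustration, not part of the CREATE response or any real system): it reports a classifier’s accuracy separately for each group of users rather than as a single overall number. The group names, counts, and accuracies are invented purely for illustration.

# Hypothetical illustration of disaggregated evaluation; all data below is invented.
from collections import defaultdict

def accuracy_by_group(records):
    """records: iterable of (group, prediction_was_correct) pairs."""
    correct, total = defaultdict(int), defaultdict(int)
    for group, ok in records:
        total[group] += 1
        correct[group] += int(ok)
    return {g: correct[g] / total[g] for g in total}

# Toy test set: 95 users the system was trained for, 5 it was not.
records = (
    [("well-represented", True)] * 90 + [("well-represented", False)] * 5
    + [("underrepresented", True)] * 1 + [("underrepresented", False)] * 4
)

overall = sum(ok for _, ok in records) / len(records)
print(f"overall accuracy: {overall:.0%}")        # 91%: looks reassuring
for group, acc in sorted(accuracy_by_group(records).items()):
    print(f"{group}: {acc:.0%}")                 # underrepresented: 20%, well-represented: 95%

The point of the sketch: an overall accuracy of 91% hides the fact that the group that is rare in the data is recognized correctly only 20% of the time. Reporting accuracy separately for disabled and non-disabled users is one simple way to surface this kind of bias.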

Lack of personal agency and privacy

Next, the risks of biometric failures can be higher for people with disabilities. For example, if a disability is rare, this can make data security more difficult, or make it more likely that different people with similar impairments are confused for each other. In addition, it is possible that biometrics might be used to label someone as disabled without their permission, an abuse of personal agency and privacy. A biometric system may also implicitly enforce what it means to be “human” when it fails to recognize a disabled body and then denies access to services as a result.

What solutions did CREATE recommend?

These problems are difficult, but not impossible, to solve.

Include people with disabilities in the design

CREATE’s first recommendation is to ensure that people with disabilities are given the chance to help in the design and assessment of biometric systems. Participatory design, which includes people with disabilities as important stakeholders in the design process, is a good first step. However, true equity will require that people with disabilities can enter the technology workforce so that they can directly build and innovate on such systems. This requires access to higher education programs; access to conferences and events where research and products are discussed, presented, and shared; and accessible tools for programming biometrics. In addition, the disability community needs to be involved in policy and decision making around biometrics.

Set standards for algorithm accessibility

Next, we need new standards for algorithm accessibility, just as we have standards for web page accessibility. These should include expectations about testing with disabled populations and about collecting unbiased data.

Ensure transparency, responsiveness and permission

Additionally, there should be rules about transparency, the ability to override or replace biometric systems when they fail to correctly understand a disabled person’s actions or input, and rules about not abusing biometrics by, for example, detecting disability without permission.

Education: Accessibility and Race

Our Fall CREATE Accessibility Seminar focused on the intersection of race and accessibility. The topic was chosen both for its timeliness and as part of CREATE’s commitment to ensuring that our work is inclusive, starting with educating ourselves about the role of race in disability research and the gaps that exist in the field.

  • A search of the ACM Digital Library for papers that used words like “race,” “disability,” and “Black” turned up very few results. Even when papers discuss both disability and race, the two are often treated separately. For example, some report what percentage of a certain group falls into various categories without considering their intersection. A rare exception is Dr. Christina Harrington, who has spoken directly to this intersection and was kind enough to make a guest appearance at our seminar.

Although we know this is only the first step in our journey toward racial justice, we learned some important things along the way.

  • The research topics we found that included work on both disability and race-related factors ranged more widely than disability research alone, covering transportation, e-government access, hate speech, policing, surveillance, and institutionalization.
  • Guest researchers joined us to share their expertise, including Dr. Christina N. Harrington of DePaul University, on community-based approaches to reconsidering design for marginalized populations; Dr. Karin D. Martin of UW’s Evans School of Public Policy and Governance, a crime policy specialist whose areas of expertise are monetary sanctions, racial disparities in the criminal justice system, and decision-making in the criminal justice context; and Dr. Shari Trewin, IBM Accessibility Manager and Research Lead, on bias in artificial intelligence.

There is an important and growing body of critical literature on the topic. To touch on just a few of the books we read when preparing for the seminar, see DisCrit: Critical Conversations Across Race, Class, & Dis/ability (Connor et al., 2016), Disability Incarcerated (Ben-Moshe et al., 2014), and Disability Visibility (Wong, 2020).

“I appreciated the opportunity to talk about the intersection of accessibility and race because although we talk a lot about accessibility in this research area, we don’t really talk about how race and its intersection with other minority identities plays a huge role in who gets access and for whom technologies are made,” said student Momona Yamagami. “By the end of the seminar, we were sure of one thing only: This is a topic we could not do justice to in a single quarter. There is much more to uncover here, and much work to be done.”

Learn more about the accessibility seminar


Jennifer Mankoff, Founding Co-Director

My research focuses on accessibility and 3D printing. I have led the effort to better understand both clinical and DIY stakeholders in this process, and developed better, more usable tools for production. Together, these can enhance the capabilities and participation of all users in today’s manufacturing revolution.

Affiliations:

Richard E. Ladner Professor, Paul G. Allen School of Computer Science & Engineering

Director, Make4all Lab

Research highlights

Better data sets that capture the varied experience of people with disabilities

Better data sets that capture the varied experience of people with disabilities are crucial to building better accessibility solutions. Mankoff has been involved in multiple pioneering data collection efforts. Most recently, her work capturing fine-grained, longitudinal behavioral data about the experiences of college undergraduates with and without disabilities has allowed her to study the unequal impacts of COVID-19’s changes to society on students with disabilities. She has also collected, and is currently exploring, the first data set containing fine-grained, end-to-end trip data from over 60 people with disabilities, combined with self-reports of successes and failures. In the past, she collected over a year of real-world mouse data from individuals with various impairments, a data set whose size is unparalleled in a community that usually tests ideas on 1-10 individuals in lab settings. With this data, she was able to pioneer pixel-based analysis methods that could improve on standard accessibility APIs, raising the accuracy of identifying on-screen targets from 75% to 89%; demonstrate the huge variability within a single user and among many users with impairments that affect desktop computer use; and develop classifiers that could dynamically determine a user’s pointing ability with 92% accuracy from a single sample.

Better understanding of clinical and DIY accessible technology production

The advent of consumer-grade fabrication technology, most notably low-cost 3D printing, has opened the door to increasing power and participation in do-it-yourself and do-for-others accessible technology production. However, such production faces challenges not only at the level of process and policy, but also with respect to materials, design tools, and follow-up. As summarized in a 2019 Communications of the ACM article, Mankoff has led the effort to better understand both clinical and DIY stakeholders in this process, and has developed better, more usable tools for production. Together, these can enhance the capabilities and participation of all users in today’s manufacturing revolution.

AccessSIGCHI directorship

Mankoff is the long-time director of AccessSIGCHI, the national group that has helped to improve conference accessibility in one of ACM’s largest professional groups, and is working collaboratively to help set standards and document best practices for use across ACM.

