Candid photo of James Fogarty smiling. He is a white man with, in this photo, neatly combed short brown hair.

James Fogarty, Associate Director

My broad research interests are in Human-Computer Interaction, User Interface Software and Technology, and Ubiquitous Computing. My focus is on developing, deploying, and evaluating new approaches to the human obstacles surrounding widespread everyday adoption of ubiquitous sensing and intelligent computing technologies.

Affiliations

Professor, Allen School of Computer Science & Engineering

Research highlights

Large-Scale Android Accessibility Analyses

Fogarty’s research group is leading the largest-known open analyses of the accessibility of Android apps, providing new understanding of the current state of mobile accessibility and new insights into factors in the ecosystem that contribute to accessibility failures (ASSETS 2017, ASSETS 2018, TACCESS 2020). For example, our analyses found that 45% of apps are missing screenreader labels for more than 90% of their image-based buttons, leaving much of the functionality of those apps inaccessible to many people. Such results also highlight that pervasive accessibility failures require continued research and new approaches to addressing contributing factors in the technology ecosystem. Our analyses of common failure scenarios have directly led to Google improvements in the accessibility ecosystem (e.g., corrections to Android documentation code snippets that were inaccessible, thus creating many accessibility failures as such snippets were used in apps) and motivated additional research (e.g., our ongoing work on developer tools that better scaffold developer learning about how to correctly apply accessibility metadata).
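
For readers unfamiliar with this metadata, the short Kotlin sketch below illustrates the kind of label these analyses check for. The contentDescription API is standard Android; the activity, view id, and string resource are hypothetical placeholders, not code from the analyzed apps.

    // Minimal sketch (the activity, view id, and string resource are hypothetical).
    // An image-based button is announced meaningfully by a screenreader such as
    // TalkBack only if it carries a label in its accessibility metadata.
    import android.os.Bundle
    import android.widget.ImageButton
    import androidx.appcompat.app.AppCompatActivity

    class ComposeMessageActivity : AppCompatActivity() {
        override fun onCreate(savedInstanceState: Bundle?) {
            super.onCreate(savedInstanceState)
            setContentView(R.layout.activity_compose_message)

            val sendButton = findViewById<ImageButton>(R.id.send_button)

            // Without this line (or an equivalent android:contentDescription in the
            // layout XML), TalkBack announces the button as "unlabeled", which is
            // the failure counted in the analyses above.
            sendButton.contentDescription = getString(R.string.send_message_label)
        }
    }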

Runtime Mobile Accessibility Repair and Enhancement

Fogarty’s research group is developing new techniques for runtime repair and enhancement of mobile accessibility. Key to these approaches is a new ability to support third-party runtime enhancements within Android’s security model and without requiring modification to apps (CHI 2017, UIST 2018). We have applied these approaches to accessibility repair (e.g., techniques that allow social annotation of apps with missing screenreader data) and to enabling entirely new forms of tactile accessibility enhancements (ASSETS 2018). These techniques therefore provide a research basis both for improving current accessibility and for exploring new forms of future accessibility enhancements.
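
The Kotlin sketch below is a hedged illustration of the kind of mechanism such runtime techniques can build on, not the group’s published systems: Android’s AccessibilityService API lets a separate, user-installed service observe another app’s view hierarchy at runtime, within the platform’s security model and without modifying the app, so a repair layer could attach missing labels (e.g., supplied by social annotation). The service class name and the commented-out label lookup are hypothetical.

    import android.accessibilityservice.AccessibilityService
    import android.view.accessibility.AccessibilityEvent
    import android.view.accessibility.AccessibilityNodeInfo

    // Assumes the usual accessibility service declaration and configuration,
    // including permission to retrieve window content.
    class LabelRepairService : AccessibilityService() {

        override fun onAccessibilityEvent(event: AccessibilityEvent) {
            // Re-inspect the active window's node tree whenever its content changes.
            if (event.eventType == AccessibilityEvent.TYPE_WINDOW_CONTENT_CHANGED) {
                rootInActiveWindow?.let { findUnlabeledImageButtons(it) }
            }
        }

        private fun findUnlabeledImageButtons(node: AccessibilityNodeInfo) {
            val isImageButton = node.className?.toString() == "android.widget.ImageButton"
            val isUnlabeled = node.contentDescription.isNullOrEmpty() && node.text.isNullOrEmpty()
            if (isImageButton && isUnlabeled) {
                // A repair layer could look up and announce a crowd-supplied label here,
                // keyed by node.viewIdResourceName (the lookup is omitted in this sketch).
            }
            for (i in 0 until node.childCount) {
                node.getChild(i)?.let { findUnlabeledImageButtons(it) }
            }
        }

        override fun onInterrupt() {
            // Required override; nothing to interrupt in this sketch.
        }
    }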

