Despite significant progress, most apps remain oblivious to their users’ abilities. To enable apps to respond to users’ situated abilities, CREATE researchers developed the Ability-Based Design Mobile Toolkit (ABD-MT). ABD-MT integrates with an app’s input events and sensors to observe a user’s touches, gestures, physical activities, and attention at runtime, measuring and modeling these abilities so that interfaces can adapt accordingly.
Earlier systems pursued the same goal of suiting users’ abilities by attempting to generate the “optimal” user interface (UI) for a specific user’s measured abilities. Instead of auto-generating UIs or hard-coding custom designs, ABD-MT lets developers write a set of rules that apply appropriate UI changes in real time, based on observed user abilities.
Examples include UI widgets that respond to tremor, screen layouts that respond to walking and running, text editors that respond to fat fingers, and screen brightness that responds to attention switching.
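To make the rule-based approach concrete, here is a minimal sketch of what a developer-written adaptation rule might look like. The names and types below (AbilitySnapshot, UiSettings, AdaptationRule, and their fields) are hypothetical illustrations, not the actual ABD-MT API; the sketch only shows the general pattern of mapping observed abilities to UI changes.

```kotlin
// Hypothetical sketch of developer-written adaptation rules.
// None of these names come from ABD-MT; they are illustrative only.

/** A snapshot of abilities observed at runtime (illustrative fields). */
data class AbilitySnapshot(
    val tremorLevel: Double,      // 0.0 (steady hand) to 1.0 (severe tremor)
    val isWalking: Boolean,       // inferred from motion sensors
    val attentionOnScreen: Boolean
)

/** The UI properties a rule is allowed to adjust (illustrative). */
data class UiSettings(
    var minTouchTargetDp: Int = 48,
    var singleColumnLayout: Boolean = false,
    var screenDimmed: Boolean = false
)

/** A rule maps observed abilities to a UI change. */
fun interface AdaptationRule {
    fun apply(abilities: AbilitySnapshot, ui: UiSettings)
}

// Developers register rules; a toolkit like ABD-MT would invoke them
// whenever its ability model changes.
val rules = listOf(
    // Enlarge touch targets when tremor is detected.
    AdaptationRule { a, ui -> if (a.tremorLevel > 0.5) ui.minTouchTargetDp = 72 },
    // Simplify the layout while the user is walking.
    AdaptationRule { a, ui -> ui.singleColumnLayout = a.isWalking },
    // Dim the screen when attention has shifted away.
    AdaptationRule { a, ui -> ui.screenDimmed = !a.attentionOnScreen }
)

fun main() {
    val observed = AbilitySnapshot(tremorLevel = 0.7, isWalking = true, attentionOnScreen = false)
    val ui = UiSettings()
    rules.forEach { it.apply(observed, ui) }
    println(ui)  // UiSettings(minTouchTargetDp=72, singleColumnLayout=true, screenDimmed=true)
}
```

The point of the pattern is that the developer keeps full control of the design: the rules encode the app’s own adaptations, while the toolkit supplies the runtime observations that trigger them.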
The research team includes CREATE Ph.D. students Junhan (Judy) Kong and Mingyuan Zhong, and CREATE associate directors James Fogarty and Jacob O. Wobbrock. They will present their research at the MobileHCI Conference (the ACM International Conference on Mobile Human-Computer Interaction) in Melbourne, Australia, in early October.