Ability-Based Design Mobile Toolkit: Developer Support for Runtime Interface Adaptation

Despite significant progress, most apps remain oblivious to their users’ abilities. To enable apps to respond to users’ situated abilities, CREATE researchers developed the Ability-Based Design Mobile Toolkit (ABD-MT). ABD-MT integrates with an app’s input handling and sensors to observe a user’s touches, gestures, physical activities, and attention at runtime; it measures and models these abilities and adapts the interface accordingly.

With the goal of optimizing user interfaces to better suit users’ abilities, earlier systems attempted to create the “optimal” user interface (UI) for a specific user’s measured abilities. Instead of auto-generating UIs or building custom designs into the UI, ABD-MT lets developers code a set of rules that apply appropriate UI changes in real time, based on observed user abilities.
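The article does not show ABD-MT’s actual API, but the rule-based approach might look something like the following minimal Kotlin sketch, in which `AbilitySnapshot`, `AdaptationRule`, and `applyRules` are invented names for illustration:

```kotlin
// Sketch of rule-based UI adaptation in the spirit of ABD-MT.
// All names here are invented for illustration; they are not the toolkit's API.

data class AbilitySnapshot(
    val tremorLevel: Double,   // 0.0 = steady hand, 1.0 = severe tremor
    val isWalking: Boolean,
    val touchErrorRate: Double // fraction of recent touches that missed their target
)

// A rule pairs a condition on observed abilities with a UI change to apply.
class AdaptationRule(
    val condition: (AbilitySnapshot) -> Boolean,
    val adapt: () -> Unit
)

// Apply every rule whose condition holds for the current snapshot.
fun applyRules(snapshot: AbilitySnapshot, rules: List<AdaptationRule>) {
    rules.filter { it.condition(snapshot) }.forEach { it.adapt() }
}

fun main() {
    val rules = listOf(
        AdaptationRule({ it.tremorLevel > 0.5 }) { println("enlarge touch targets") },
        AdaptationRule({ it.isWalking }) { println("simplify layout") },
        AdaptationRule({ it.touchErrorRate > 0.2 }) { println("enable touch-offset correction") }
    )
    applyRules(AbilitySnapshot(tremorLevel = 0.7, isWalking = true, touchErrorRate = 0.1), rules)
    // prints: enlarge touch targets, simplify layout
}
```

Keeping the conditions and the UI changes as separate developer-authored pieces, rather than generating whole interfaces, is what distinguishes this approach from earlier “optimal UI” systems.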

ABD-MT architecture. (1) The Observers capture and record user touch, gesture, activity, and attention behaviors; (2) the Ability Modeler models and reasons about the user’s abilities from this observed behavior; (3) the UI Adapter allows developers to specify adaptations based on these abilities.
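To make the three-stage architecture concrete, here is a hypothetical Kotlin sketch of the observe–model–adapt loop; the interface names echo the caption’s terms but are assumptions, not the toolkit’s real signatures:

```kotlin
// Hypothetical sketch of the three-stage pipeline described in the caption.
// These interfaces are assumptions for illustration, not ABD-MT's real API.

// (1) An Observer captures and records raw behavior events (touches, gestures,
//     activity, attention) from the app's input handlers and sensors.
interface Observer<E> {
    fun recentEvents(): List<E>
}

// (2) An AbilityModeler turns the observed events into an ability estimate.
interface AbilityModeler<E, A> {
    fun model(events: List<E>): A
}

// (3) A UiAdapter applies the developer-specified adaptations for that estimate.
interface UiAdapter<A> {
    fun adapt(abilities: A)
}

// One observe-model-adapt cycle; an app would rerun this as abilities change.
fun <E, A> runCycle(observer: Observer<E>, modeler: AbilityModeler<E, A>, adapter: UiAdapter<A>) {
    adapter.adapt(modeler.model(observer.recentEvents()))
}
```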

Examples include UI widgets that respond to tremor, screen layouts that respond to walking and running, text editors that respond to fat fingers, and screen brightness that responds to attention switching.

Example use cases of ABD-MT: UI widgets of a password-entry interface that respond to tremor, screen layouts of a Pokémon game that respond to walking, message notifications that respond to running, a text editor that responds to fat fingers, and screen brightness that responds to attention switching. Using ABD-MT, a developer can make their application aware of and responsive to a user’s abilities at runtime.
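As one illustration, the tremor scenario above could be approximated along the lines of this Kotlin sketch; the jitter metric, threshold, and sizes are invented for illustration and are not ABD-MT’s actual model:

```kotlin
import kotlin.math.hypot

// Illustrative sketch of the tremor use case: grow password-entry buttons when
// touch jitter is high. Metric, threshold, and dp sizes are assumptions.

data class TouchSample(val x: Float, val y: Float)

// Crude tremor estimate: mean distance of repeated touches from their centroid.
fun tremorScore(samples: List<TouchSample>): Double {
    if (samples.isEmpty()) return 0.0
    val cx = samples.map { it.x }.average()
    val cy = samples.map { it.y }.average()
    return samples.map { hypot(it.x - cx, it.y - cy) }.average()
}

// Pick a touch-target size: larger buttons under higher tremor.
fun buttonSizeDp(samples: List<TouchSample>): Int =
    if (tremorScore(samples) > 12.0) 72 else 48

fun main() {
    val jittery = listOf(TouchSample(100f, 100f), TouchSample(130f, 82f), TouchSample(88f, 126f))
    println("use ${buttonSizeDp(jittery)} dp buttons") // prints "use 72 dp buttons"
}
```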

The research team includes CREATE Ph.D. students Junhan (Judy) Kong and Mingyuan Zhong, and CREATE associate directors James Fogarty and Jacob O. Wobbrock. They will present their research at the MobileHCI Conference (the ACM International Conference on Mobile Human-Computer Interaction) in Melbourne, Australia, in early October.