Mobile 3D printer can autonomously add accessibility features to a room

October 29, 2024

From accessibility upgrades to a custom cat-food bowl, a prototype mobile 3D printer is being used to change the built environment and tailor spaces for people's needs or style preferences.

Built on a modified consumer vacuum robot, MobiPrint can automatically measure a room and print objects onto its floor to add accessibility features, home customizations, or artistic flourishes to the space. The prototype was built by a research team in the Makeability Lab. The team, led by CREATE Ph.D. student Daniel Campos Zamora and CREATE associate director Jon E. Froehlich, built a graphical interface that lets users design objects and place them in the space the robot has mapped.

The team recently presented its work at the ACM Symposium on User Interface Software and Technology (UIST) in Pittsburgh. Read their research paper, MobiPrint: A Mobile 3D Printer for Environment-Scale Design and Fabrication.

Pushing 3D printers to do more

Today’s 3D printers make it fairly easy to fabricate a chess set, for example. But these printers are largely fixed in place. So if someone wants to add 3D-printed elements to a room — a footrest beneath a desk, for instance — the project gets more difficult. The space must be measured. The objects must then be scaled, printed elsewhere, and fixed in the right spot. Handheld 3D printers exist, but they lack accuracy and come with a learning curve.

“How can we push [digital fabrication] further and further into the world, and lower the barriers for people to use it? How can we change the built environment and tailor spaces for people’s specific needs — for accessibility, for taste?”

–Daniel Campos Zamora, lead author and CREATE Ph.D. student in the Allen School

How it works

The prototype system can add accessibility features, such as tactile markers for blind and low-vision people. These might provide information, such as text telling conference attendees where to go, or warn of dangers such as staircases. Or it can create a ramp to cover an uneven flooring transition. MobiPrint also allows users to create custom objects, such as small art pieces up to three inches tall.

Before printing an object, MobiPrint autonomously roams an indoor space and uses LiDAR to map it. The team’s design tool then converts this map into an interactive canvas. The user then can select a model from the MobiPrint library — a cat food bowl, for instance — or upload a design. Next, the user picks a location on the map to print the object, working with the design interface to scale and position the job. Finally, the robot moves to the location and prints the object directly onto the floor.
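The map-then-place workflow above amounts to validating a user's placement against the robot's floor map before dispatching a print job. Here is a minimal, hypothetical sketch in Python; the class names, the grid-set map representation, and the height-limit check are all illustrative assumptions, not MobiPrint's actual code (the three-inch figure comes from the article's description of the prototype's art pieces):

```python
# Hypothetical sketch of a MobiPrint-style placement check:
# map -> select model -> scale/position -> dispatch print job.
from dataclasses import dataclass

MAX_HEIGHT_M = 0.076  # roughly three inches, per the prototype description


@dataclass
class PrintJob:
    model: str    # model chosen from the library or uploaded by the user
    x_m: float    # target position on the floor map, in meters
    y_m: float
    scale: float  # uniform scale factor applied to the model


def place_object(model: str, base_height_m: float, x_m: float, y_m: float,
                 scale: float, floor_map: set) -> PrintJob:
    """Validate a placement against the mapped floor area and height limit."""
    if (round(x_m), round(y_m)) not in floor_map:
        raise ValueError("target location is outside the mapped floor area")
    if base_height_m * scale > MAX_HEIGHT_M:
        raise ValueError("scaled object exceeds the printable height")
    # In the real system, the robot would now drive to (x_m, y_m) and print.
    return PrintJob(model, x_m, y_m, scale)
```

In this toy version the LiDAR map is reduced to a set of walkable one-meter grid cells; the actual system works from a richer interactive canvas.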

For printing, the current design uses a bioplastic common in 3D printing called PLA. The researchers are working to have MobiPrint remove objects it’s printed and potentially recycle the plastic. They’re also interested in exploring the possibilities of robots that print on other surfaces (such as tabletops or walls), in other environments (such as outdoors), and with other materials (such as concrete).

“I think about kids out biking or my friends and family members who are in wheelchairs getting to the end of a sidewalk without a curb. It would be so great if in the future we could just send Daniel’s robot down the street and have it build a ramp.”

–Jon E. Froehlich, Director of the Makeability Lab

Photo of Jon Froehlich leaning forward in his seat and smiling effusively. He is a white man with brown hair.

Liang He, an assistant professor at Purdue University who was a doctoral student in the Allen School during this research, is a co-author on the paper.


This article was excerpted from the UW News article by Stefan Milne and the MobiPrint project page.

Hard Mode: Accessibility, Difficulty and Joy for Gamers With Disabilities

Video games often pose accessibility barriers to gamers with disabilities, but there is no standard method for identifying which games have barriers, what those barriers are, and whether and how they can be overcome. CREATE and Allen School Ph.D. student Jesse Martinez has been working to understand the strategies and resources gamers with disabilities regularly use when trying to identify a game to play, and the challenges disabled gamers face in this process, with the hope of advising the games industry on how to better support disabled members of its audience.

Martinez, with CREATE associate directors James Fogarty and Jon Froehlich as project advisors and co-authors, published the team’s findings at the ACM CHI Conference on Human Factors in Computing Systems (CHI 2024).

Martinez will present the paper, Playing on Hard Mode: Accessibility, Difficulty and Joy in Video Game Adoption for Gamers With Disabilities, virtually at the hybrid conference, and will present it in person at UW DUB’s upcoming para.chi.dub event.

Martinez’s passion for this work came from personal experience: as someone who loves playing all kinds of games, he has spent lots of time designing new ways to play games to make them accessible for himself and other friends with disabilities. He also has experience working independently as a game and puzzle designer and has consulted on accessibility for tabletop gaming studio Exploding Kittens, giving him a unique perspective on how game designers create games and how disabled gamers hack them.

First, understand the game adoption process

The work focuses on the process of “Game Adoption,” which spans learning about a new game (game discovery), checking whether it’s a good fit for one’s taste and access needs (game evaluation), and getting set up with the game and making any modifications necessary to improve the overall experience (game adaptation). As Martinez notes in the paper, gamers with disabilities already do work to make gaming more accessible, so it’s very important not to overlook this work when designing new solutions.

To explore this topic, Martinez interviewed 13 people with a range of disabilities and very different sets of access needs. In the interviews, they discussed what each person’s unique game adoption process looked like, where they encountered challenges, and how they would want to see things change to better support their process.

Graphic from the research paper showing the progression from Discovery (finding a game to play), to Evaluation (assessing a game's fit), to Adaptation (getting set up with a game).

Game discovery

In discussing game discovery, the team found that social relationships and online disabled gaming communities were the most valuable resource for learning about new games. Game announcements often don’t come with promises of being accessible. But if a friend suggested a game, it often meant the friend had already considered whether the game had a chance of being accessible. Participants also mentioned that since there is no equivalent to a games store for accessible games, it was sometimes hard to learn about new games. In their recommendations, Martinez suggests game distributors like Steam and Xbox work to support this type of casual browsing of accessible games.

Game evaluation

In discussing game evaluation, the team found that community-created game videos on platforms like YouTube and Twitch were useful for making accessibility judgments. Interestingly, the videos didn’t need to be accessibility-focused, since just seeing how the game worked was useful information. One participant in the study highlights games’ accessibility options menus in their own Twitch streams, and asks other streamers to do likewise, since this information can be tricky to find online.

Game adaptation

Martinez and team discovered many different approaches people took to make a game accessible to them, starting with enabling accessibility features like captions or getting the game to work with their screen reader. Some participants designed their own special tools to make the system work, such as a 3D-printed wrist mount for a gaming mouse. Participants shared that difficulty levels in a game are very important accessibility resources, especially when inaccessibility in the game already made things harder.

The important thing is that players be allowed to choose what challenges they want to face, rather than being forced to play on “hard mode” if they don’t want to.

Other participants discussed how they change their own playstyle to make the game accessible, such as playing as a character who fights with a ranged weapon or who can teleport across parts of the game world. Others went even further, creating their own new objectives in the game that better suited what they wanted from their experience. This included ignoring the competitive part of the racing game Mario Kart to just casually enjoy driving around its intricate worlds, and participating in a friendly roleplaying community in GTA V where they didn’t have to worry about the game’s fast-paced missions and inaccessible challenges.

Overcoming inaccessible games

Martinez uses all this context to introduce two concepts to the world of human-computer interaction and accessibility research: “access difficulty” and “disabled gaming.”

“Access difficulty” is how the authors describe the challenges created in a game specifically due to inaccessibility, which are different from the challenges a game designer intentionally creates to make the game harder. The authors emphasize that the important thing is that players be allowed to choose what challenges they actually want to face, rather than being forced to play on “hard mode” compared to nondisabled players.

“Disabled gaming” acknowledges the particular way gamers with disabilities play games, which is often very different from how nondisabled people play games. Disabled gaming is about taking the game you’re presented and turning it into something fun however you can, regardless of whether that’s what the game designer expects or wants you to do.

Martinez and his co-authors are very excited to share this work with the CREATE community and the world, and they encourage anyone interested in participating in a future study of disabled gaming to join the #study-recruitment channel. If you’re not on CREATE’s Slack, request to join.

ARTennis attempts to help low vision players

December 16, 2023

People with low vision (LV) have had fewer options for physical activity, particularly in competitive sports such as tennis and soccer that involve fast, continuously moving elements such as balls and players. A group of researchers from CREATE associate director Jon E. Froehlich’s Makeability Lab hopes to overcome this challenge by enabling LV individuals to participate in ball-based sports using real-time computer vision (CV) and wearable augmented reality (AR) headsets. Their initial focus has been on tennis.

The team includes Jaewook Lee (Ph.D. student, UW CSE), Devesh P. Sarda (MS/Ph.D. student, University of Wisconsin), Eujean Lee (Research Assistant, UW Makeability Lab), Amy Seunghyun Lee (BS student, UC Davis), Jun Wang (BS student, UW CSE), Adrian Rodriguez (Ph.D. student, UW HCDE), and Jon Froehlich.

Their paper, Towards Real-time Computer Vision and Augmented Reality to Support Low Vision Sports: A Demonstration of ARTennis was published in the 2023 ACM Symposium on User Interface Software and Technology (UIST).

ARTennis is their prototype system capable of tracking and enhancing the visual saliency of tennis balls from a first-person point-of-view (POV). Recent advancements in deep learning have led to models like TrackNet, a neural network capable of tracking tennis balls in third-person recordings of tennis games, which has been used to improve sports viewing for LV people. To enhance playability, the team first built a dataset of first-person POV images by having the authors wear an AR headset and play tennis. They then streamed video from a pair of AR glasses to a back-end server, analyzed the frames using a custom-trained deep learning model, and sent back the results for real-time overlaid visualization.
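The stream-infer-overlay pipeline described above can be sketched as a simple loop: each frame goes to a detector, and sufficiently confident detections are rendered as an overlay. This is a minimal, hypothetical illustration; the `Detection` fields, the `detect` callback, and the confidence threshold are assumptions, not the ARTennis API:

```python
# Illustrative sketch of an ARTennis-style frame loop: stream frames to a
# detector, keep confident ball detections, and render them as an overlay.
from dataclasses import dataclass
from typing import Callable, Iterable, List, Optional


@dataclass
class Detection:
    x: int            # ball center in frame coordinates
    y: int
    confidence: float


def overlay_loop(frames: Iterable[bytes],
                 detect: Callable[[bytes], Optional[Detection]],
                 min_confidence: float = 0.5) -> List[Detection]:
    """Run detection on each streamed frame and collect the detections
    confident enough to display (in the real system, as a crosshair and
    contrast enhancement drawn in AR)."""
    rendered = []
    for frame in frames:
        det = detect(frame)
        if det is not None and det.confidence >= min_confidence:
            rendered.append(det)
    return rendered
```

In the actual prototype the detector runs on a back-end server and the overlay is drawn on the headset in real time; this sketch only shows the filtering logic in between.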

After a brainstorming session with an LV research team member, the team added visualization improvements to enhance the ball’s color contrast and add a crosshair in real-time.

Early evaluations have provided feedback that the prototype could help LV people enjoy ball-based sports, but there’s plenty of further work to be done. A larger field-of-view (FOV) and audio cues would improve a player’s ability to track the ball. The weight, bulk, and expense of the headset are also factors the team expects to improve with time, as Lee noted in an interview on Oregon Public Broadcasting.

“Wearable AR devices such as the Microsoft HoloLens 2 hold immense potential in non-intrusively improving accessibility of everyday tasks. I view AR glasses as a technology that can enable continuous computer vision, which can empower BLV individuals to participate in day-to-day tasks, from sports to cooking. The Makeability Lab team and I hope to continue exploring this space to improve the accessibility of popular sports, such as tennis and basketball.”

Jaewook Lee, Ph.D. student and lead author

Ph.D. student Jaewook Lee presents a research poster, Makeability Lab Demos - GazePointAR & ARTennis.

Off to the Park: A Geospatial Investigation of Adapted Ride-on Car Usage

November 7, 2023

Adapted ride-on cars (ROCs) are an affordable power mobility training tool for young children with disabilities. But weather and a lack of adequate drive space create barriers to families’ adoption of their ROC.

CREATE Ph.D. student Mia E. Hoffman is the lead author on a paper that investigates the relationship between the built environment and ROC usage.

Mia Hoffman smiling into the sun. She has long, blonde hair. Behind her is part of the UW campus with trees and brick buildings.

With her co-advisors Kat Steele and Heather A. Feldner, Jon E. Froehlich (all three CREATE associate directors), and Kyle N. Winfree as co-authors, Hoffman found that play sessions took place more often within the participants’ homes. But when the ROC was used outside, children engaged in longer play sessions, actively drove for a larger portion of the session, and covered greater distances.

Accessibility scores for the sidewalks near a participant’s home (left) and the drive path of the participant (right). The participant generally avoided streets that were not accessible.

Most notably, they found that children drove more in pedestrian-friendly neighborhoods and when in proximity to accessible paths, demonstrating that providing an accessible place for a child to move, play, and explore is critical in helping a child and family adopt the mobility device into their daily life.

Augmented Reality to Support Accessibility

October 25, 2023

RASSAR – Room Accessibility and Safety Scan in Augmented Reality – is a novel smartphone-based prototype for semi-automatically identifying, categorizing, and localizing indoor accessibility and safety issues. With RASSAR, the user holds out their phone and scans a space. The tool uses LiDAR and camera data, real-time machine learning, and AR to construct a real-time model of the 3D scene, attempts to identify and classify known accessibility and safety issues, and visualizes potential problems overlaid in AR. 
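The identify-classify-flag step described above can be illustrated with a toy rule check: compare detected objects in the scanned scene against simple accessibility rules. The object names and rule values below are illustrative assumptions (loosely inspired by common reach-range guidelines), not RASSAR's actual rule set or code:

```python
# Toy sketch of a RASSAR-style check: flag scanned objects whose mounting
# height falls outside a simple reachable range. Values are illustrative.
from typing import List, NamedTuple


class SceneObject(NamedTuple):
    name: str
    height_m: float  # height above the floor, from the 3D scene model


# Illustrative reachable-height range (meters) for wall-mounted controls;
# the real system classifies many more issue types from LiDAR and camera data.
REACH_RANGE = {"light_switch": (0.38, 1.22), "thermostat": (0.38, 1.22)}


def flag_issues(objects: List[SceneObject]) -> List[str]:
    """Return a human-readable issue for each object outside its rule range."""
    issues = []
    for obj in objects:
        if obj.name in REACH_RANGE:
            lo, hi = REACH_RANGE[obj.name]
            if not (lo <= obj.height_m <= hi):
                issues.append(f"{obj.name} at {obj.height_m:.2f} m is outside "
                              f"the {lo:.2f}-{hi:.2f} m reachable range")
    return issues
```

In the prototype, flagged issues like these are what get visualized as AR overlays in the live camera view.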

RASSAR researchers envision the tool as an aid in building and validating new construction, planning renovations, updating homes for health concerns, or conducting telehealth home visits with occupational therapists. UW News interviewed two CREATE Ph.D. students about their work on the project:



CREATE students Xia Su and Jae Lee, advised by CREATE Associate Director Jon Froehlich in the Makeability Lab, discuss their work using augmented reality to support accessibility. The Allen School Ph.D. students are presenting their work at ASSETS and UIST this year.

Illustration of a user holding a smartphone using the RASSAR prototype app to scan the room for accessibility issues.

Jon Froehlich named Outstanding Faculty Member by the UW College of Engineering

Congrats to CREATE Associate Director Jon Froehlich on being selected for the Outstanding Faculty Award by the UW College of Engineering!

As noted by the College, Froehlich went to extraordinary measures to support his students’ learning during the pandemic. He fundamentally transformed physical computing courses for virtual platforms, assembled and mailed hardware kits to students’ homes, and developed interactive hardware diagrams, tutorials and videos. In addition, Froehlich co-created and led a group of university educators to share best practices for remote teaching of computing lab courses.

Jon Froehlich, CREATE Associate Director and Allen School faculty member

As chair of the ASSETS ’22 conference, Froehlich helped ensure the conference is accessible not only to those with physical or sensory disabilities, but also to those with chronic illnesses, caretaking responsibilities, or other commitments that prevent physical travel.

In response to the award, Froehlich noted, “I quite literally could not have done this without [CREATE Founding Co-Directors] Jake and Jen’s mentorship and support.”

This article was excerpted from the UW College of Engineering’s CoE Awards announcement.

CREATE faculty and students awarded at ASSETS 2020

Congratulations to UW CREATE faculty on multiple awards at ASSETS 2020, the International ACM SIGACCESS Conference on Computers and Accessibility!

“The University of Washington has been a leader in accessible technology research, design, engineering, and evaluation for years. This latest round of awards from ACM ASSETS is further testament to the great work being done at the UW. Now, with the recent launch of CREATE, our award-winning faculty and students are brought together like never before, and we are already seeing the great things that come of it. Congratulations to all of this year’s winners.” 

— Prof. Jacob O. Wobbrock, Founding Co-Director, UW CREATE

Best student paper:  
Living Disability Theory: Reflections on Access, Research, and Design
Megan Hofmann, Devva Kasnitz, Jennifer Mankoff, Cynthia L Bennett

Best paper:
Input Accessibility: A Large Dataset and Summary Analysis of Age, Motor Ability and Input Performance 
Leah Findlater, Lotus Zhang
Links: github code repository


Best artifact:

SoundWatch, as described in the paper Exploring Smartwatch-based Deep Learning Approaches to Support Sound Awareness for Deaf and Hard of Hearing Users 
Dhruv Jain, Hung Ngo, Pratyush Patel, Steven Goodman, Leah Findlater, Jon Froehlich
Links: github code repository | presentation video

Read more

SoundWatch smartwatch app alerts d/Deaf and hard-of-hearing users to sounds

October 28, 2020 | UW News

UW CREATE faculty members Jon Froehlich and Leah Findlater have helped develop a smartwatch app for d/Deaf and hard-of-hearing people who want to be aware of nearby sounds. The smartwatch will identify sounds the user is interested in — such as a siren, a water faucet left on, or a bird chirping — and send the user a friendly buzz along with information.
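The classify-and-alert flow described here can be sketched as a small filter-and-format step: only sounds the user subscribed to, recognized with enough confidence, trigger a notification. This is a hypothetical illustration; the function name, threshold, and loudness cutoff are assumptions, not SoundWatch's actual code (the alert format mirrors the example shown in the article's screenshot):

```python
# Hypothetical sketch of SoundWatch-style alerting: filter classifier output
# by the user's chosen sounds and a confidence threshold, then format the
# watch notification text.
from typing import Optional, Set


def format_alert(label: str, confidence: float, level_db: float,
                 subscribed: Set[str],
                 min_confidence: float = 0.5) -> Optional[str]:
    """Return notification text for a recognized sound, or None to stay silent."""
    if label not in subscribed or confidence < min_confidence:
        return None  # no buzz for sounds the user didn't ask about
    loudness = "Loud" if level_db >= 85 else "Quiet"  # illustrative cutoff
    return f"{label}, {confidence:.0%}, {loudness}, {level_db:.0f} dB"
```

For example, a car honk classified at 98% confidence and measured at 101 dB would produce the "Car honk, 98%, Loud, 101 dB" alert pictured below.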

“This technology provides people with a way to experience sounds that require an action… [and] these devices can also enhance people’s experiences and help them feel more connected to the world,” said lead author Dhruv Jain, a UW doctoral student in the Paul G. Allen School of Computer Science & Engineering.

A wrist with a smartwatch on it. The smartwatch has an alert that says "Car honk, 98%, Loud, 101 dB" It also has options to snooze the alert for 10 minutes or open in an app on the user's phone.
The SoundWatch smartwatch app that identifies nearby sounds and alerts wearers. Jain et al./ASSETS 2020

The team presented their findings Oct. 28 at ASSETS 2020, the ACM conference on computing and accessibility.

Learn more about SoundWatch, the full team and how the smartwatch app evolved from a collection of tablets scattered around a house.

Learn more

UW CREATE leadership at ASSETS 2020

UW CREATE has a large, high-quality presence at ASSETS 2020, the premier annual conference for accessible computing research. Drawing from three departments, University of Washington authors contributed six papers and two posters to this year’s online conference. Three of our papers were nominated for best paper! Seven members also served in conference roles: two on the organizing committee and five on the program committee.

The papers and posters span a variety of topics including input performance evaluation of people with limited mobility, media usage patterns of autistic adults, sound awareness for d/Deaf and hard of hearing people, and autoethnography reports of multiple people with disabilities. Congratulations to the authors and their collaborators!

We look forward to seeing you virtually at ASSETS 2020, which runs October 26 to 28.

A handcarved cane with a spiral design and painted green at the top
An autoethnographer’s daughter’s handcrafted cane, as presented in the paper, “Living disability theory: Reflections on access, research, and design.”
The SoundWatch, as described in the paper: “SoundWatch: Exploring smartwatch-based deep learning approaches to support sound awareness for deaf and hard of hearing users.”

Accepted papers

Input accessibility: A large dataset and summary analysis of age, motor ability and input performance

Leah Findlater, University of Washington
Lotus Zhang, University of Washington

The reliability of Fitts’s law as a movement model for people with and without limited fine motor function

Ather Sharif, University of Washington
Victoria Pao, University of Washington
Katharina Reinecke, University of Washington
Jacob O. Wobbrock, University of Washington

Lessons learned in designing AI for autistic adults: Designing the video calling for autism prototype

Andrew Begel, Microsoft Research
John Tang, Microsoft Research
Sean Andrist, Microsoft Research
Michael Barnett, Microsoft Research
Tony Carbary, Microsoft Research
Piali Choudhury, Microsoft
Edward Cutrell, Microsoft Research
Alberto Fung, University of Houston
Sasa Junuzovic, Microsoft Research
Daniel McDuff, Microsoft Research
Kael Rowan, Microsoft
Shibashankar Sahoo, Umeå Institute of Design
Jennifer Frances Waldern, Microsoft
Jessica Wolk, Microsoft Research
Hui Zheng, George Mason University
Annuska Zolyomi, University of Washington

SoundWatch: Exploring smartwatch-based deep learning approaches to support sound awareness for deaf and hard of hearing users

Dhruv Jain, University of Washington
Hung Ngo, University of Washington
Pratyush Patel, University of Washington
Steven Goodman, University of Washington
Leah Findlater, University of Washington
Jon E. Froehlich, University of Washington

Living disability theory: Reflections on access, research, and design

Megan Hofmann, Carnegie Mellon University
Devva Kasnitz, Society for Disability Studies
Jennifer Mankoff, University of Washington
Cynthia L Bennett, Carnegie Mellon University

Navigating graduate school with a disability

Dhruv Jain, University of Washington
Venkatesh Potluri, University of Washington
Ather Sharif, University of Washington

Accepted posters

HoloSound: Combining speech and sound identification for Deaf or hard of hearing users on a head-mounted display

Ru Guo, University of Washington
Yiru Yang, University of Washington
Johnson Kuang, University of Washington
Xue Bin, University of Washington
Dhruv Jain, University of Washington
Steven Goodman, University of Washington
Leah Findlater, University of Washington
Jon E. Froehlich, University of Washington

#ActuallyAutistic Sense-making on Twitter

Annuska Zolyomi, University of Washington
Ridley Jones, University of Washington
Tomer Kaftan, University of Washington

Organizing Committee roles

Dhruv Jain as Posters & Demonstrations Co-Chair
Cynthia Bennett as Accessibility Co-Chair

Program committee roles

Cynthia Bennett (recent alumna, now at Apple/CMU)
Leah Findlater
Jon Froehlich
Richard Ladner
Anne Ross

AccessComputing shares UW CREATE’s launch and work toward accessibility

AccessComputing | July 28, 2020

AccessComputing highlighted several research projects of UW CREATE faculty. An excerpt:

CREATE’s stated mission is “to make technology accessible and to make the world accessible through technology.” CREATE faculty pursue projects along both of these lines. Prof. [Jacob] Wobbrock was part of a team that helped make touch screens accessible by inventing Slide Rule, the world’s first finger-driven screen reader, in 2007. A research team including Profs. Richard Ladner, James Fogarty, and Wobbrock created GestureCalc, an eyes-free calculator for touch screens.

Prof. Jon Froehlich has created Project Sidewalk to use crowdsourcing and machine learning to gather and present outdoor navigation information, particularly the accessibility of sidewalks. Dr. Anat Caspi has a similar project called AccessMap, which provides personalized automated pedestrian routing.

Prof. Jennifer Mankoff conducts research on consumer-grade fabrication technology, such as low-cost 3D printing, and how this technology can be used to meet do-it-yourself or do-for-others accessibility challenges.

Professor Heather Feldner enables children with disabilities to explore the physical world through creative mobility support in her Go Baby Go project. [Kat Steele’s Open-Orthoses projects work with individuals with disabilities to co-design customized devices, rigorously test the devices, and provide open-source designs that accelerate development.]

For these and many other projects, CREATE faculty are already internationally recognized for their contributions to assistive technology and accessible computing; by bringing them together under one organizational roof, CREATE will enable synergies and foster collaborations that enable faculty and students to become more than the sum of their parts.

Read the full article

Can Project Sidewalk Use Crowdsourcing to Help Seattleites Get Around?

July 23, 2019 | SeattleMet

With the goal of making navigating our streets safer and easier for the mobility impaired, Jon Froehlich’s Project Sidewalk turns mapping sidewalks and improving pedestrian accessibility into a virtual game. To complete missions, users “walk” through city streets via Google Street View, labeling and rating the quality of sidewalks and features that make it easier—or tougher—to get around. They identify curb ramps, or lack thereof, assess their positioning, and point out tripping hazards.

Since Froehlich launched Project Sidewalk in Seattle in April 2019, users have mapped roughly a third of the city’s 2,300 miles of sidewalks, labeling nearly 70,000 features: curb ramps (or the lack thereof), uneven surfaces, and potential obstacles like lamp posts.

Read the full SeattleMet article.

Four CREATE faculty receive Google Research Awards

UW News | March 16, 2020

Four UW CREATE faculty have been named recipients of Google Faculty Research Awards. The grants, among 150 that Google recently announced, support world-class technical research in computer science, engineering and related fields. Each award provides funding to support one graduate student for a year.

The recipients are Jennifer Mankoff, James Fogarty and Jon Froehlich of the Paul G. Allen School of Computer Science & Engineering and Leah Findlater of the Department of Human Centered Design & Engineering.

The goal of the awards is “to identify and strengthen long-term collaborative relationships with faculty working on problems that will impact how future generations use technology,” according to Google.