Palm-sized Robot Navigates Touchscreens for Visually Impaired Users
Having to tap-tap-tap away to order a sandwich or check bags at a touchscreen kiosk can be annoying. For those who are blind or visually impaired, it can be excruciating.
Soon, instead of friends, family or even strangers bridging this technological chasm, a palm-sized portable robot created by a team of University of Maryland researchers could help.
The device, called Toucha11y, attaches itself to a touchscreen and communicates with the user's phone, giving the user access to familiar features like VoiceOver and zoom to read the touchscreen's options; the robot then presses the appropriate on-screen buttons on command.
The UMD team consists of third-year computer science doctoral student Jiasheng Li; fourth-year computer science doctoral student Zeyu Yan; their adviser and Assistant Professor of Computer Science Huaishu Peng; alumnus Arush Shah, who graduated in 2021 with a B.S. in computer engineering; and Jonathan Lazar, a professor in the College of Information Studies.
Federal requirements that public kiosks include accessibility features haven't solved the problem for many users, explains Peng, who also has an appointment in the University of Maryland Institute for Advanced Computer Studies. Even kiosks that are technically compliant lack standardized features, blind testers have told Peng.
“We thought, if there’s a universal way for [visually impaired] people to use these machines, we could lower the learning curve a bit,” he says.
The current prototype looks like a tiny measuring tape fitted with a camera, a computer and suction cups. It takes photos to orient itself on the screen, then extends its "tape," tipped with a touch probe, to press buttons. The research team tested it on screens up to 37 inches wide and on six different interfaces; users completed tasks such as ordering a bubble tea, including specifying ice and sugar levels, in under 90 seconds.
In future iterations, the team plans to shrink Toucha11y (a blend of "touch" and "accessibility," with the numeral "11" standing in for the 11 letters between the "a" and "y" in "accessibility") to make it easier to carry in a purse or pocket, as well as add real-time menu-scanning capabilities.
Earlier this year, the team won an Honorable Mention Award for their paper on Toucha11y at the Association for Computing Machinery's Conference on Human Factors in Computing Systems (CHI) in Hamburg, Germany. The conference is widely regarded as the largest and most recognized in human-computer interaction, bringing together leading researchers, practitioners and experts from around the globe.
"The fact that Jiasheng's work received an Honorable Mention Award at a conference that's known for its rigorous review process speaks volumes about the innovation and impact of his research," says Peng, adding that it was "awe-inspiring to see how Toucha11y stood out among a sea of exceptional papers."
Li agrees that the recognition not only validates the project, but also highlights a growing acknowledgment of the importance of accessibility technologies among a wider audience.
“This award motivates me to continue exploring innovative solutions that promote inclusivity and equality, and create novel tools to help those who are often overlooked,” he says.
Toucha11y is part of the Maryland Initiative for Digital Accessibility, launched in July 2023, which brings together researchers across UMD, disability rights groups and tech companies to make technology accessible for all.
Yet Toucha11y isn't the team's first foray into digital accessibility. Prior work by Peng, Li and Yan includes TangibleGrid, a physical baseboard with brackets that lets a visually impaired person design a website layout in real time with their hands.
—This article was adapted from news releases published by Maryland Today and the Department of Computer Science.