By Ben Coxworth
October 27, 2023
BrushLens is designed to assist people who would otherwise be unable to utilize touchscreen interfaces
Chen Liang, doctoral student, Computer Science and Engineering
For people who are blind or lack fine control of their fingers, touchscreen interfaces such as those used on self-serve kiosks can be virtually impossible to operate. The experimental new BrushLens device, however, utilizes the user’s smartphone to get the job done.
Currently in functional prototype form, BrushLens is being developed at the University of Michigan by a team led by Asst. Prof. Anhong Guo and Assoc. Prof. Alanson Sample.
Essentially a high-tech smartphone case, it leaves the phone’s screen visible and accessible on top, and has a window for the phone’s rear-facing camera to peek through on the bottom. A ring of “autoclickers,” which alter the capacitance of a touchscreen to simulate a finger touch, surrounds that window.
The underside of the BrushLens device, with the autoclickers surrounding the phone camera window – another version uses mechanical pushbuttons instead of autoclickers
Chen Liang, doctoral student, Computer Science and Engineering
As a BrushLens-equipped phone is moved across the surface of a large touchscreen, the phone is able to read the displayed text via its camera. An accompanying app speaks out the words as the phone passes over them.
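The article doesn't detail BrushLens's recognition pipeline, but the read-aloud behavior it describes can be roughly illustrated with off-the-shelf components. The Python sketch below is an assumption-laden approximation, not the team's implementation: it uses OpenCV for camera frames, Tesseract for text recognition and pyttsx3 for speech, and simply speaks whatever text is currently in view.

```python
# Rough illustration of a "read what the camera sees and speak it" loop.
# Not the BrushLens implementation -- just a sketch using common libraries
# (OpenCV for camera frames, Tesseract for OCR, pyttsx3 for text-to-speech).

import cv2               # camera capture
import pytesseract       # OCR (requires the Tesseract binary to be installed)
import pyttsx3           # offline text-to-speech

speech = pyttsx3.init()
camera = cv2.VideoCapture(0)   # stands in for the phone's rear-facing camera
last_spoken = ""

while True:
    ok, frame = camera.read()
    if not ok:
        break

    # Recognize whatever text is currently under the camera window.
    text = pytesseract.image_to_string(frame).strip()

    # Speak it only when it changes, so words aren't repeated while the
    # device sits over the same part of the kiosk screen.
    if text and text != last_spoken:
        speech.say(text)
        speech.runAndWait()
        last_spoken = text
```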
When the user wishes to click on any of the displayed words, they can either tap a large, easy-to-hit button on their phone’s app screen, or use the phone’s existing screen reader function (in which selections are made via simple swipes or other gestures). Once the system is developed further, the user could also simply issue a voice command.
Whatever the method, BrushLens responds by activating the relevant autoclicker(s), causing a touch to be registered on that part of the underlying touchscreen. Additionally, once the layout of the display on that screen has been mapped, the app can verbally guide the user across it.
It does so by dividing the screen into a grid, and using the phone’s IMU (inertial measurement unit) to track the phone’s position within that grid. By comparing the grid location of the desired onscreen button with the current grid location of the phone, the app is able to tell the user which way to move their phone, right up until it reaches that button.
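The article doesn't spell out the guidance logic, but the grid comparison it describes can be sketched in a few lines. The following Python snippet is a hypothetical illustration: it assumes the kiosk screen has already been divided into (column, row) cells and that the phone's current cell is estimated elsewhere, for example from the IMU tracking mentioned above.

```python
# Hypothetical sketch of the grid-based guidance step described above.
# Cells are (column, row) pairs, with rows assumed to increase downward.
# Estimating the phone's current cell (e.g. from the IMU) is out of scope here.

def guidance_cue(phone_cell, target_cell):
    """Return a spoken-style cue telling the user which way to move the phone."""
    col, row = phone_cell
    target_col, target_row = target_cell

    if (col, row) == (target_col, target_row):
        return "You are on the button. Tap to click."

    cues = []
    if target_col > col:
        cues.append("move right")
    elif target_col < col:
        cues.append("move left")
    if target_row > row:
        cues.append("move down")
    elif target_row < row:
        cues.append("move up")
    return " and ".join(cues)

# Example: phone is in cell (1, 3), the desired onscreen button is in cell (4, 1).
print(guidance_cue((1, 3), (4, 1)))   # -> "move right and move up"
```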
The BrushLens app guides users across the display by dividing it into a grid
Chen Liang, doctoral student, Computer Science and Engineering
The technology has been tested on 10 volunteers, four of whom were afflicted with tremors or spasms, and six of whom were visually impaired. In all cases, after an initial learning curve, the device improved their ability to use touchscreen displays.
The prototype cost less than US$50 to make, and a refined commercial version should cost considerably less. Among other cost- and size-cutting measures, the final version could draw power from the phone’s battery and use the phone to perform all of the data processing.
“So many technologies around us require some assumptions about users’ abilities, but seemingly intuitive interactions can actually be challenging for people,” said doctoral student Chen Liang, first author of a paper on the study. “People have to be able to operate these inaccessible touch screens in the world. Our goal is to make that technology accessible to everyone.”
You can see two versions of the BrushLens device in use in the video below.
[UIST2023] BrushLens: Hardware Interaction Proxies for Accessible Touchscreen Interface Actuation
Source: University of Michigan