iPad Reading Software Refined for Blind and Low Vision Users
Author: Texas A&M University
Published: 2014/12/08 - Updated: 2026/04/22
Publication Type: Research, Study, Analysis
Category Topic: Visual Aids - Related Publications
Contents: Synopsis - Introduction - Main - Insights, Updates
Synopsis: This research, produced by Francis Quek of Texas A&M University and Yasmine N. El-Glaly of Port Said University, presents two key software refinements that significantly improve touch-based digital reading on iPads for people who are blind or have low vision. The work, funded by a $302,000 National Science Foundation grant, addresses long-standing problems with existing reading apps - such as users losing their place on virtual text lines or words being read at a fixed speed regardless of finger movement. The resulting software predicts finger direction, keeps words in sync with reading speed, and alerts users when they stray from a line of text. The paper, "Digital Reading Support for The Blind by Multimodal Interaction," earned an outstanding paper award at the 2014 International Conference for Multimodal Interaction, a highly selective conference with an 18% oral presentation acceptance rate - Disabled World (DW).
- Topic Definition: Touch-Based Assistive Reading Software
Touch-based assistive reading software is a category of application designed for touchscreen devices - most commonly tablets - that enables blind and low vision users to read digital text by dragging a finger across the screen, with the device audibly rendering the corresponding words in real time. Unlike standard screen readers that operate through gestures and menus, these systems create a direct, spatial relationship between a user's physical touch and the text being read, allowing for a more intuitive and continuous reading experience. Effective implementations address common usability challenges such as line-tracking accuracy, reading speed synchronization, and mode-switching confusion, all of which are critical to making digital reading both accessible and practical for people with visual disabilities.
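The core lookup such software performs is mapping a finger's horizontal position to the word beneath it on a virtual line. A minimal sketch of that idea follows; the function name, the proportional fixed-layout assumption, and all parameters are illustrative assumptions, not the actual implementation described in the paper.

```python
# Hypothetical sketch: mapping a horizontal touch position to the word
# beneath the finger, the basic lookup in touch-based reading software.
# The proportional-width layout and all names are illustrative guesses.

def word_at_touch(line_words, line_width, touch_x):
    """Return the word whose span on the virtual line contains touch_x.

    Assumes each word occupies a horizontal span proportional to its
    length (plus one trailing space) across a line of line_width pixels.
    """
    total_chars = sum(len(w) + 1 for w in line_words)  # +1 for the space
    x = 0.0
    for word in line_words:
        span = (len(word) + 1) / total_chars * line_width
        if x <= touch_x < x + span:
            return word
        x += span
    return line_words[-1]  # clamp to the last word at the line's edge

print(word_at_touch(["The", "quick", "brown", "fox"], 400.0, 10.0))
print(word_at_touch(["The", "quick", "brown", "fox"], 400.0, 390.0))
```

A real implementation would query the rendered text layout for glyph positions rather than assume proportional widths, but the touch-to-word mapping is the same in spirit.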
Introduction
Refined Software for Readers with Vision Disabilities
Francis Quek, Texas A&M professor of visualization, and Yasmine N. El-Glaly, assistant professor of computer science at Port Said University in Egypt, have developed two key refinements that improve the experience of people who are blind or have low vision and use iPads as touch-based reading devices.
A research paper detailing the refinements, which improve how accurately the software responds to a user's touch, earned an outstanding paper award at the Nov. 12-16, 2014 International Conference for Multimodal Interaction at Bogazici University in Istanbul, Turkey.
Main Content
In their paper, "Digital Reading Support for The Blind by Multimodal Interaction," Quek and El-Glaly describe how blind or visually impaired readers drag their fingertips along virtual lines of text on the tablet's screen or an overlay to hear the tablet "speak" the text of a book or article.
"Existing applications force the user to be slow," said Quek, director of the Texas A&M Embodied Interaction Laboratory. "If the user runs her finger too quickly on the virtual lines of text or changes the software's access mode to read bigger chunks of words, she can easily lose her place or wander between virtual lines of text without realizing it," he said.
Even if existing systems are adjusted to render words faster, he said, interaction problems remain because there is often a poor relationship between the speed of a user's finger on the tablet and the speed of the words pronounced by the device.
To address these issues, Quek and El-Glaly developed software for an iPad that predicts the direction of a user's finger on a tablet overlay, audibly rendering words in sequence then alerting the reader if she strays from the reading line. Their work was supported with a $302,000 grant from the National Science Foundation.
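The two behaviors described above can be illustrated with a brief sketch: estimating the finger's direction of travel from recent touch samples, and flagging when the touch point drifts outside the current line's vertical band. The sample format, thresholds, and function names here are assumptions for illustration, not the authors' actual algorithm.

```python
# Hypothetical sketch: predicting finger direction from recent (x, y)
# touch samples, and detecting drift away from the current text line.
# Thresholds and data formats are illustrative assumptions.

def predict_direction(samples):
    """Estimate average (dx, dy) motion per sample from recent touches."""
    if len(samples) < 2:
        return (0.0, 0.0)
    (x0, y0), (x1, y1) = samples[0], samples[-1]
    n = len(samples) - 1
    return ((x1 - x0) / n, (y1 - y0) / n)

def off_line(touch_y, line_center_y, line_height, tolerance=0.6):
    """True once the finger strays outside the line's vertical band,
    at which point the software would audibly alert the reader."""
    return abs(touch_y - line_center_y) > tolerance * line_height

samples = [(10, 100), (20, 101), (30, 103), (40, 106)]
print(predict_direction(samples))      # mostly rightward motion
print(off_line(106, 100, 20))          # still within the line band
print(off_line(130, 100, 20))          # drifted well below the line
```

In practice the predicted direction could also be used to pre-fetch the next words to speak, keeping audio output ahead of the finger.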
Words in the new software are also rendered in sync with the speed of the user's finger across the tablet screen, not at a default speed set by the application.
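One simple way to express this synchronization is a mapping from finger speed to a speech-rate multiplier, clamped to a comfortable range. The reference speed, base rate, and bounds below are illustrative guesses, not values from the paper.

```python
# Hypothetical sketch: scaling the speech rate with the finger's speed
# rather than using a fixed default rate. All constants are assumptions.

def speech_rate(finger_speed, reference_speed=100.0, base_rate=1.0,
                min_rate=0.5, max_rate=3.0):
    """Map finger speed (pixels/second) to a speech-rate multiplier.

    At reference_speed the voice plays at base_rate; faster dragging
    speeds speech up proportionally, clamped to [min_rate, max_rate].
    """
    rate = base_rate * (finger_speed / reference_speed)
    return max(min_rate, min(max_rate, rate))

print(speech_rate(100.0))   # dragging at the reference pace
print(speech_rate(50.0))    # slow drag: slower speech, floored
print(speech_rate(1000.0))  # very fast drag: speech rate capped
```

Clamping matters here: synthesized speech becomes unintelligible beyond a certain rate, so a ceiling keeps fast finger movement from outrunning comprehension.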
In the future, Quek foresees adding note-taking and highlighting capabilities to the software. He will continue to develop the overlay and application in the Embodied Interaction Laboratory (TEIL) with Sharon Lyn Chu, the lab's associate director, and Akash Sahoo, a graduate computer science student.
The Istanbul conference where Quek and El-Glaly presented their paper is a global forum for multidisciplinary research on human-human and human-computer interaction, interfaces and system development.
"This is a very selective conference with an 18% acceptance rate for oral presentations," said Quek.
The conference focused on the component technologies, theoretical and empirical foundations, and combined multimodal processing techniques that define the fields of multimodal interaction analysis, interface design, and system development.