
Brain-Computer Interface Controls Gait Using VR Avatar

Author: University of Houston
Published: 2017/08/23 - Updated: 2026/01/24
Publication Details: Peer-Reviewed, Research, Study, Analysis
Category Topic: Electronics - Software


Synopsis: This peer-reviewed research, published in Scientific Reports and funded by the National Institute of Neurological Disorders and Stroke, is the first human study to demonstrate that brain-computer interfaces combined with virtual reality avatars can enhance cortical involvement during walking. The work is particularly valuable for rehabilitation professionals and patients because it shows measurable activity changes in brain regions responsible for motor learning and error monitoring, specifically the posterior parietal cortex and anterior cingulate cortex. Unlike previous animal studies, this human trial used non-invasive EEG technology to decode walking intentions, making the therapy accessible without surgical intervention. People with gait impairments from stroke, certain spinal cord injuries, or other neurological conditions may benefit from this patient-centered approach, which requires active engagement and places users at the center of their own therapy rather than relying on passive treatment methods - Disabled World (DW).

Introduction

Brain-Computer Interface Augmented With Virtual Walking Avatar Can Control Gait

Researchers from the University of Houston have shown for the first time that a brain-computer interface augmented with a virtual walking avatar can control gait, suggesting the protocol may help patients recover the ability to walk after stroke, some spinal cord injuries, and certain other gait disabilities.

Researchers said the work, done at the University's Noninvasive Brain-Machine Interface System Laboratory, is the first to demonstrate that a brain-computer interface can promote and enhance cortical involvement during walking. The study, funded by the National Institute of Neurological Disorders and Stroke, was published in Scientific Reports.

Main Content

Jose Luis Contreras-Vidal, Cullen professor of electrical and computer engineering at UH and senior author of the paper, said the data will be made available to other researchers. While similar work has been done in non-human primates, this is the first such study to involve humans, he said. Contreras-Vidal is also site director of the BRAIN Center (Building Reliable Advances and Innovation in Neurotechnology), a National Science Foundation Industry/University Cooperative Research Center.

Contreras-Vidal and researchers in his lab use non-invasive brain monitoring to determine which parts of the brain are involved in an activity, then use that information to create an algorithm, or brain-machine interface, that can translate the subject's intentions into action.
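To make the idea concrete, the sketch below shows in Python what such a decoding pipeline can look like: synthetic 64-channel EEG is band-pass filtered into a low-frequency band, and a linear (ridge) decoder is fitted to a joint-angle trajectory. The sampling rate, frequency band, model choice, and data here are illustrative assumptions, not the lab's actual system.

```python
# Minimal sketch of an EEG -> gait-kinematics decoder (illustrative only).
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

fs = 100          # sampling rate in Hz (assumed for illustration)
n_channels = 64   # matches the study's 64-channel EEG headset
n_samples = 6000  # one minute of synthetic data

rng = np.random.default_rng(0)
eeg = rng.standard_normal((n_samples, n_channels))   # placeholder EEG, not real data
t = np.arange(n_samples) / fs
knee_angle = np.sin(2 * np.pi * 1.0 * t)             # ~1 Hz synthetic gait cycle

# Band-pass the EEG into a low-frequency band often used for kinematics decoding.
b, a = butter(4, [0.1, 3.0], btype="bandpass", fs=fs)
eeg_filt = filtfilt(b, a, eeg, axis=0)

# Fit a linear (ridge) map from filtered EEG channels to the joint angle.
X_train, X_test, y_train, y_test = train_test_split(
    eeg_filt, knee_angle, test_size=0.25, shuffle=False)
decoder = Ridge(alpha=1.0).fit(X_train, y_train)

# On random synthetic data this score sits near chance; real EEG is what
# would give the same structure genuine predictive signal.
print("held-out R^2:", decoder.score(X_test, y_test))
```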

In addition to Contreras-Vidal, researchers on the project are first author Trieu Phat Luu, a research fellow in neural engineering at UH, and Sho Nakagome and Yongtian He, graduate students in the UH Department of Electrical and Computer Engineering.

"Voluntary control of movements is crucial for motor learning and physical rehabilitation," they wrote. "Our results suggest the possible benefits of using a closed-loop EEG-based BCI-VR (brain-computer interface-virtual reality) system in inducing voluntary control of human gait."

Researchers already knew that electroencephalogram (EEG) readings of brain activity can distinguish whether a subject is standing still or walking. What they didn't know was whether a brain-computer interface would be practical for helping to promote the ability to walk, or which parts of the brain are relevant to determining gait.
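A minimal illustration of that first point, telling standing from walking, could look like the toy classifier below. The features, class offset, and model choice are all assumptions made for the example; the study's actual features and classifier are not restated here.

```python
# Hypothetical sketch: classifying standing vs. walking from EEG features.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_trials, n_channels = 200, 64

# Pretend band-power features; walking trials get a small offset so the
# toy classifier has a real difference to find.
features = rng.standard_normal((n_trials, n_channels))
labels = rng.integers(0, 2, n_trials)    # 0 = standing still, 1 = walking
features[labels == 1] += 0.5

clf = LinearDiscriminantAnalysis()
scores = cross_val_score(clf, features, labels, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")
```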

In this case, they collected data from eight healthy subjects, all of whom participated in three trials involving walking on a treadmill while watching an avatar displayed on a monitor. The volunteers were fitted with a 64-channel EEG headset and motion sensors at the hip, knee, and ankle joints.

The avatar was first driven by the motion sensors, allowing its movement to precisely mimic that of the test subject. In later tests, the avatar was controlled by the brain-computer interface, meaning the subject controlled the avatar with his or her brain.
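Conceptually, the system switches between two sources driving the same avatar. The toy sketch below captures that switch; the function names, window size, and placeholder decoder are hypothetical, not the laboratory's code.

```python
# Toy sketch of the two avatar-control modes described above.
import numpy as np

def decode_from_eeg(eeg_window):
    """Stand-in decoder; the real system maps EEG to intended joint motion."""
    return eeg_window.mean(axis=0)[:3]      # placeholder hip/knee/ankle estimate

def drive_avatar(mode, sensor_angles, eeg_window):
    if mode == "sensors":
        return sensor_angles                # avatar precisely mirrors the subject
    return decode_from_eeg(eeg_window)      # avatar follows decoded intent instead

eeg_window = np.random.default_rng(3).standard_normal((50, 64))  # 0.5 s of EEG
sensor_angles = np.array([10.0, 25.0, 5.0])  # hip, knee, ankle angles (degrees)
print("sensor mode:", drive_avatar("sensors", sensor_angles, eeg_window))
print("BCI mode:   ", drive_avatar("bci", sensor_angles, eeg_window))
```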

The avatar perfectly mimicked the subject's movements when relying upon the sensors, but the match was less precise when the brain-computer interface was used.
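One common way to quantify "less precise" is the correlation between the sensor-measured trajectory and the BCI-decoded one; the snippet below shows that calculation on synthetic data. The metric is an assumption chosen for illustration, not necessarily the study's own evaluation measure.

```python
# Sketch: quantifying how closely decoded motion tracks measured motion.
import numpy as np

fs, seconds = 100, 10
t = np.arange(fs * seconds) / fs
measured = np.sin(2 * np.pi * 1.0 * t)                   # sensor-driven trajectory
rng = np.random.default_rng(2)
decoded = measured + 0.5 * rng.standard_normal(t.size)   # noisier BCI estimate

# Pearson correlation between the trajectories; r = 1 would be a perfect match.
r = np.corrcoef(measured, decoded)[0, 1]
print(f"measured vs. decoded gait: r = {r:.2f}")
```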

Contreras-Vidal said that's to be expected, noting that other studies have shown some initial decoding errors as the subject learns to use the interface.

"It's like learning to use a new tool or sport," he said. "You have to understand how the tool works. The brain needs time to learn that."

The researchers reported increased activity in the posterior parietal cortex and the inferior parietal lobe, along with increased involvement of the anterior cingulate cortex, which is involved in motor learning and error monitoring.

The next step is to use the protocol with patients, the subject of He's Ph.D. dissertation.

"The appeal of brain-machine interface is that it places the user at the center of the therapy," Contreras-Vidal said. "They have to be engaged, because they are in control."

Insights, Analysis, and Developments

Editorial Note: The significance of this brain-computer interface research extends beyond its immediate clinical applications into the realm of neuroplasticity and rehabilitation science. What makes this work particularly noteworthy is the demonstration that the brain can adapt to control external systems with practice, much like mastering any new skill. The initial decoding errors observed when subjects first used the interface aren't failures but rather evidence of the brain's learning process in action. As rehabilitation medicine continues to shift toward patient-driven therapies, this technology offers a blueprint for treatments that harness the brain's natural capacity to reorganize and relearn movement patterns. The researchers' decision to make their data publicly available to other scientists may accelerate development of similar therapeutic protocols, potentially helping thousands of people regain mobility that once seemed permanently lost - Disabled World (DW).

Attribution/Source(s): This peer-reviewed publication was selected for publishing by the editors of Disabled World (DW) due to its relevance to the disability community. Originally authored by University of Houston and published on 2017/08/23, this content may have been edited for style, clarity, or brevity.


