An ongoing project based on blueprints for the OpenBCI headset hardware is moving towards the implementation of an open VR+BCI system for neurorehabilitation using NeuroRehabLabVR content. The current 3D-printable headset prototype has been modified to accommodate our lab's custom EEG boards and electronics. We hope to launch this hybrid project (OpenBCI+NeuroRehabVR) within the next few months.
Figure 1: 3D printing the first parts (20% density)
Figures 2–3: The first prototype with most of the parts; top and rear views.
Two papers will be presented at the REHAB 2014 Workshop at PervasiveHealth'14, the 8th International Conference on Pervasive Computing Technologies for Healthcare, which will take place in Oldenburg, Germany, on 20 May 2014.
An Assistive Mobile Platform for Delivering Knowledge of Performance Feedback.
Authors: Davide Neves, Athanasios Vourvopoulos, Mónica Cameirão, Sergi Bermudez i Badia
Upper limb motor deficits caused by stroke have a major impact on a person’s daily activities and independence. One strategy for promoting motor relearning is the delivery of meaningful feedback during rehabilitative training. In this paper we describe the development and first evaluation of a system that combines a portable arm orthosis with a mobile application running on a tablet to provide knowledge of performance to stroke patients during therapy. Here we present preliminary results and discuss the potential of this technology.
Eye Gaze Patterns after Stroke: Correlates of a VR Action Execution and Observation Task.
The existence of a partially shared neural network between action observation and action execution in healthy participants has been demonstrated in a number of studies. However, little research has examined eye movement metrics in this regard in rehabilitation contexts. In this study we approach action observation and action execution by combining a virtual environment with eye tracking technology. Participants were stroke survivors, who performed a simple reach-and-grab and place-and-release task with both their paretic and non-paretic arms. Results showed congruency in gaze metrics between action execution and action observation in the distribution and duration of gaze events. However, significant differences in the total number of fixations, saccades, and smooth pursuit segments suggest different underlying mechanisms for execution and observation. We therefore extend the understanding of gaze metrics across these conditions in people with deficits derived from stroke.
This week, the paper "RehabNet: A Distributed Architecture for Motor and Cognitive Neuro-Rehabilitation" was presented at the 15th International Conference on E-Health Networking, Application and Services (IEEE Healthcom'13) in Lisbon, where it received positive feedback.
Healthcom 2013 is fully sponsored by the IEEE Communications Society and combines elements from both the health and technology domains. The conference proceedings will be published in IEEE Xplore, but a copy of the paper can also be found here.
NeuroRehab Lab had the chance to demonstrate its current work in stroke rehabilitation using interactive technologies at the University of Madeira/M-ITI open day.
The demonstration included part of the technology used for motor and cognitive deficits, such as a gamified version of a cancellation test using hand tracking, running on a Google TV, an Android phone, and, of course, desktop versions for Windows and macOS.
In addition, the RehabNet project architecture was introduced, including the incorporation of Brain-Computer Interfaces (BCIs) for neurofeedback.
Madeira Interactive Technologies Institute (M-ITI) officially inaugurated its new facilities at the Madeira Tecnopolo building. NeuroRehab Lab had the chance to present our ongoing research in cognitive and motor rehabilitation and to demonstrate our virtual rehabilitation tools.
Among others, our interactive rehabilitation tools were demonstrated to the Vice-President of Madeira.