The Virtual Reality Facility at Campus Biotech Geneva is part of the Human Neuroscience Platform, and provides researchers with state-of-the-art equipment and expertise in the field of immersive interaction and motion analysis in virtual reality for experimental research and clinical applications (e.g. cognitive and affective assessment, cognitive and behavioral therapy, neurological rehabilitation, gait and upper limb neuro-prostheses).
Job description
The project is proposed under the scientific supervision of Dr. Ferran Galán (UniGE/Department of Basic Neuroscience, https://www.unige.ch/medecine/neuf/en/), who is working on the development of interfaces and robotic prostheses for paralyzed patients.
It is estimated that there are about 20 million people worldwide with impaired upper-limb function due to spinal cord injury and stroke alone. Following Hebb's postulate (neurons that fire together wire together), current neurorehabilitation interventions aim to restore sensorimotor upper-limb function by inducing neural plasticity through sensory feedback contingent on repetitive efferent activity. However, in the absence of residual motor output, contingent sensory feedback has only been established through exteroceptive brain stimulation or interoceptive brain-computer interface controlled peripheral stimulation, both of which elicit a very limited repertoire of efferent activity. There is therefore a clear need for new neurorehabilitation approaches capable of eliciting rich upper-limb sensorimotor activity in the absence of residual motor output.
Through evolution, upper-limb and orofacial sensorimotor pathways have closely interacted to coordinate movements with high ethological value: self-feeding first, and combined non-verbal and verbal communication later. Sensorimotor mappings in non-human primates, human children and adults have revealed cortical representations of such orofacial/upper-limb synergies in the precentral gyrus. In addition, transcranial magnetic stimulation (TMS) studies in humans have demonstrated that teeth clenching, non-vocal orofacial movements and speech increase upper-limb motor excitability, providing evidence for cross-sensorimotor interactions between orofacial and upper-limb pathways. Importantly, orofacial sensorimotor pathways coordinate highly dexterous behaviors such as expressing emotions, ingesting food and producing speech, and they are innervated by cranial nerves, which are typically preserved after high cervical spinal cord injury. Altogether, these observations motivate the search for principled interventions interfacing with cross-sensorimotor orofacial/upper-limb pathways, which could promote functional upper-limb recovery after neurological injury.
The proposed approach is an innovative neurotechnological platform, which will translate orofacial sensorimotor activity into upper-limb exoskeleton movements that will enable the performance of 3D pointing and reaching neurorehabilitation tasks in immersive mixed reality (MR) environments. This MixMotion system will integrate the existing xMotion orofacial hands-free interface (UNIGE) and the ALEx upper-limb exoskeleton (EPFL/Wyss) with controlled and naturalistic 3D Augmented Reality environments to convey upper-limb movement agency and sense of embodiment through multimodal cross-sensorimotor integration (see summary slide).
The MSc intern will work at the interface between the FCBG Virtual Reality, Robotics, Haptics and Cognetics Facilities, and will develop 3D pointing and reaching tasks in Augmented Reality (AR), supporting MixMotion’s integration and performing pilot tests.
Milestone 1: Development of first 3D pointing and reaching AR task.
Milestone 2: Integration of xMotion interface and ALEx robot.
Milestone 3: Iterative pilot tests evaluating 3D pointing performance metrics (e.g. % correct trials, trial completion time, Fitts' law parameters, throughput).
Milestone 4: Final report.
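To illustrate the pointing metrics named in Milestone 3, the sketch below computes the Shannon formulation of the Fitts' law index of difficulty and per-trial throughput. It is only a minimal illustration: the analysis language (Python), the function names, and the example trial values are assumptions, not part of the project specification.

```python
import math

def index_of_difficulty(distance, width):
    # Shannon formulation of the Fitts' law index of difficulty, in bits:
    # ID = log2(D / W + 1), with target distance D and target width W.
    return math.log2(distance / width + 1)

def throughput(distance, width, movement_time):
    # Throughput in bits/s: index of difficulty divided by trial completion time.
    return index_of_difficulty(distance, width) / movement_time

# Hypothetical pointing trials: (target distance [m], target width [m], completion time [s]).
trials = [(0.30, 0.05, 0.9), (0.45, 0.03, 1.4), (0.20, 0.08, 0.6)]
tps = [throughput(d, w, t) for d, w, t in trials]
mean_tp = sum(tps) / len(tps)  # summary throughput across trials
```

Under Fitts' law, trial completion time is expected to grow linearly with the index of difficulty (MT = a + b·ID), so fitting a and b across the iterative pilot tests provides the "Fitts' law parameters" referenced in the milestone.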
We are looking for excellent candidates with a strong engineering background and an interest or initial training in neuroscience/bioengineering. Prior experience in computer graphics, virtual reality or robotics is recommended. The project will involve software development (C++/C#, Unity3D/Unreal engines, 3D computer graphics) and the conduct of a behavioral experiment involving signal processing and analysis.
The internship is for MSc-level students carrying out their 6-month final research project in 2019. The position is full-time at the FCBG at Campus Biotech.