Projects

Project CNRS international collaboration BCI4ALS 2026

Project lead: Laurent Bougrain
Partners: LORIA/NeuroRhythms, Federal University of Rio Grande do Norte/Laboratory of Technological Innovation in Health (Brazil)
The funded one-year CNRS project BCI4ALS aims to evaluate, with a group of 12 ALS patients and medical staff from the Laboratory of Technological Innovation in Health (LAIS) of the Federal University of Rio Grande do Norte (UFRN), 1) a prototype of a low-cost brain-computer interface that we have already developed, which individuals with ALS can use to communicate daily through gaze direction detection and steady-state visual evoked potentials (SSVEP), and then 2) an extended version, which we will develop, using code-VEP (c-VEP).
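For readers unfamiliar with the paradigm, SSVEP interfaces typically identify which flickering target the user is gazing at by correlating the EEG with sinusoidal references at each target's flicker frequency, commonly via canonical correlation analysis (CCA). The sketch below illustrates only that generic detection step; the sampling rate, flicker frequencies, and data are placeholder assumptions, not the prototype's actual parameters.

    import numpy as np
    from sklearn.cross_decomposition import CCA

    FS = 250                               # sampling rate in Hz (assumed)
    STIM_FREQS = [8.0, 10.0, 12.0, 15.0]   # one flicker frequency per target (assumed)

    def cca_score(eeg, freq, n_harmonics=2):
        """Canonical correlation between EEG (channels x samples) and
        sine/cosine references at one stimulation frequency."""
        t = np.arange(eeg.shape[1]) / FS
        ref = np.vstack([f(2 * np.pi * h * freq * t)
                         for h in range(1, n_harmonics + 1)
                         for f in (np.sin, np.cos)])
        x_c, y_c = CCA(n_components=1).fit_transform(eeg.T, ref.T)
        return np.corrcoef(x_c[:, 0], y_c[:, 0])[0, 1]

    def detect_target(eeg):
        """Return the flicker frequency whose references correlate best,
        i.e. the target the user is presumably gazing at."""
        return max(STIM_FREQS, key=lambda f: cca_score(eeg, f))

    # Placeholder trial: 8 channels, 2 s of noise standing in for real EEG
    print(detect_target(np.random.default_rng(0).standard_normal((8, 2 * FS))))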
—————————————————————

Project ANR PRC BCI4IA 2022-2027

Project lead: Claude Meistelman
Partners: Nancy University Hospital, LORIA/NeuroRhythms, Inria/Potioc, Brugmann University Hospital (Belgium)
The funded four-year ANR project BCI4IA aims to design a brain-computer interface for detecting intraoperative awareness during general anesthesia (GA). We observe a combination of markers (relative powers, connectivity, etc.) under propofol, with and without median nerve stimulation to amplify them, and we will design new classification methods by adapting transfer learning in Riemannian geometry to this problem, in which examples of only one class are available, namely rest, since we have no examples of motor intention from a patient under GA.
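To make the one-class setting concrete, here is a minimal sketch of a distance-based novelty detector on EEG covariance matrices with pyriemann. It is illustrative only, not the project's method: the data are synthetic, and the trial/channel counts, covariance estimator, and percentile threshold are all placeholder assumptions.

    import numpy as np
    from pyriemann.estimation import Covariances
    from pyriemann.utils.mean import mean_riemann
    from pyriemann.utils.distance import distance_riemann

    rng = np.random.default_rng(0)

    # Synthetic stand-in for rest-state EEG: 100 trials, 8 channels, 256 samples
    rest_trials = rng.standard_normal((100, 8, 256))

    # Spatial covariance matrices, the usual features of Riemannian BCI pipelines
    cov_estimator = Covariances(estimator="oas")
    rest_covs = cov_estimator.transform(rest_trials)

    # One-class model: the Riemannian (geometric) mean of the rest class only
    rest_center = mean_riemann(rest_covs)

    # Threshold on the distance to the rest center, e.g. the 95th percentile
    rest_dists = np.array([distance_riemann(c, rest_center) for c in rest_covs])
    threshold = np.percentile(rest_dists, 95)

    def flag_awareness(trial):
        """Flag a trial as possible motor intention (awareness) if its
        covariance lies unusually far from the rest-class center."""
        cov = cov_estimator.transform(trial[None, :, :])[0]
        return distance_riemann(cov, rest_center) > threshold

    print(flag_awareness(rng.standard_normal((8, 256))))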
—————————————————————

Project ANR PRCE Grasp-IT 2020-2025

Project lead: Laurent Bougrain
Partners: NeuroRhythms, Perseus, Hybrid, Camin, OpenEdge, univ. hosp. Rennes, univ. hosp. Toulouse
The funded four-year ANR project GRASP-IT aims to help post-stroke patients recover upper-limb control by improving their generation of kinesthetic motor imagery (KMI), using a tangible and haptic interface within a gamified Brain-Computer Interface (BCI) training environment.
—————————————————————

Project Univ Lorraine/Kyutech (Japan) international collaboration EASHI 2023-2025

Project leads: Laurent Bougrain / Tomohiro Shibata
Partners: LORIA/NeuroRhythms, Kyutech/Department of Human Intelligence Systems (Japan)
The EASHI project focuses on human-robot social interaction (HRSI) and spans the fields of robotics, neuroscience, psychology, and movement science. It aims to study the cognitive states, such as engagement, and the emotional states of humans who interact with a robot in social games based on different hand gestures, with or without physical contact.
—————————————————————

Project Inria Project Lab BCI-LIFT (Brain-Computer Interfaces: Learning, Interaction, Feedback, Training) 2014-2018

Project lead: Maureen Clerc
Inria partners: Athena, Camin, Hybrid, Mjolnir, Neurosys, Potioc
External partners: Inserm Lyon, Université de Rouen
BCI-LIFT is a large-scale, four-year research initiative whose aim is to reach the next generation of non-invasive Brain-Computer Interfaces (BCI): BCIs that are easier to appropriate, more efficient, and suited to a larger number of people.
—————————————————————

Project CNRS PEPS S2IH INS2I 2018: MoveYouMind (Design and evaluation of a visual neurofeedback based on specific corticomotor areas using source localization for enhancing motor imagery)
Project lead: Laurent Bougrain
Partners: Neurosys, Cognitive and Systems Neurosciences (Univ Lorraine/CRAN), Perseus (univ Lorraine)
MOVE YOUR MIND aims at improving the functional recovery protocols of hemiplegic stroke patients by increasing the precision with which the brain areas involved in a kinesthetic motor task of the upper limbs are identified. The brain areas engaged during this rehabilitation task will be detected by source localization methods applied to the signals of a variable-geometry electroencephalographic acquisition system, which will inform and therefore guide the patient (and the nursing staff) during functional rehabilitation by indicating whether the activity she or he produces is in the right motor area. The project aims to design and evaluate a visual neurofeedback based on active cortical areas within an existing brain-computer interface.
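As a rough illustration of the final feedback step only, the sketch below maps the mu-band desynchronization of a single source-localized motor region onto a 0-to-1 visual gauge. Everything upstream (the variable-geometry EEG acquisition and the source localization itself) is omitted, and the sampling rate, frequency band, and data are placeholder assumptions.

    import numpy as np
    from scipy.signal import welch

    FS = 512  # sampling rate in Hz (assumed)

    def mu_band_power(roi_signal, fmin=8.0, fmax=13.0):
        """Mu-band power of the time course of one source-localized motor area."""
        freqs, psd = welch(roi_signal, fs=FS, nperseg=FS)
        band = (freqs >= fmin) & (freqs <= fmax)
        return psd[band].sum() * (freqs[1] - freqs[0])

    def feedback_gauge(task_signal, rest_power):
        """Map the event-related desynchronization (mu-power drop relative
        to rest) in the targeted area onto a 0..1 visual gauge value."""
        erd = (rest_power - mu_band_power(task_signal)) / rest_power
        return float(np.clip(erd, 0.0, 1.0))

    rng = np.random.default_rng(0)
    rest_power = mu_band_power(rng.standard_normal(4 * FS))  # placeholder baseline
    print(feedback_gauge(rng.standard_normal(4 * FS), rest_power))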
—————————————————————

Project CNRS PEPS S2IH INS2I 2018: HHH2HRH (From Human-Human Handshaking to Human-Robot Handshaking)
Project lead: Patrick Hénaff
Partners: Neurosys, Perseus (univ Lorraine), Cerco, Incia

HHH2HRH aims at understanding and modelling the physical and cognitive phenomena involved in a common human gesture that is complex enough from the point of view of the behavioral sciences and neurosciences, as well as of robotics. We propose to study handshaking between humans (Human/Human HandShaking, HHH), which is both a common joint action and a multimodal form of communication between humans, and to reproduce it in its entirety with a robot that shakes a human hand (Human/Robot HandShaking, HRH).