{"id":313,"date":"2018-04-19T16:45:30","date_gmt":"2018-04-19T14:45:30","guid":{"rendered":"http:\/\/members.loria.fr\/LBougrain\/?page_id=313"},"modified":"2026-04-24T10:07:26","modified_gmt":"2026-04-24T08:07:26","slug":"projects","status":"publish","type":"page","link":"https:\/\/members.loria.fr\/LBougrain\/projects\/","title":{"rendered":"Projects"},"content":{"rendered":"<p class=\"p1\"><strong><span class=\"s1\">Project CNRS international collaboration BCI4ALS<\/span><\/strong>\u00a0<span class=\"s1\">2026<\/span><strong><span class=\"s1\"><br \/>\n<\/span><\/strong><\/p>\n<div>Project lead: Laurent Bougrain<\/div>\n<div>Partners: LORIA\/NeuroRhythms, Federal University of Rio Grande do Norte\/Laboratory of Technological Innovation in Health (Brasil)<\/div>\n<div><\/div>\n<div>The funded 1-year CNRS project BCI4ALS aims to evaluate with a group of 12 ALS patients and medical staff from the Laboratory of Technological Innovation in Health (LAIS) of the Federal University of Rio Grande do Norte (UFRN) 1) a prototype of a low-cost brain-computer interface that individuals with ALS can use to communicate daily, we already developed, to enable communication based on gaze direction detection and stationery visual evoked potentials (VEP) and then 2) an extended version using code-VEP we will develop.<\/div>\n<div><\/div>\n<div>&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;<\/div>\n<p class=\"p1\"><strong><span class=\"s1\">Project <a href=\"https:\/\/anr.fr\/Project-ANR-22-CE19-0016\">ANR PRC BCI4IA<\/a><\/span><\/strong> <span class=\"s1\">2022-2027<\/span><strong><span class=\"s1\"><br \/>\n<\/span><\/strong><\/p>\n<div>Project lead: Claude Meistelman<\/div>\n<div>Partners: Nancy university hospital, LORIA\/NeuroRhythms, Inria\/Potioc, Brugmann university (Belgium)<\/div>\n<div><\/div>\n<div>The funded 4-years ANR project BCI4IA aims to design a brain-computer 
interface for detecting intraoperative awareness during general anesthesia (GA). We study a combination of markers (relative powers, connectivity, &#8230;) under propofol, with and without median nerve stimulation to amplify them, and we will design new classification methods by adapting transfer learning in Riemannian geometry to this problem, in which only examples of one class, rest, are available, since we have no examples of the patient's motor intention under GA.<\/div>\n<div><\/div>\n<div>&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;<\/div>\n<p class=\"p1\"><strong><span class=\"s1\">Project <a href=\"http:\/\/graspit.loria.fr\">ANR PRCE Grasp-IT<\/a><\/span><\/strong> <span class=\"s1\">2020-2025<\/span><strong><span class=\"s1\"><br \/>\n<\/span><\/strong><\/p>\n<div>Project lead: Laurent Bougrain<\/div>\n<div>Partners: NeuroRhythms, Perseus, Hybrid, Camin, OpenEdge, univ. hosp. Rennes, univ. hosp. 
Toulouse<\/div>\n<div><\/div>\n<div>The funded 4-year ANR project GRASP-IT aims to help post-stroke patients recover upper-limb control by improving their kinesthetic motor imagery (KMI) generation, using a tangible and haptic interface within a gamified Brain-Computer Interface (BCI) training environment.<\/div>\n<div><\/div>\n<div>&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;<\/div>\n<div>\n<p class=\"p1\"><strong><span class=\"s1\">Project <a href=\"https:\/\/project.inria.fr\/eashri\/\">Univ Lorraine\/Kyutech (Japan) international collaboration EASHI<\/a><\/span><\/strong>\u00a0<span class=\"s1\">2023-2025<\/span><strong><span class=\"s1\"><br \/>\n<\/span><\/strong><\/p>\n<div>Project leads: Laurent Bougrain \/ Tomohiro Shibata<\/div>\n<div>Partners: LORIA\/NeuroRhythms, Kyutech\/Department of Human Intelligence Systems (Japan)<\/div>\n<div><\/div>\n<div>The EASHI project focuses on human-robot social interaction (HRSI) and concerns the fields of robotics, neuroscience, psychology, and movement science. 
It aims to study cognitive states, such as engagement, and the emotional states of humans who interact with a robot in social games based on different hand gestures, with or without physical contact.<\/div>\n<div><\/div>\n<div>&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;<\/div>\n<\/div>\n<div>\n<p class=\"p1\"><strong><span class=\"s1\">Project <a href=\"http:\/\/bci-lift.inria.fr\">Inria Project Lab BCI-LIFT<\/a><\/span><\/strong> <span class=\"s1\">(<\/span><span class=\"s1\">Brain-Computer Interfaces: Learning, Interaction, Feedback, Training) 2014-2018<\/span><strong><span class=\"s1\"><br \/>\n<\/span><\/strong><\/p>\n<div>Project lead: Maureen Clerc<\/div>\n<div>Inria partners: Athena, Camin, Hybrid, Mjolnir, Neurosys, Potioc<\/div>\n<div>External partners: Inserm Lyon, Universit\u00e9 de Rouen<\/div>\n<div><\/div>\n<div>BCI-LIFT is a large-scale 4-year research initiative whose aim is to develop the next generation of non-invasive Brain-Computer Interfaces (BCIs), more specifically BCIs that are easier to learn, more efficient, and suited to a larger number of people.<\/div>\n<div><\/div>\n<div>&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;<\/div>\n<\/div>\n<div><strong><span class=\"s1\">Project CNRS PEPS <\/span><\/strong><span class=\"s1\">S2IH INS2I 2018: <\/span><strong><span class=\"s1\">MoveYouMind<\/span><\/strong> <span class=\"s1\">(Design and evaluation of a visual neurofeedback based on specific corticomotor areas using source localization for enhancing motor imagery)<br \/>\n<\/span><\/div>\n<div><\/div>\n<div>\n<div>Project lead: Laurent Bougrain<\/div>\n<div>Partners: Neurosys, Cognitive and Systems 
Neurosciences (Univ Lorraine\/CRAN), Perseus (univ Lorraine)<\/div>\n<div><\/div>\n<div>MOVE YOUR MIND aims to improve functional recovery protocols for hemiplegic stroke patients by increasing the precision with which the brain areas involved in a kinesthetic motor task of the upper limbs are identified. The brain areas engaged during this rehabilitation task will be detected by specific source-localization methods applied to signals obtained from an electroencephalographic acquisition system with variable geometry. This information will guide the patient (and the nursing staff) during functional rehabilitation by indicating whether the activity she\/he produces lies in the right motor area. The project aims to design and evaluate a visual neurofeedback based on active cortical areas within an existing brain-computer interface.<\/div>\n<\/div>\n<div>&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;-<\/div>\n<div>\n<div><strong><span class=\"s1\">Project CNRS PEPS <\/span><\/strong><span class=\"s1\">S2IH INS2I <\/span><span class=\"s1\">2018: <strong>HHH2HRH<\/strong> (From Human-Human Handshaking to Human-Robot Handshaking)<br \/>\n<\/span><\/div>\n<div>\n<div><\/div>\n<div>Project lead: Patrick H\u00e9naff<\/div>\n<div>Partners: Neurosys, Perseus (univ Lorraine), Cerco, Incia<\/div>\n<\/div>\n<\/div>\n<div>\n<div class=\"page\" title=\"Page 1\">\n<div class=\"layoutArea\">\n<div class=\"column\">\n<p><span class=\"s1\">HHH2HRH<\/span> aims at understanding and modelling the physical and cognitive phenomena involved in a common human gesture that is complex enough from the point of view of the behavioral sciences and neurosciences, as well as robotics. 
We propose to study handshaking between humans (Human\/Human HandShaking, HHH), which is both a common joint action and a form of multimodal communication between humans, and to reproduce it in its entirety with a robot that shakes a human hand (Human\/Robot HandShaking, HRH).<\/p>\n<\/div>\n<\/div>\n<\/div>\n<\/div>\n","protected":false},"excerpt":{"rendered":"<p class=\"p1\">Project CNRS international collaboration BCI4ALS\u00a02026\n<\/p>\n<p>Project lead: Laurent Bougrain<br \/>\nPartners: LORIA\/NeuroRhythms, Federal University of Rio Grande do Norte\/Laboratory of Technological Innovation in Health (Brazil)<\/p>\n<p>The funded 1-year CNRS project BCI4ALS aims to evaluate, with a group of 12 ALS patients and medical staff from the Laboratory of Technological Innovation in Health (LAIS) of the Federal University of Rio Grande do Norte (UFRN), 1) a prototype we have already developed of a low-cost brain-computer interface that individuals with ALS can use for daily communication, based on gaze-direction detection and steady-state visual evoked potentials (SSVEP), and then 2) an extended version, based on code-VEP, that we will develop. 
<\/p>\n","protected":false},"author":136,"featured_media":0,"parent":0,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"","meta":{"footnotes":""},"class_list":["post-313","page","type-page","status-publish","hentry"],"_links":{"self":[{"href":"https:\/\/members.loria.fr\/LBougrain\/wp-json\/wp\/v2\/pages\/313","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/members.loria.fr\/LBougrain\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/members.loria.fr\/LBougrain\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/members.loria.fr\/LBougrain\/wp-json\/wp\/v2\/users\/136"}],"replies":[{"embeddable":true,"href":"https:\/\/members.loria.fr\/LBougrain\/wp-json\/wp\/v2\/comments?post=313"}],"version-history":[{"count":14,"href":"https:\/\/members.loria.fr\/LBougrain\/wp-json\/wp\/v2\/pages\/313\/revisions"}],"predecessor-version":[{"id":448,"href":"https:\/\/members.loria.fr\/LBougrain\/wp-json\/wp\/v2\/pages\/313\/revisions\/448"}],"wp:attachment":[{"href":"https:\/\/members.loria.fr\/LBougrain\/wp-json\/wp\/v2\/media?parent=313"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}