{"id":313,"date":"2018-04-19T16:45:30","date_gmt":"2018-04-19T14:45:30","guid":{"rendered":"http:\/\/members.loria.fr\/LBougrain\/?page_id=313"},"modified":"2021-05-06T22:44:42","modified_gmt":"2021-05-06T20:44:42","slug":"projects","status":"publish","type":"page","link":"https:\/\/members.loria.fr\/LBougrain\/projects\/","title":{"rendered":"Projects"},"content":{"rendered":"<p class=\"p1\"><strong><span class=\"s1\">Project <a href=\"http:\/\/graspit.loria.fr\">ANR PRCE Grasp-IT<\/a><\/span><\/strong> <span class=\"s1\">2020-2024<\/span><strong><span class=\"s1\"><br \/>\n<\/span><\/strong><\/p>\n<div>Project lead: Laurent Bougrain<\/div>\n<div>Partners: Perseus, Hybrid, Camin, OpenEdge, univ. hosp. Rennes, univ. hosp. Toulouse<\/div>\n<div><\/div>\n<div>The funded 4-year ANR project GRASP-IT aims to recover upper limb control by improving the kinesthetic motor imagery (KMI) generation of post-stroke patients, using a tangible and haptic interface within a gamified Brain-Computer Interface (BCI) training environment.<\/div>\n<div><\/div>\n<div>&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;<\/div>\n<div>\n<p class=\"p1\"><strong><span class=\"s1\">Project <a href=\"http:\/\/bci-lift.inria.fr\">Inria Project Lab BCI-LIFT<\/a><\/span><\/strong> <span class=\"s1\">(<\/span><span class=\"s1\">Brain-Computer Interfaces: Learning, Interaction, Feedback, Training) 2014-2018<\/span><strong><span class=\"s1\"><br \/>\n<\/span><\/strong><\/p>\n<div>Project lead: Maureen Clerc<\/div>\n<div>Inria partners: Athena, Camin, Hybrid, Mjolnir, Neurosys, Potioc<\/div>\n<div>External partners: Inserm Lyon, Universit\u00e9 de Rouen<\/div>\n<div><\/div>\n<div>BCI-LIFT is a large-scale 4-year research initiative whose aim is to reach the next generation of non-invasive Brain-Computer Interfaces (BCI), more specifically BCIs that are easier to appropriate, more efficient, and that suit a larger 
number of people.<\/div>\n<div><\/div>\n<div>&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;<\/div>\n<\/div>\n<div><strong><span class=\"s1\">Project CNRS PEPS <\/span><\/strong><span class=\"s1\">S2IH INS2I 2018:<\/span> <strong><span class=\"s1\">MoveYouMind<\/span><\/strong> <span class=\"s1\">(<\/span>Design and evaluation of a visual neurofeedback based on specific corticomotor areas using source localization for enhancing motor imagery<span class=\"s1\">)<br \/>\n<\/span><\/div>\n<div><\/div>\n<div>\n<div>Project lead: Laurent Bougrain<\/div>\n<div>Partners: Neurosys, Cognitive and Systems Neurosciences (Univ Lorraine\/CRAN), Perseus (univ Lorraine)<\/div>\n<div><\/div>\n<div><span id=\"result_box\" class=\"\" lang=\"en\">MOVE YOUR MIND aims at improving the functional recovery protocols of hemiplegic stroke patients by identifying more precisely the brain areas involved in a kinesthetic motor task of the upper limbs. The brain areas engaged during this rehabilitation task will be detected by specific source localization methods applied to the signals obtained from an electroencephalographic acquisition system with variable geometry. This information will guide the patient (and the nursing staff) during functional rehabilitation by indicating whether the activity she\/he produces lies in the correct motor area. 
The project aims to design and evaluate a visual neurofeedback based on active cortical areas within an existing brain-computer interface.<\/span><\/div>\n<\/div>\n<div>&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;<\/div>\n<div>\n<div><strong><span class=\"s1\">Project CNRS PEPS <\/span><\/strong><span class=\"s1\">S2IH INS2I <\/span><span class=\"s1\">2018: <strong>HHH2HRH<\/strong> (From Human-Human Handshaking to Human-Robot Handshaking)<br \/>\n<\/span><\/div>\n<div>\n<div><\/div>\n<div>Project lead: Patrick H\u00e9naff<\/div>\n<div>Partners: Neurosys, Perseus (univ Lorraine), Cerco, Incia<\/div>\n<\/div>\n<\/div>\n<div>\n<div class=\"page\" title=\"Page 1\">\n<div class=\"layoutArea\">\n<div class=\"column\">\n<p><span class=\"s1\">HHH2HRH<\/span> aims at understanding and modelling the physical and cognitive phenomena involved in a common human gesture that is complex enough from the point of view of the behavioral sciences and neurosciences, as well as of robotics. We propose to study handshaking between humans (Human\/Human HandShaking, HHH), which is both a common joint action and a multimodal communication between humans, and to reproduce it in its entirety with a robot that shakes a human hand (Human\/Robot HandShaking, HRH).<\/p>\n<\/div>\n<\/div>\n<\/div>\n<\/div>\n","protected":false},"excerpt":{"rendered":"<p class=\"p1\">Project <a href=\"http:\/\/graspit.loria.fr\">ANR PRCE Grasp-IT<\/a> 2020-2024\n<\/p>\n<p>Project lead: Laurent Bougrain<br \/>\nPartners: Perseus, Hybrid, Camin, OpenEdge, univ. hosp. Rennes, univ. hosp. 
Toulouse<\/p>\n<p>The funded 4-year ANR project GRASP-IT aims to recover upper limb control by improving the kinesthetic motor imagery (KMI) generation of post-stroke patients, using a tangible and haptic interface within a gamified Brain-Computer Interface (BCI) training environment.<\/p>\n<p>&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;<\/p>\n<p class=\"p1\">Project <a href=\"http:\/\/bci-lift.inria.fr\">Inria Project Lab BCI-LIFT<\/a> (Brain-Computer Interfaces: Learning, Interaction, Feedback, Training) 2014-2018\n<\/p>\n<p>Project lead: Maureen Clerc<br \/>\nInria partners: Athena, Camin, Hybrid, Mjolnir, Neurosys, Potioc<br \/>\nExternal partners: Inserm Lyon, Universit\u00e9 de Rouen<\/p>\n<p>BCI-LIFT is a large-scale 4-year research initiative whose aim is to reach the next generation of non-invasive Brain-Computer Interfaces (BCI), <\/p>\n","protected":false},"author":136,"featured_media":0,"parent":0,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"","meta":{"footnotes":""},"class_list":["post-313","page","type-page","status-publish","hentry"],"_links":{"self":[{"href":"https:\/\/members.loria.fr\/LBougrain\/wp-json\/wp\/v2\/pages\/313","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/members.loria.fr\/LBougrain\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/members.loria.fr\/LBougrain\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/members.loria.fr\/LBougrain\/wp-json\/wp\/v2\/users\/136"}],"replies":[{"embeddable":true,"href":"https:\/\/members.loria.fr\/LBougrain\/wp-json\/wp\/v2\/comments?post=313"}],"version-history":[{"count":7,"href":"https:\/\/members.loria.fr\/LBougrain\/wp-json\/wp\/v2\/pages\/313\/revisions"}],"predecessor-version":[{"id":382,"href":"https:\/\/members.loria.fr\/LBougrain\/wp-json\/wp\/v2\/pages\/313\/revisions\/382"}],"wp:attachment":[{"href":"https:
\/\/members.loria.fr\/LBougrain\/wp-json\/wp\/v2\/media?parent=313"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}