{"id":788,"date":"2019-12-20T18:12:37","date_gmt":"2019-12-20T16:12:37","guid":{"rendered":"http:\/\/members.loria.fr\/SIvaldi\/?page_id=788"},"modified":"2021-08-26T09:33:29","modified_gmt":"2021-08-26T07:33:29","slug":"datasets","status":"publish","type":"page","link":"https:\/\/members.loria.fr\/SIvaldi\/datasets\/","title":{"rendered":"Datasets"},"content":{"rendered":"<h3>1) Overhead work with and without the exoskeleton PAEXO<\/h3>\n<table  class=\" table table-hover\" >\n<tbody>\n<tr>\n<td width=\"30%\"><a href=\"http:\/\/members.loria.fr\/SIvaldi\/wp-content\/blogs.dir\/70\/files\/sites\/70\/2020\/01\/setup_picture-scaled.jpg\"><img decoding=\"async\" class=\"alignleft wp-image-806 size-medium\" src=\"http:\/\/members.loria.fr\/SIvaldi\/wp-content\/blogs.dir\/70\/files\/sites\/70\/2020\/01\/setup_picture-172x300.jpg\" alt=\"\" width=\"300\" srcset=\"https:\/\/members.loria.fr\/SIvaldi\/wp-content\/blogs.dir\/70\/files\/sites\/70\/2020\/01\/setup_picture-172x300.jpg 172w, https:\/\/members.loria.fr\/SIvaldi\/wp-content\/blogs.dir\/70\/files\/sites\/70\/2020\/01\/setup_picture-586x1024.jpg 586w, https:\/\/members.loria.fr\/SIvaldi\/wp-content\/blogs.dir\/70\/files\/sites\/70\/2020\/01\/setup_picture-768x1342.jpg 768w, https:\/\/members.loria.fr\/SIvaldi\/wp-content\/blogs.dir\/70\/files\/sites\/70\/2020\/01\/setup_picture-879x1536.jpg 879w, https:\/\/members.loria.fr\/SIvaldi\/wp-content\/blogs.dir\/70\/files\/sites\/70\/2020\/01\/setup_picture-1172x2048.jpg 1172w, https:\/\/members.loria.fr\/SIvaldi\/wp-content\/blogs.dir\/70\/files\/sites\/70\/2020\/01\/setup_picture-scaled.jpg 1465w\" sizes=\"(max-width: 172px) 100vw, 172px\" \/><\/a><\/td>\n<td>P. Maurice, J. \u010camernik, D. Gorjan, B. Schirrmeister, J. Bornmann, L. Tagliapietra, C. Latella, D. Pucci, S. Ivaldi, J. Babi\u010d. (2018) &#8220;AndyData-lab-onePersonWithExoskeleton&#8221;. 
DOI : 10.5281\/zenodo.1472214.<br \/>\nUrl: <a href=\"https:\/\/zenodo.org\/record\/1472214\">https:\/\/zenodo.org\/record\/1472214<\/a><br \/>\nRelated publication: [<a href=\"https:\/\/hal.archives-ouvertes.fr\/hal-02301922\/file\/Exoskeleton_TNSRE.pdf\">TNSRE 2020<\/a>]<br \/>\nThis is one of the few open datasets with human movements performed with and without an exoskeleton, and the only available open-source dataset with Ottobock&#8217;s PAEXO exoskeleton.<\/td>\n<\/tr>\n<\/tbody>\n<\/table>\n<h3>2) Sequences of typical manufacturing activities<\/h3>\n<table  class=\" table table-hover\" >\n<tbody>\n<tr>\n<td width=\"30%\"><a href=\"http:\/\/members.loria.fr\/SIvaldi\/wp-content\/blogs.dir\/70\/files\/sites\/70\/2020\/01\/Inria-0307-060.jpg\"><img decoding=\"async\" class=\"alignleft wp-image-806 size-medium\" src=\"http:\/\/members.loria.fr\/SIvaldi\/wp-content\/blogs.dir\/70\/files\/sites\/70\/2020\/01\/Inria-0307-060.jpg\" alt=\"\" width=\"300\" \/><\/a><\/td>\n<td>P. Maurice, A. Malais\u00e9, C. Amiot, N. Paris, G.J. Richard, O. Rochel, L. Fritzsche, S. Ivaldi (2018) &#8220;AndyData-lab-onePerson&#8221;. DOI : 10.5281\/zenodo.1471975.<br \/>\nUrl: <a href=\"https:\/\/zenodo.org\/record\/3254403\">https:\/\/zenodo.org\/record\/3254403<\/a><br \/>\nRelated publications: [<a href=\"https:\/\/hal.archives-ouvertes.fr\/hal-02289107\/document\">IJRR 2019<\/a>], [<a href=\"https:\/\/hal.archives-ouvertes.fr\/hal-01985013\/document\">RA-L 2019<\/a>].<br \/>\nThis dataset contains a variety of movements relevant to the manufacturing domain, captured with wearable sensors and an external motion-tracking system. The movements were performed by 13 different subjects, for a total of more than 5 hours of recordings. 
It is our main dataset for learning action classifiers and predictive models. <a href=\"https:\/\/andydataset.loria.fr\/\">=&gt; Check the website of this dataset for more information!<\/a><\/td>\n<\/tr>\n<\/tbody>\n<\/table>\n<p>&nbsp;<\/p>\n<h3>3) Whole-body movements exerting forces: posture &amp; feet forces<\/h3>\n<table  class=\" table table-hover\" >\n<tbody>\n<tr>\n<td width=\"30%\"><a href=\"http:\/\/members.loria.fr\/SIvaldi\/wp-content\/blogs.dir\/70\/files\/sites\/70\/2020\/01\/Inria-0307-057-1.jpg\"><img decoding=\"async\" class=\"alignleft wp-image-806 size-medium\" src=\"http:\/\/members.loria.fr\/SIvaldi\/wp-content\/blogs.dir\/70\/files\/sites\/70\/2020\/01\/Inria-0307-057-1-200x300.jpg\" alt=\"\" width=\"300\" \/><\/a><\/td>\n<td>A. Malais\u00e9, P. Maurice, O. Rochel, F. Colas, S. Ivaldi (2018) &#8220;AndyData-lab-onePersonWithShoes&#8221;. DOI : 10.5281\/zenodo.1472122.<br \/>\nUrl: <a href=\"https:\/\/zenodo.org\/record\/1472122\">https:\/\/zenodo.org\/record\/1472122<\/a><br \/>\nThis is the first open-source dataset of whole-body movements performed with the wearable Xsens motion-capture suit and the IIT\/Xsens wearable shoes with onboard force\/torque sensors.<\/td>\n<\/tr>\n<\/tbody>\n<\/table>\n<p>&nbsp;<\/p>\n<h3>4) Human-human co-manipulation in collaborative and cooperative scenarios<\/h3>\n<table  class=\" table table-hover\" >\n<tbody>\n<tr>\n<td width=\"30%\"><a href=\"http:\/\/members.loria.fr\/SIvaldi\/wp-content\/blogs.dir\/70\/files\/sites\/70\/2020\/11\/HHI_scenario.jpg\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone wp-image-1023 size-medium\" src=\"http:\/\/members.loria.fr\/SIvaldi\/wp-content\/blogs.dir\/70\/files\/sites\/70\/2020\/11\/HHI_scenario-300x235.jpg\" alt=\"\" width=\"300\" height=\"235\" srcset=\"https:\/\/members.loria.fr\/SIvaldi\/wp-content\/blogs.dir\/70\/files\/sites\/70\/2020\/11\/HHI_scenario-300x235.jpg 300w, 
https:\/\/members.loria.fr\/SIvaldi\/wp-content\/blogs.dir\/70\/files\/sites\/70\/2020\/11\/HHI_scenario.jpg 600w\" sizes=\"auto, (max-width: 300px) 100vw, 300px\" \/><\/a><\/td>\n<td>W. Gomes, P. Maurice, S. Ivaldi (2020) &#8220;Andy Data Human Human Object Co-manipulation&#8221;. DOI : 10.5281\/zenodo.3989616.<br \/>\nUrl: <a href=\"https:\/\/doi.org\/10.5281\/zenodo.3989616\">https:\/\/doi.org\/10.5281\/zenodo.3989616<\/a><br \/>\nThis open-source dataset contains experimental data from two object-manipulation experiments: a human dyad executing a shared co-manipulation task (20 subjects), and a single human executing the same task (10 subjects). Both collaborative (no fixed roles) and cooperative (fixed roles &#8211; leader\/follower) interaction data are recorded.<br \/>\nThe collected data comprises kinematic data of the subjects&#8217; arms, raw EMG signals from the subjects&#8217; arms, and the maximum co-contraction value of each measured muscle. Task-performance data is additionally provided for each dyad. 
These recordings were captured with a Qualisys motion-capture system, together with Delsys EMG sensors for muscle-activity signals.<br \/>\nRelated publication: [<a href=\"https:\/\/hal.archives-ouvertes.fr\/hal-03283057\/document\">poster<\/a>]<\/td>\n<\/tr>\n<\/tbody>\n<\/table>\n<p>&nbsp;<\/p>\n<h3>5) Whole-body teleoperation of a humanoid by a human operator<\/h3>\n<table  class=\" table table-hover\" >\n<tbody>\n<tr>\n<td width=\"30%\"><a href=\"http:\/\/members.loria.fr\/SIvaldi\/wp-content\/blogs.dir\/70\/files\/sites\/70\/2021\/07\/iCub_teleop.png\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone wp-image-1153 size-medium\" src=\"http:\/\/members.loria.fr\/SIvaldi\/wp-content\/blogs.dir\/70\/files\/sites\/70\/2021\/07\/iCub_teleop-248x300.png\" alt=\"\" width=\"248\" height=\"300\" srcset=\"https:\/\/members.loria.fr\/SIvaldi\/wp-content\/blogs.dir\/70\/files\/sites\/70\/2021\/07\/iCub_teleop-248x300.png 248w, https:\/\/members.loria.fr\/SIvaldi\/wp-content\/blogs.dir\/70\/files\/sites\/70\/2021\/07\/iCub_teleop.png 474w\" sizes=\"auto, (max-width: 248px) 100vw, 248px\" \/><\/a><\/td>\n<td>L. Penco, J.-B. Mouret, S. Ivaldi (2021) &#8220;AndyData-lab-onePersonTeleoperatingICub&#8221;. DOI : 10.5281\/zenodo.4906336.<br \/>\nUrl: <a href=\"https:\/\/zenodo.org\/record\/4906336#.YQKvF5MzYoE\">https:\/\/zenodo.org\/record\/4906336#.YQKvF5MzYoE<\/a><br \/>\n<span style=\"font-family: inherit;font-size: inherit\">This dataset contains physical measurements of a human operator teleoperating the humanoid robot iCub. The operator controlled the robot while performing different tasks: reaching a low\/mid-height target with the right hand, reaching a high target with the right hand, and picking up a box from different locations and placing it at different locations. 
The operator&#8217;s whole-body kinematics was recorded with an Xsens MVN suit.<br \/>\nRelated publication: [<a href=\"https:\/\/arxiv.org\/pdf\/2107.01281.pdf\">arxiv<\/a>]<\/span><\/td>\n<\/tr>\n<\/tbody>\n<\/table>\n","protected":false},"excerpt":{"rendered":"<p>1) Overhead work with and without the exoskeleton PAEXO P. Maurice, J. \u010camernik, D. Gorjan, B. Schirrmeister, J. Bornmann, L. Tagliapietra, C. Latella, D. Pucci, S. Ivaldi, J. Babi\u010d. (2018) &#8220;AndyData-lab-onePersonWithExoskeleton&#8221;. DOI : 10.5281\/zenodo.1472214. Url: https:\/\/zenodo.org\/record\/1472214 Related publication: [TNSR 2020] This is one of the few open datasets with human movements performed with and without [&hellip;]<\/p>\n","protected":false},"author":53,"featured_media":0,"parent":0,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"","meta":{"footnotes":""},"class_list":["post-788","page","type-page","status-publish","hentry"],"_links":{"self":[{"href":"https:\/\/members.loria.fr\/SIvaldi\/wp-json\/wp\/v2\/pages\/788","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/members.loria.fr\/SIvaldi\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/members.loria.fr\/SIvaldi\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/members.loria.fr\/SIvaldi\/wp-json\/wp\/v2\/users\/53"}],"replies":[{"embeddable":true,"href":"https:\/\/members.loria.fr\/SIvaldi\/wp-json\/wp\/v2\/comments?post=788"}],"version-history":[{"count":18,"href":"https:\/\/members.loria.fr\/SIvaldi\/wp-json\/wp\/v2\/pages\/788\/revisions"}],"predecessor-version":[{"id":1157,"href":"https:\/\/members.loria.fr\/SIvaldi\/wp-json\/wp\/v2\/pages\/788\/revisions\/1157"}],"wp:attachment":[{"href":"https:\/\/members.loria.fr\/SIvaldi\/wp-json\/wp\/v2\/media?parent=788"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}