ANDY (2017-2020)

[AnDy project logo]

Advancing Anticipatory Behaviors in Dyadic Human-Robot Collaboration

(H2020 ICT Robotics RIA; 2017-2020)

Recent technological progress allows robots to actively and safely share a common workspace with humans. However, most applications target co-existence rather than true "collaboration" (from the Latin collaborare, to work together). To collaborate efficiently with humans, robots need to control physical interaction while predicting their human partner's intention, motion and dynamics. To achieve this, AnDy relies on three technological and scientific breakthroughs. First, AnDy will innovate the way human whole-body motions are measured by developing the wearable AnDySuit, which tracks motions and records forces. Second, AnDy will develop AnDyModel, which combines ergonomic models with cognitive predictive models of human dynamic behavior in collaborative tasks, learned from data acquired with the AnDySuit. Third, AnDy will propose AnDyControl, an innovative technology for assisting humans through predictive physical control based on AnDyModel. By measuring and modeling human whole-body dynamics, AnDy provides robots with an entirely new level of awareness of human intentions and ergonomics. By incorporating this awareness on-line into the robot's controllers, AnDy paves the way for novel applications of physical human-robot collaboration in manufacturing, health care, and assisted living.

Our team

Serena Ivaldi, Adrien Malaisé (PhD student), Waldez Azevedo Gomes Jr (PhD student), Pauline Maurice (postdoc/researcher), Luigi Penco (engineer/PhD student)

Partners

  • Italian Institute of Technology (Francesco Nori – coordinator)
  • Jožef Stefan Institute (Jan Babič)
  • DLR (Freek Stulp)
  • Xsens
  • AnyBody Technology
  • IMK automotive GmbH
  • Ottobock GmbH

Highlights

Tele-operation of the iCub humanoid robot (Penco et al., HUMANOIDS 2018)

Prediction of whole-body movements after a few observations using AE-ProMPs, which combine the predictive properties of ProMPs with the latent-space representation of auto-encoders (Dermy et al., HUMANOIDS 2018)
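The AE-ProMP idea can be illustrated compactly. The sketch below is not the published implementation; it is a minimal conceptual example on synthetic data, using PCA as a linear stand-in for the auto-encoder. Trajectories of whole-body poses are compressed to a low-dimensional latent space, a ProMP (one weight distribution per latent dimension over radial-basis features) is learned from demonstrations, and the ProMP is conditioned on the first few observed frames to predict the rest of the movement, which is then decoded back to full poses. All variable names and dimensions are illustrative.

```python
# Minimal conceptual sketch of the AE-ProMP idea (not the authors' code):
# 1) compress whole-body poses into a low-dimensional latent space
#    (PCA stands in here for the auto-encoder, a deliberate simplification),
# 2) fit a Probabilistic Movement Primitive per latent dimension,
# 3) condition the ProMP on the first few observed frames to predict the rest.
import numpy as np

rng = np.random.default_rng(0)
T, D, K, B = 100, 30, 3, 15          # frames, pose dims, latent dims, basis functions

# --- synthetic "demonstrations": N trajectories of D-dimensional poses ---
N = 20
t = np.linspace(0, 1, T)
demos = np.stack([
    np.outer(np.sin(2 * np.pi * (t + rng.normal(0, 0.05))), rng.normal(1, 0.1, D))
    for _ in range(N)
])                                    # shape (N, T, D)

# --- "encoder": PCA projection to K latent dimensions (stand-in for the AE) ---
X = demos.reshape(-1, D)
mean_pose = X.mean(axis=0)
U, S, Vt = np.linalg.svd(X - mean_pose, full_matrices=False)
W_enc = Vt[:K].T                      # D x K projection
latent = (demos - mean_pose) @ W_enc  # (N, T, K)

# --- ProMP: Gaussian RBF features over time, weight distribution per latent dim ---
centers = np.linspace(0, 1, B)
Phi = np.exp(-0.5 * ((t[:, None] - centers[None, :]) / 0.05) ** 2)   # T x B
Phi /= Phi.sum(axis=1, keepdims=True)

def fit_weights(y):
    """Ridge regression of one latent trajectory onto the basis functions."""
    return np.linalg.solve(Phi.T @ Phi + 1e-6 * np.eye(B), Phi.T @ y)

w = np.stack([[fit_weights(latent[n, :, k]) for k in range(K)] for n in range(N)])
mu_w = w.mean(axis=0)                                    # K x B mean weights
Sigma_w = [np.cov(w[:, k, :].T) + 1e-6 * np.eye(B) for k in range(K)]

# --- prediction: condition on the first few observed frames of a new movement ---
obs_frames = 15
new_latent = (demos[0] - mean_pose) @ W_enc              # pretend demo 0 is observed
sigma_y = 1e-4
pred_latent = np.zeros((T, K))
for k in range(K):
    Phi_o = Phi[:obs_frames]
    S_w = Sigma_w[k]
    G = S_w @ Phi_o.T @ np.linalg.inv(Phi_o @ S_w @ Phi_o.T + sigma_y * np.eye(obs_frames))
    mu_post = mu_w[k] + G @ (new_latent[:obs_frames, k] - Phi_o @ mu_w[k])
    pred_latent[:, k] = Phi @ mu_post                    # posterior mean trajectory

# --- "decoder": map the predicted latent trajectory back to full poses ---
pred_poses = pred_latent @ W_enc.T + mean_pose           # T x D predicted motion
print("predicted pose trajectory:", pred_poses.shape)
```

In the published AE-ProMP work the learned auto-encoder plays the role of the PCA projection above, so the encode and decode steps would instead call its encoder and decoder networks.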

Activity recognition and taxonomies for automatic ergonomics assessment (Malaisé et al., RA-L 2018)

Main publications

  • Project overview:
    • Ivaldi, S.; Fritzsche, L.; Babic, J.; Stulp, F.; Damsgaard, M.; Graimann, B.; Bellusci, G.; Nori, F. (2017) Anticipatory models of human movements and dynamics: the roadmap of the AnDy project. Proc. International Conf. on Digital Human Models (DHM). [PDF]
  • Prediction of human intention & movement:
    • Dermy, O.; Paraschos, A.; Ewerton, M.; Peters, J.; Charpillet, F.; Ivaldi, S. (2017) Prediction of intention during interaction with iCub with Probabilistic Movement Primitives. Frontiers in Robotics & AI, 4:45, doi: 10.3389/frobt.2017.00045. [PDF]
  • Activity recognition for ergonomics:
    • Malaisé, A.; Maurice, P.; Colas, F.; Charpillet, F.; Ivaldi, S. (2018) Activity recognition with multiple wearable sensors for industrial applications. Proc. 11th Int. Conf. on Advances in Computer-Human Interactions (ACHI) [PDF]
  • Ethical issues and societal impact:
    • Maurice, P.; Allienne, L.; Malaisé, A.; Ivaldi, S. (2018) Ethical and Social Considerations for the Introduction of Human-Centered Technologies at Work. IEEE Workshop on Advanced Robotics and its Social Impacts (ARSO). [PDF]

All the videos of the research in AnDy