More videos are available on my YouTube channel.


Human-robot interaction



In this video, iCub tracks "active" human partners (i.e. people who are currently talking) using a multimodal approach that combines a 3D people-tracking system with sound source localization and face detection.
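
To give a concrete idea of how this kind of multimodal fusion can work, here is a minimal Python sketch (not the code used in the video; the actual software is linked below). It assumes a hypothetical `Person` structure produced by the 3D people tracker, an azimuth estimate from the sound source localizer, and a face-detection flag, and it selects the tracked partner most likely to be the one talking.

```python
import math
from dataclasses import dataclass

@dataclass
class Person:
    """A tracked person (hypothetical structure; fields are illustrative)."""
    pid: int
    position: tuple  # (x, y, z) in the robot's reference frame, metres
    has_face: bool   # True if a face was detected on this track

def azimuth(position):
    """Horizontal bearing of a 3D point, in radians, seen from the robot."""
    x, y, _ = position
    return math.atan2(y, x)

def most_active_person(people, sound_azimuth, max_angle=math.radians(20)):
    """Pick the person most likely to be speaking: the track whose bearing is
    closest to the localized sound source, with a small preference for tracks
    that also have a detected face. Returns None if nobody matches the sound."""
    best, best_score = None, float("inf")
    for p in people:
        angle_error = abs(azimuth(p.position) - sound_azimuth)
        if angle_error > max_angle:
            continue  # the sound does not come from this person's direction
        score = angle_error - (0.1 if p.has_face else 0.0)  # bonus for a visible face
        if score < best_score:
            best, best_score = p, score
    return best

# Example: two tracked people, sound arriving roughly from the second one's direction.
people = [Person(0, (1.5, -0.6, 0.0), True), Person(1, (1.2, 0.4, 0.0), True)]
speaker = most_active_person(people, sound_azimuth=math.radians(18))
print(speaker.pid if speaker else "no active partner")
```

The design choice here is deliberately simple: each modality votes independently (position from the tracker, direction from the audio, identity confirmation from the face detector) and a lightweight score combines them, which is the general spirit of the engagement detection described in the references.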

References:
S. Ivaldi, N. Lyubova, D. Gérardeaux-Viret, A. Droniou, S. M. Anzalone, M. Chetouani, D. Filliat, and O. Sigaud, "Perception and human interaction for developmental learning of objects and affordances", in Proc. IEEE-RAS Int. Conf. on Humanoid Robots (HUMANOIDS), 2012. (pdf)

S. M. Anzalone, S. Ivaldi, O. Sigaud, and M. Chetouani, "Multimodal people engagement with iCub", in Proc. Int. Conf. on Biologically Inspired Cognitive Architectures, Palermo, Italy, 2012. (pdf)

Software for the experiments: code (svn repository)
Instructions for reproducing the experiments: online documentation