The seed of the value chain of the Robotics, Brain and Cognitive Sciences (RBCS) unit is a “brain-centric” approach to interaction science along three main streams of research: 1) humanoid robotics with a focus on cognition; 2) human behavioural studies and rehabilitation with a focus on action and perception; 3) human-machine communication and interaction with a focus on agent-agent cooperation guided by mutual understanding. A factor common to all three streams is the focus on learning and development and, more generally, on the dynamics of knowledge acquisition and update in the framework of goal-directed actions.

The scientific value is generated by advancing knowledge of ourselves through the investigation of human motor, perceptual and cognitive abilities and the implementation of explanatory embodied models of such abilities. The technological value is produced from scientific results by developing new devices and technologies in three main areas: i) hardware and software for machine intelligence; ii) tools for the quantitative evaluation of human performance in healthy and disabled persons; iii) sensory and motor rehabilitation tools and protocols. The social value is produced by exploiting supportive and rehabilitation technologies to improve social inclusion and the quality of life, particularly for the weaker members of our society. A network of formalized collaborations and joint labs established with clinical and rehabilitation centres validates this value chain.

The RBCS unit embodies interdisciplinarity by hosting and nurturing collaborations between scientists with highly diverse backgrounds who share an interest in producing these scientific, technological and social values.

Following this methodology, the common thread of RBCS research has continued to be “action execution and understanding”, with a focus on interaction science and a technological push toward rehabilitation devices for social use. Through interaction science we investigate how humans and robots exchange information with the animate and inanimate world through contact, speech and gestures.

Divisions

RBCS scientists share a research infrastructure supporting psychophysical, behavioural and neurophysiological research, based on experimental set-ups including motion-tracking devices with force platforms and equipment for transcranial magnetic stimulation and electrophysiological recording (EEG and EMG). A full-fledged iCub humanoid robot is available for cognitive robotics experiments and human-robot interaction studies. Ad hoc experimental set-ups and protocols, as well as rehabilitation devices, are developed by the mechanical and electronic facilities. Clinical studies and validation of rehabilitation protocols are carried out within several joint labs established in clinical and rehabilitation centres and through national and international collaborations.

The research activity is organized into four research streams:

The MLARR group focuses on robotics and interaction technologies for neuroscience and neurorehabilitation. In particular, research is organized as follows: 1) developing cutting-edge mechatronic and robotic technology to enhance and augment human-robot interaction, with a special focus on robot-aided rehabilitation; 2) studying the neural plasticity that underlies the organization of the human sensorimotor system in skill acquisition; 3) integrating technology into the clinical and home environment, developing new standards of assessment.

Publications

  1. Asai Y, Tasaka Y, Nomura K, Nomura T, Casadio M, Morasso P (2009) A model of postural control in quiet standing: robust compensation of delay-induced instability using intermittent activation of feedback control. PLoS ONE 4(7):e6169.
  2. Zenzeri J, De Santis D, Morasso P (2014) Strategy switching in the stabilization of unstable dynamics. PLoS ONE 9(6):e99087.
  3. De Santis D, Zenzeri J, Casadio M, Masia L, Riva A, Morasso P, Squeri V (2015) Robot-assisted training of the kinesthetic sense: enhancing proprioception after stroke. Frontiers in Human Neuroscience 8:1037.

The scientific aim of this group's research is to investigate the sensory and motor mechanisms underlying mutual understanding in human-human interaction, with the technological goal of designing robots that can naturally cooperate with people in carrying out everyday tasks. The distinctive feature of our approach is that robots, rather than being just the final goal of the research, are used as an ideal tool to investigate social interaction in a principled way.

Publications

  1. Sciutti A., Ansuini C., Becchio C. & Sandini G. (2015) ‘Investigating the ability to read others’ intentions using humanoid robots’, Frontiers in Psychology – Cognitive Science, vol. 6, no. 1362. http://journal.frontiersin.org/article/10.3389/fpsyg.2015.01362
  2. Palinko O., Rea F., Sandini G. & Sciutti A. (2015) ‘Eye gaze tracking for a humanoid robot’, Proceedings of the International Conference on Humanoid Robots, Seoul, Korea, November 3-5, 2015.
  3. Sandini G., Noceti N., Vignolo A., Sciutti A., Rea F., Verri A. & Odone F. (2015) ‘Computational model of biological motion detection: a path toward view-invariant action understanding’, Journal of Vision, 15(12):497, VSS Meeting Abstract.
  4. Mosadeghzad M., Rea F., Tata M., Brayda L. & Sandini G. (2015) ‘Saliency based sensor fusion of broadband sound localizer for humanoids’, 2015 IEEE International Conference on Multisensor Fusion and Information Integration (MFI 2015).

The research activity at the DTI focuses on the hand: the haptic perception of objects, the haptic exploration of the environment, the control of finger forces in multi-finger grasps and, more generally, physical interaction with objects or people. The lab has developed various devices and set-ups based on robotic and haptic technology, and has expertise in the kinematic and dynamic analysis of movement, behavioural and psychophysical techniques, and the modelling of perceptual and decisional processes.

Publications

  1. Tatti F, Baud-Bovy G (2015) Force sharing strategies in a collaborative force detection task. World Haptics Conference (WHC), 2015 IEEE, pp. 463-468. http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=7177755&tag=1
  2. Baud-Bovy G (2014) The perception of the centre of elastic force fields: a model of integration of the force and position signals. In Di Luca M (Ed.), Multisensory Softness, Springer Verlag, London, pp. 127-146. http://link.springer.com/chapter/10.1007/978-1-4471-6533-0_7
  3. Santello M, Baud-Bovy G, Jorntell H (2013) Neural bases of hand synergies. Frontiers in Computational Neuroscience 7(23). http://journal.frontiersin.org/article/10.3389/fncom.2013.00023/abstract

We investigate how sensory-deprived individuals compensate for missing sensory channels through vicarious modalities. Our focus is on sensory enhancement and how to achieve it with novel assistive technologies, mainly aimed at the construction of cognitive maps. This methodology allows us to build hardware/software platforms that reduce the digital divide, thereby increasing social inclusion. We focus on three main topics:

Sensory Substitution: How much does spatial knowledge depend on visual experience? We investigate the neural and behavioural correlates of tactile spatial representations.

Small Area Haptic Device: Are we able to understand simple tactile virtual objects? We study how information can be coded, displayed and understood by humans through low-tech haptic displays.

Sensory Supplementation: Is it possible to improve the spatial soundscape of hearing-impaired individuals? We study how binaural acoustic feedback can be used in contexts where hearing loss prevents proper spatial awareness.

Publications

  1. Campus C, Brayda L, De Carli F, Chellali R, Famà F, Bruzzo C, Lucagrossi L, Rodriguez G (2012) Tactile exploration of virtual objects for blind and sighted people: the role of beta 1 EEG band in sensory substitution and supra-modal mental mapping. Journal of Neurophysiology 107(10):2713-2729.
  2. Brayda L, Chellali R (2012) Measuring Human-Robots Interactions. International Journal of Social Robotics, pp. 1-3.
  3. Campus C, Brayda L, Chellali R, Martinoli C, Rodriguez G (2011) A neurophysiological and behavioral investigation of tactile spatial exploration for sighted and non-sighted adults. Human Factors and Ergonomics Society 55th Annual Meeting, September 19-23, 2011, 55(1):227-231, Las Vegas, Nevada, USA.

Laboratories

  • TMS LAB Transcranial Magnetic Stimulation Laboratory, equipped with MagStim Rapid2 and Bistim stimulators and a Power 1401 high-performance data acquisition interface (Cambridge Electronic Design).
  • EEG LAB Electroencephalography Laboratory, equipped with a 64-channel BrainAmp and two BioSemi 64-channel devices for testing dual-subject simultaneous EEG.
  • MOCAP LAB Motion Capture Laboratory, with ten near-infrared cameras (VICON), three force platforms (AMTI) and a 32-channel wireless EMG system (COMETA).
  • OPTOTRAK (NDI) motion capture device using active infrared markers.
  • HRI LAB Human-Robot Interaction lab equipped with an iCub humanoid platform.
  • HPL LAB Haptic Perception Laboratory, with commercial and special-purpose haptic interfaces to study human perception and rehabilitation.

Projects

  • The ALLSPEAK project (an automatic speech recognition app for amyotrophic lateral sclerosis (ALS) patients, granted by AriSLA, PI: Alberto Inuggi) aims at allowing ALS patients to verbally communicate their primary needs throughout the whole course of their disease. It employs a machine learning approach to adapt to patients’ residual voice-production skills.
  • BLINDPAD project: For visually impaired people it is difficult to access graphical content, which is increasingly conveyed digitally through sight. The sense of touch can potentially bridge this gap, as it is crucial – in the absence of vision – for understanding abstract concepts and acquiring information about the surroundings. The objective of the project is to make graphical content accessible through touch by building and field-testing a Personal Assistive Device for blind and visually impaired people (BLINDPAD).
  • The GLASSENSE project (Wearable Technologies for Sensory Supplementation) aims at building novel devices to assist hearing-impaired and visually-impaired people in daily tasks, improving their acoustic spatial awareness and providing spatialized tactile feedback.
  • The TAMO project is a minimalist tactile device that displays the height of virtual objects on a fingertip using just one actuator. TAMO exploits height perception, combined with hand movements and proprioception, to display 3D shapes, helping users develop mental maps of virtual objects. Target applications: computer-aided rehabilitation, orientation and mobility for blind and visually impaired users, education and entertainment.

Collaborations

  • U-VIP line: long-standing collaboration with the researchers of the U-VIP line in the fields of sensorimotor integration, devices for sensory rehabilitation, and social inclusion.
  • Ongoing collaboration in the fields of cognitive robotics and human-robot interaction with Osaka University (Y. Nagai, M. Asada), the University of Genova (A. Bisio, F. Odone, A. Verri, N. Noceti) and Heriot-Watt University (K. Lohan).
  • Collaborations with Northwestern University (F.A. Mussa Ivaldi) on motor control and neurorehabilitation and with Universität Paderborn (K. Rohlfing) on anticipation in infancy.
  • Collaboration with University of Lethbridge (M. Tata) on attentional system for robots.
  • Joint facility of Istituto Italiano di Tecnologia and the Gaslini Pediatric Hospital, established to develop a multidisciplinary approach to pediatric orthopedics and neural rehabilitation using the technology developed by RBCS.
  • Joint facility of Istituto Italiano di Tecnologia and Fondazione David Chiossone for blind and visually impaired people.