
Luca Brayda

Team Leader

Research Line

Robotics Brain and Cognitive Sciences


IIT Central Research Labs Genova


Via Morego 30
+39 010 2897 205
Contact Me



(More information and downloadable material on my personal webpage: )


Luca Giulio Brayda obtained his Master of Science in Computer Science Engineering at Politecnico di Torino in 2003, studying environmental robustness for automatic speech recognition. His Master's thesis was carried out in collaboration with the Panasonic Speech Technology Laboratory in Santa Barbara, CA, USA. He also obtained a DEA (diplôme d'études approfondies) in Information Technology in 2003. In April 2007 he obtained his PhD, with the label of Doctor Europaeus, from the University of Nice-Sophia Antipolis, working in the Multimedia Department of the Eurecom Institute with Prof. Wellekens. During his PhD he developed new algorithms that exploit the multi-channel audio signal coming from a microphone array to improve the performance of a Hidden Markov Model-based speech recognizer through a multiple feedback mechanism. Part of his doctorate was spent at the Fondazione Bruno Kessler (formerly ITC-IRST), Trento, Italy, where he worked on calibration and signal analysis with microphone arrays in noisy and reverberant environments. From 2007 to 2008 he worked at the Centro Supercalcolo Piemonte, Turin, where he joined the OmegaBox Mediacenter project, which aimed to create a Linux-based versatile hardware-software platform for public information and entertainment.

Currently, he holds a Team Leader position at the Italian Institute of Technology, in the Robotics, Brain and Cognitive Sciences department. He has worked in telerobotics and social robotics, with specific emphasis on human-machine interfaces exploiting virtual reality or limited visual feedback. He is currently working on haptics, cognitive mapping and sensory substitution interfaces, aimed at providing spatial knowledge to visually impaired and blind users through touch. He is coordinator of the FP7 EU STREP project BLINDPAD, starting at the beginning of 2014, aimed at the construction and field testing of a touch-based personal assistive device for visually impaired users, mainly focusing on the accessibility of graphic and iconic information.


The TActile MOuse (TAMO) project


The TActile MOuse with three degrees of freedom (TAMO3)


Result: we show that it is possible to approximate haptic perception of real 3D objects with virtual 3D objects, perceived with one finger only. The approximation is achieved with a stimulation mode and a novel device combining height and inclination cues. Spontaneous motion and proprioception give rise to exploration strategies that allow users to mentally construct unknown geometrical shapes without the need for vision.


Download the paper:

M. Memeo, L. Brayda, "How geometrical descriptors help to build cognitive maps of solid geometry with a 3DOF tactile mouse", in Proc. of Eurohaptics 2016, London



Result: we investigate the perception of curvature as the relative combination of height and inclination cues, using a novel assistive device. We show that inclination is the most precise cue, although it comes with a higher cognitive requirement.

Download the poster: 

M. Memeo, L. Brayda, "Mind the bump: effect of geometrical descriptors on the perception of curved surfaces with a novel tactile mouse", presented at Eurohaptics 2016, London



You can also download a brochure of the TAMO3 from here

You can see a video of the TAMO3 here



The TActile MOuse with one degree of freedom (TAMO1)






The TActile MOuse is a low-cost assistive device for blind and visually impaired people. It allows the construction of mental maps from virtual objects, using only one tangible pin. 

See some Youtube videos:


- TAMO was found to reflect users' expectations. In particular, subjects can predict their own tactile sensitivity (we can say that "WYTIWYG: What You Touch Is What You Get"). Therefore, users can self-assess their tactile performance, combining their own judgements with objective data. This is crucial since TAMO is intended to be used in home-based scenarios. See more HERE

- TAMO is also used to discover how blind people develop spatial exploration strategies, as compared to sighted people. There is a long debate about how visually impaired subjects acquire and imagine maps, and settling it matters for the design of correctly targeted rehabilitation devices. We found that blind subjects develop the same strategies as their sighted fellows. See a video of the strategies HERE and HERE.

- TAMO elicits activity in similar brain areas in blind and in sighted people. This is important because it shows that spatial abilities can be developed independently of visual capabilities. Read the paper HERE.

- TAMO is a low-cost device. It is built with commercial-off-the-shelf components. If you are interested in it, please send me an e-mail (luca_dot_brayda_at_iit_dot_it)

Link to the old website:




The BLINDPAD project 

This European project will start in January 2014 and will last three years. It involves research centers and companies from Italy, Germany, Switzerland, Poland and Hungary. The objective of the project is to make graphical contents accessible through touch by building and field-testing a Personal Assistive Device for BLIND and visually impaired people (BLINDPAD). BLINDPAD will put veridical touch-based information into the hands of users, exploiting and enhancing their residual sensory abilities. BLINDPAD will be a personal, portable and cheap solution to improve knowledge and independence, thus increasing chances of employment and social inclusion and, ultimately, of a better quality of life.

More info at





 The Glassense project



Abstract: Actuator density is an important parameter in the design of vibrotactile displays. When it comes to obstacle detection or navigation tasks, a high number of tactors may provide more information, but not necessarily better performance. Depending on the body site and vibration parameters adopted, high density can make it harder to detect tactors in an array. In this paper, we explore the trade-off between actuator density and precision by comparing three kinds of directional cues. After performing a within-subject naive search task using a head-mounted vibrotactile display, we found that locally increasing the density of the array provides higher performance in detecting directional cues.

Keywords: Haptic Interaction; Tactile Display; Head Stimulation.

Download the paper:

V. A. de J. Oliveira, L. Nedel, A. Maciel, L. Brayda, "Localized Magnification in Vibrotactile HMDs for Accurate Spatial Awareness", in Proc. of Eurohaptics 2016


Abstract: Several studies evaluated vibrotactile stimuli on the head to aid orientation and communication. However, the acuity for vibration of the head's skin still needs to be explored. In this paper, we report the assessment of the spatial resolution on the head. We performed a 2AFC psychophysical experiment systematically varying the distance between pairs of stimuli in a standard-comparison approach. We took into consideration not only the perceptual thresholds but also the reaction times and subjective factors, like workload and vibration pleasantness. Results show that the region around the forehead is not only the most sensitive, with thresholds under 5mm, but it is also the region wherein the spatial discrimination was felt to be easier to perform. We also have found that it is possible to describe acuity on the head for vibrating stimulus as a function of skin type (hairy or glabrous) and of the distance of the stimulated loci from the head midline.

Download the paper:

V. A. de J. Oliveira, L. Nedel, A. Maciel, L. Brayda, "Spatial Discrimination of Vibrotactile Stimuli around the Head", in Proc. of Haptics Symposium, Philadelphia, 2016
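The head-acuity result above (thresholds under 5 mm around the forehead) comes from a 2AFC standard-comparison procedure. As a rough illustration of how such a spatial threshold can be extracted from 2AFC data, here is a minimal sketch; the distances, proportions and function name are invented for illustration and are not taken from the paper:

```python
# Hypothetical sketch: estimating a two-point discrimination threshold
# from 2AFC (two-alternative forced choice) data.
# All numbers below are invented for illustration.

def threshold_75(distances_mm, proportion_correct):
    """Linearly interpolate the distance at which performance crosses
    75% correct (the midpoint between chance and perfect in 2AFC)."""
    pts = list(zip(distances_mm, proportion_correct))
    for (d0, p0), (d1, p1) in zip(pts, pts[1:]):
        if p0 < 0.75 <= p1:
            # linear interpolation between the two bracketing points
            return d0 + (0.75 - p0) * (d1 - d0) / (p1 - p0)
    return None  # threshold not bracketed by the tested distances

# Invented responses: fraction of correct "which pair was farther apart"
# judgements at each inter-stimulus distance.
distances = [2, 4, 6, 8, 10]            # mm between stimulated loci
p_correct = [0.52, 0.63, 0.81, 0.92, 0.97]

print(round(threshold_75(distances, p_correct), 2))  # prints 5.33
```

A real psychophysical analysis would typically fit a full psychometric function (e.g. a cumulative Gaussian) rather than interpolate, but the 75% point plays the same role: the midpoint between chance and perfect performance in a two-alternative task.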








Selected Publications

Publication list

Tonelli A., Brayda L. and Gori M. 2015
Task-dependent calibration of auditory spatial perception through environmental visual observation
Frontiers in Systems Neuroscience, 9 (2015).
Geronazzo M., Bedini A., Brayda L., Campus C. and Avanzini F. 2015
Interactive spatial sonification for non-visual exploration of virtual maps
International Journal of Human-Computer Studies
DOI: 10.1016/j.ijhcs.2015.08.004
L. Brayda, C. Campus, M. Memeo, L. Lucagrossi 2015
The importance of visual experience, gender and emotion in the assessment of an assistive tactile mouse
IEEE Transactions on Haptics
Special issue : Haptic Assistive Technology for Individuals who are Visually Impaired
DOI: 10.1109/TOH.2015.2426692
L. Brayda, C. Campus, M. Gori 2013
Predicting successful tactile mapping of virtual objects
IEEE Transactions on Haptics
Volume 6, Issue 4, pages 473-483, DOI: 10.1109/TOH.2013.49

L. Brayda, R. Chellali, 2012

Measuring Human-Robot Interactions, Editorial of Special Issue

International Journal of Social Robotics, Springer. DOI: 10.1007/s12369-012-0150-2
C. Campus, L. Brayda, F. De Carli, R. Chellali, F. Famà, C. Bruzzo, L. Lucagrossi, G. Rodriguez
Tactile exploration of virtual objects for blind and sighted people: the role of beta 1 EEG band in sensory substitution and supra-modal mental mapping, Journal of Neurophysiology, May 15, 2012, vol. 107, no. 10, pp. 2713-2729, DOI: 10.1152/jn.00624.2011
N. Mollet, R. Chellali, L. Brayda
Virtual and Augmented Reality tools for teleoperation: improving distant immersion and perception

in Transactions on Edutainment II, pp. 135-159, 2009
DOI 10.1007/978-3-642-03270-7_10

Book chapter:

N. Mollet, R. Chellali, L. Brayda
Choosing the Tools for Improving Distant Immersion and Perception in a Teleoperation Context
Remote & Telerobotics, IN-TECH, ISBN 978-953-307-081-0, publishing date: March 2010

Patent & Prizes:

X. Anguera Miro, R. Kuhn, L. Brayda and J.-C.Junqua
Interactive personalized robot for home use
Patent number: US 10/739,915 , March 2008

K. Haynes, L. Brayda
Design of an Electronic paper-based Embedded System
First Prize at the Next Generation Entrepreneur Forum (NGEF), Monaco University, 2006

L. Brayda, C. Campus, M.Gori
What you touch is what you get: self-assessing a minimalist sensory substitution device
in Proc. of IEEE World of Haptics Conference 2013, Daejeong, Korea

L. Brayda, C. Campus
Conveying perceptible virtual tactile maps with a minimalist sensory substitution device
in Proc. of IEEE symposium on Haptic Audio-Visual Environments and games (IEEE HAVE 2012), 8-9 Oct 2012, Munich, Germany, 2012

L. Brayda, C. Campus, R. Chellali, G. Rodriguez, C. Martinoli
An investigation of search behaviour in a tactile exploration task for sighted and non-sighted adults
in Proc. of ACM International conference on Human factors in computing systems (ACM CHI 2011), 7-12 May 2011, Vancouver, Canada. 2011 
DOI 10.1145/1979742.1979857

Campus, C., Brayda, L., Chellali, R., Rodriguez, G., Martinoli, C.
A neurophysiological and behavioral investigation of tactile spatial exploration for sighted and non-sighted adults
in Proc. of Human Factors and Ergonomics Society 55th Annual Meeting, September 19-23 2011, Las Vegas, Nevada, USA 2011
DOI 10.1177/1071181311551047
L. Brayda, C. Campus, R. Chellali, G. Rodriguez
Objective evaluation of spatial information acquisition using a visuo-tactile sensory substitution device
in Proc. of International Conference of Social Robotics, 23-24 November 2010, Singapore
ISBN:3-642-17247-4 978-3-642-17247-2

L. Brayda, C. Campus, G. Rodriguez, R. Chellali
Understanding the transfer from tactile feedback to spatial 3D representation through EEG using a virtual reality-based sensory substitution device
in Proc. of Laval Virtual - 12th Virtual Reality International Conference (VRIC 2010), 7-9 April 2010, Laval, France.

J. Ortiz, L. Brayda, R. Chellali, N. Mollet, J-G. Fontaine,
Measuring the effects of visual feedback on mobile robot teleoperation
in Proc. of Laval Virtual - 12th Virtual Reality International Conference (VRIC 2010), 7-9 April 2010, Laval, France.

L. Brayda, J. Ortiz, R. Chellali, N. Mollet and J-G. Fontaine
Seeking perceptual-based metrics to assess the visuo-motor loop in mobile robot teleoperation
in Proc. of IEEE 4th Int. Conf. on Cybernetics and Intelligent Systems - Robotics, Automation and Mechatronics (CIS-RAM 2010), June 2010, Singapore
L. Brayda, J. Ortiz, N. Mollet, R. Chellali, J.-G. Fontaine
Quantitative and qualitative evaluation of vision-based teleoperation of a mobile robot
in Proc. of 2nd Int. Conf. on Intelligent Robotics & Applications (ICIRA 2009), Singapore, December 2009

L. Brayda, N. Mollet, R. Chellali
Mixing Telerobotics and Virtual Reality for Improving Immersion in Artwork Perception
in Proc. of Int. Conf. on E-Learning & Games (Edutainment 2009), Banff, Canada, August 2009

R. Chellali, L. Brayda, C. Martinoli, E. Fontaine

How taxel-based displaying devices can help visually impaired people to navigate safely
in Proc. of 4th Int. Conf. on Autonomous Robots & Agents (ICARA 2009), Wellington, New Zealand, February 2009

N. Mollet, L. Brayda, R. Chellali and JG. Fontaine
Virtual Environments and Scenario Languages for Advanced Teleoperation of Groups of Real Robots: Real Case Application
in Proceedings of Advanced Human Computer Interaction, Cancun, Mexico, 1-6 Feb 2009

N. Mollet, L. Brayda, R. Chellali and B. Khelifa
Standardization and integration in robotics: case of Virtual Reality tools
in Proc. of Cyberworlds 2008, Hangzhou, China, 22-24 Sept 2008

N. Mollet and L. Brayda and K. Baizid and R. Chellali
Virtual and Augmented Reality with head-tracking for efficient teleoperation of groups of robots

in Proc. of Cyberworlds 2008, Hangzhou.

L. Brayda, C. Wellekens, M. Matassoni, M. Omologo
Speech recognition in reverberant environments using remote microphones
in Proc. of IEEE International Symposium on Multimedia 2006, San Diego, CA, USA

L. Brayda, C. Wellekens, M. Omologo
Reconnaissance robuste de parole en environnement réel à l'aide d'un réseau de microphones à formation de voie adaptative basée sur un critère des N-best Vraisemblance Maximales
in Proceedings of JEP 2006, Dinard, France

L. Brayda, C. Wellekens, M. Omologo
Improving robustness of a likelihood-based beamformer in a real environment for automatic speech recognition
in Proceedings of Specom 2006, St. Petersburg, Russia.

L. Brayda, C. Wellekens, M. Omologo
N-Best parallel Maximum Likelihood Beamformers for Robust Speech Recognition 
in Proceedings of EUSIPCO 2006, Firenze, Italy.

L. Brayda, C. Bertotti, L. Cristoforetti, M. Omologo, P. Svaizer
Modifications on NIST MarkIII array to improve coherence properties among input signals
in 118th Convention of Audio Engineering Society, 2005

L. Brayda, L. Rigazio, R. Boman and J-C. Junqua
Sensitivity Analysis of Noise Robustness Methods 
in Proceedings of IEEE ICASSP 2004, Montreal, Canada.


C. Campus, L. Brayda, G. Rodriguez, R. Chellali
Evaluating visuo-tactile sensory substitution for navigation in virtual worlds: preliminary neurophysiological assessment and results on a tactile-based interface
Tactile sensing workshop, within the 9th IEEE-RAS International Conference on Humanoid Robots (Humanoids09), Paris, France, December 7-10, 2009

L. Brayda, L. Taverna, L. Rossi, R. Chellali
Collaborative Virtual Environments for Sharing Product lifecycle
in Proceedings of 2nd International Workshop on Virtual Manufacturing (VIRMAN08), INTUITION 2008, Turin, Italy

L. Brayda, C. Bertotti, L. Cristoforetti, M. Omologo, P. Svaizer
On Calibration and Coherence signal analysis of the CHIL microphone network at IRST
in Proceedings of Joint Workshop for Hands-Free Speech Communication and Microphone Arrays 2005, Piscataway, New Jersey, USA. 

Research Reports:

L. Brayda
Multi-Hypotheses Feedback for Robust Speech Recognition using a Microphone Array input
PhD Thesis issued by University of Nice-Sophia Antipolis (UNSA) and Institut Eurecom, France
in collaboration with: Fondazione Bruno Kessler (FBK), Trento, Italy

L. Brayda
Metodi di compensazione del Rumore basati sul Segnale e sui Modelli per Riconoscimento Automatico della Parola
Master Thesis issued by Politecnico di Torino, Italy
in collaboration with: Panasonic Speech Technology Laboratory (PSTL), Santa Barbara, CA, USA
C. Bertotti, L. Brayda, L. Cristoforetti, M. Omologo, P. Svaizer
The new MarkIII/IRST-Light microphone array 

C. Bertotti, L. Brayda, L. Cristoforetti, M. Omologo, P. Svaizer
The MarkIII microphone array: the modified version realized at ITC-IRST

Some of the publications are available on-line to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by authors or by other copyright holders. All persons copying this information are expected to adhere to the terms and constraints invoked by each author's copyright. In most cases, these works may not be reposted without the explicit permission of the copyright holder. In addition, the following notice should be reproduced on copies of articles published by the IEEE: © year IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE.



The BLINDPAD project ranked 1st out of 54 in the list of proposals of the call FP7-ICT-2013-10 for the objective ICT-2013.5.3 “ICT for smart and personalised inclusion”.

