IIT Projects Search

KID

H2020 ERC - Proof of Concept Grant 2018-2019

A low-cost KInematic Detector to assist early diagnosis and objective profiling of ASD

Summary: Autism spectrum disorders (ASDs) are a heterogeneous set of neurodevelopmental disorders characterized by deficits in social communication and reciprocal interactions, as well as stereotypic behaviours. Although early diagnosis followed by appropriate intervention appears to offer the best chance for significant health improvement and economic gain, a diagnosis of autism remains complex and often difficult to obtain. The recent identification of atypical kinematic patterns in children and infants at increased risk for ASDs provides new insights into autism diagnosis and objective profiling. KiD intends to translate these insights into a low-cost, easy-to-use, yet reliable wearable tracking system designed to assist the detection and classification of ASDs. The novelty of KiD lies in combining the informed development of machine learning methods for classifying kinematic data with a co-design, human-factors engineering approach. KiD holds great translational potential for autism clinical practice. The main use of the device will be to assist clinicians in achieving an expedited diagnosis, ensuring early and timely access to evidence-based intervention programs for children at risk of autism. Another use will be to examine the quantitative nature of autistic traits, enabling new forms of precision phenotyping, which is potentially useful for stratifying patients with ASD and developing individualized treatment approaches.

Total budget: 148.413,00€

Total contribution: 148.413,00€


I MOVE U

FP7 ERC - Starting Grant 2013-2018

Intention-from-MOVEment Understanding: From moving bodies to interacting minds

Summary: From observing other people’s movements, humans make inferences that go far beyond the appearance of the observed stimuli: inferences about unobservable mental states such as goals and intentions. Although this ability is critical for successful social interaction, little is known about how we are able to make such inferences, often quickly and reliably. I.MOVE.U intends to provide the first comprehensive account of how intentions are extracted from body motion during interaction with conspecifics. Covert mental states such as intentions become visible to the extent that they contribute as dynamic factors to generating the kinematics of a given action. By combining advanced methods in psychophysics and neuroscience with kinematics and virtual reality technologies, this project will study i) to what extent observers are sensitive to intention information conveyed by body movements; ii) what mechanisms and neural processes mediate the ability to extract intention from body motion; iii) how, during online social interaction with another agent, agents use their own actions to predict their partner’s intention. These issues will be addressed at different levels of analysis (motor, cognitive, neural) in neurotypical participants and participants with autism spectrum disorders. For the first time, to investigate real-time social interaction, full-body tracking will be combined with online generation of biological motion stimuli, yielding visual biological motion stimuli that depend directly on the actual behavior of participants. I.MOVE.U pioneers a new area of research at the intersection of motor cognition and social cognition, providing knowledge of direct scientific, clinical, and technological impact. The final outcome of the project will be a new quantitative methodology for investigating the decoding of intention during interaction with conspecifics.

Total budget: 591.508,00€

Total contribution: 591.508,00€