Active Constraints Technologies for Ill-defined or Volatile Environments
The ACTIVE project exploits ICT and other engineering methods and technologies to design and develop an integrated, redundant robotic platform for neurosurgery. A light, agile redundant robotic cell with 20 degrees of freedom (DoFs) and an advanced processing unit for pre- and intra-operative control will operate both autonomously and cooperatively with surgical staff on the brain, a loosely structured environment.
Because the patient will not be rigidly fixed to the operating table or to the robot, the system must push the boundaries of the state of the art in robotics and control to achieve the accuracy and bandwidth demanded by this challenging and complex surgical scenario.
Two cooperating robots will interact with the brain, which deforms under tool contact, blood pressure, breathing, and deliquoration. Human factors are addressed by allowing easy interaction with users through a novel haptic interface for tele-manipulation and through a collaborative ("hands-on") control mode. Force and video feedback will be provided to the surgeons. Active constraints will limit and direct tool-tip position, force, and speed to prevent damage to eloquent areas, which are defined on realistic tissue models updated in the field from sensor information. The active constraints will be updated (displaced) in real time in response to feedback from tool-tissue interactions and to any additional constraints arising from the complex shared workspace. The overarching control architecture of ACTIVE will negotiate the requirements and references of the two slave robots.
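To illustrate the idea of an active constraint limiting tool-tip motion, the sketch below implements one common formulation, a velocity-damping virtual fixture around a spherical forbidden region: the component of the commanded velocity directed into the region is progressively attenuated inside a safety margin and cancelled entirely at the boundary. The spherical region, the margin value, and all names are illustrative assumptions, not the project's actual constraint formulation.

```python
import math

def constrain_velocity(tip, vel, center, radius, margin=0.01):
    """Return vel with its component toward a forbidden sphere damped/removed.

    tip, vel, center are 3-element sequences (metres, m/s); radius and margin
    are in metres. Hypothetical sketch of a velocity-damping active constraint.
    """
    offset = [t - c for t, c in zip(tip, center)]
    norm = math.sqrt(sum(o * o for o in offset))
    dist = norm - radius                                # signed distance to boundary
    normal = [o / norm for o in offset]                 # outward surface normal
    inward = -sum(v * n for v, n in zip(vel, normal))   # speed toward the region
    if inward <= 0 or dist >= margin:
        return list(vel)                                # receding or far away: pass through
    # fraction of the inward component to cancel: 1 at the surface, 0 at the margin
    cancel = 1.0 if dist <= 0 else 1.0 - dist / margin
    return [v + cancel * inward * n for v, n in zip(vel, normal)]
```

Note that only the inward velocity component is modified, so motion tangent to the boundary (e.g. the surgeon sliding the tool along the edge of an eloquent area) remains unimpeded.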
The operating room is the epitome of a dynamic, unstructured, and volatile environment, crowded with people and instruments. The workspace will therefore be monitored by environmental cameras, and machine learning techniques will support safe workspace sharing. Decisions about collision avoidance and downgrading to a safe state will be taken autonomously. The movement of the patient's head will be filtered by a bespoke active head frame, while fast and unpredictable patient motion will be compensated by a real-time cooperative control system. Cognitive skills will help identify the target location in the brain and constrain robotic motions based on in-field observations.
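One way the slow component of head motion could be separated from tracker noise before being fed to the cooperative controller as a moving reference is a first-order low-pass filter on the tracked head position, sketched below. The class, its parameters, and the one-pole filter itself are illustrative assumptions, not the project's actual compensation scheme.

```python
import math

class HeadMotionFilter:
    """First-order (exponential) low-pass filter on a tracked 3-D head position.

    Hypothetical sketch: smooths slow physiological head motion out of noisy
    tracker samples so the robot can follow it as a moving reference frame.
    """

    def __init__(self, cutoff_hz, sample_hz):
        # Standard one-pole smoothing coefficient for the given cutoff and rate.
        self.alpha = 1.0 - math.exp(-2.0 * math.pi * cutoff_hz / sample_hz)
        self.state = None

    def update(self, measured_pos):
        """Feed one tracker sample; return the filtered position estimate."""
        if self.state is None:
            self.state = list(measured_pos)            # initialise on first sample
        else:
            self.state = [s + self.alpha * (m - s)     # move a fraction toward the
                          for s, m in zip(self.state, measured_pos)]  # new measurement
        return self.state
```

The smoothing coefficient trades latency against noise rejection: a lower cutoff rejects more jitter but makes the reference lag fast motion, which is why fast, unpredictable displacements would need a separate real-time compensation path.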
Project budget: EUR 7,617,326
Duration: 01/04/2011 – 31/03/2015