
Detection, Tracking, Prediction and Control – Playing air hockey


Reaching objects with very precise timing is still a challenging problem for robots, due to both perception and control latencies. We aim to make the iCub robot able to smoothly interact with fast-moving objects in dynamic environments, relying only on its vision sensors. To do so, we use the air hockey task as a test bench: it is a 2D constrained environment that is ideal for testing high-speed motion planning, simple to set up in a normal-size laboratory, and characterized by high uncertainty, highly variable trajectories, and the presence of disturbances. We will improve tracking robustness and integrate prediction and control strategies that exploit the asynchronous sampling of target trajectories.
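As a concrete illustration of prediction from asynchronously sampled track points, the sketch below fits a constant-velocity model to the most recent puck observations and estimates where the puck will cross the defense line. The constant-velocity model, the function name and the sample format are assumptions made for illustration, not the project's actual predictor.

```python
import numpy as np

def predict_intercept(samples, defense_y):
    """Predict where the puck crosses the defense line.

    samples   : time-ordered list of (t, x, y) track points with
                asynchronous (irregular) timestamps
    defense_y : y coordinate of the line the end-effector moves along

    Minimal constant-velocity sketch: fit x(t) and y(t) with a
    least-squares line, then solve for the time at which y(t) reaches
    defense_y and evaluate x at that time.
    """
    t = np.array([s[0] for s in samples])
    x = np.array([s[1] for s in samples])
    y = np.array([s[2] for s in samples])

    # Linear fits handle irregular sampling directly: position = v*t + p0
    vx, x0 = np.polyfit(t, x, 1)
    vy, y0 = np.polyfit(t, y, 1)

    if abs(vy) < 1e-6:          # puck not moving toward the defense line
        return None

    t_hit = (defense_y - y0) / vy
    if t_hit <= t[-1]:          # crossing already happened or puck moving away
        return None

    return vx * t_hit + x0, t_hit
```

Because the fit only needs timestamps and positions, it can be refreshed whenever the event-based tracker emits a new sample, without waiting for a fixed frame rate.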

We designed the simplest architecture that can perform the air hockey task on iCub, composed of an event-based tracker and a kinematic controller. To simplify control, the end-effector is constrained to move along a single direction, parallel to the air hockey goal. We implemented an ad-hoc control task to cover the maximum reachable space while keeping the head as stationary as possible. Because the torso is tilted to look at the air hockey playing field, the robot arm moves within the image plane and generates many "distracting" events. Hand-eye coordination is achieved through a mapping between the visual space and the robot space. In addition, we developed an ego-motion compensation module to reduce event clutter in the background.
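The sketch below shows how such a visual-to-robot mapping and the single-axis constraint could fit together: a tracked pixel is mapped onto the table plane and turned into a 1-DoF target on the defense line. The homography-based mapping, the function names and the calibration assumptions are illustrative; the project's actual hand-eye calibration may differ.

```python
import numpy as np

def pixel_to_table(u, v, H):
    """Map a tracked pixel (u, v) to (x, y) on the table plane.

    H is a 3x3 homography between the image plane and the table plane
    expressed in robot coordinates, assumed to come from an offline
    calibration (e.g. touching known table points with the end-effector).
    """
    p = H @ np.array([u, v, 1.0])
    return p[0] / p[2], p[1] / p[2]

def defense_command(u, v, H, y_defense, x_limits):
    """Return a 1-DoF end-effector target on the defense line.

    Only the lateral coordinate comes from the tracker; the other
    coordinate stays fixed at y_defense so the hand slides along the
    axis parallel to the goal, clipped to the reachable segment
    x_limits = (x_min, x_max).
    """
    x, _ = pixel_to_table(u, v, H)
    return float(np.clip(x, *x_limits)), y_defense
```

Restricting the command to one axis keeps the kinematic controller simple and leaves the timing problem, when and how fast to move along that axis, to the tracker and predictor.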

Event-based Tracking, Hand-eye Coordination, Ego-motion, Robot Control.