Clothes Perception and Manipulation (CloPeMa) – FP7 Project
This project aims to advance the state of the art in the autonomous perception and manipulation of all kinds of fabrics, textiles and garments.
The novelty and uniqueness of this project lie chiefly in its generality. Various garments will be presented in a random pile on an arbitrary background, and novel ways of manipulating them (sorting, folding, etc.) will be learned on demand in a real-life dynamic environment. Removing the previously indispensable deterministic assumptions about how textiles are presented and handled is expected to lead to greater robustness, reliability, and a wider field of applications.
CloPeMa’s main objective is the closer integration of perception, action, learning, and reasoning. Perception means integrated haptic and visual sensing, recognition, and support for the perception-action reactive cycle. Actions will be performed by a cooperating pair of robotic hands, part of the CloPeMa experimental testbed replicated at three project partner sites. The hands will combine state-of-the-art solutions for the manipulation of limp material: variable-impedance actuation on a compliant hand mechanism, smart materials, and tactile sensing with artificial skin over large areas.
Goals and actions will be learned in a dynamic environment. All components will be subject to statistical learning, spatial reasoning, and high-level reasoning. Thus integrated, CloPeMa aims at functionalities that have hitherto proved elusive for systems using only some of these parts in isolation.
Data collected on the experimental testbed, together with the procedures used to obtain them, will be made public. Results will be measured and analysed within three carefully defined demonstrator projects of increasing difficulty.
The consortium includes expertise in all the component areas, as well as industrial involvement that promises cross-fertilisation and applicability. Both basic exploratory research and implementations of its results are foreseen.