Robotics applications are widespread. Robots are often complex and sophisticated systems requiring high computational power. While current industrial robots are high-performance, reliable machines, they are still hardly capable of sharing everyday tasks with humans, especially in terms of autonomous behaviour and user-friendly human-machine interfaces. In particular, they can hardly manipulate or interact with unknown or unpredictable objects or living beings.
Grasp-and-lift tasks are examples of everyday tasks that have long received attention from both the neurophysiological and robotics communities. Two abilities are crucial for a stable grasp: avoiding object slip and controlling the grasping force in real time. Neurophysiological studies have provided a deep understanding of both aspects in humans by analysing grasping in its simplest configuration, i.e. one in which an object is grasped between the opposed thumb and index finger and lifted. The role of skin mechanoreceptors during grasp, the mechanisms of motor coordination, and the strategies humans use to avoid object slip have been widely investigated. In particular, it was shown that adaptation of the grip force (the grasping force) to the friction between skin and object takes place within the first 0.1 s after initial contact. Furthermore, it has been found that, after the object is lifted, secondary adjustments of the force balance can occur in response to small, short-lasting slips, revealed as vibrations (detected mainly by Pacinian corpuscles).
The concept of incipient slippage was introduced to denote the micro-vibrations in the peripheral regions of the contact area that appear just before macroscopic slip occurs. Detecting such small vibrations is one of the main methods used to ensure grasp stability in robotic grasping. One proposed solution uses dynamic sensors (e.g. piezoelectric strips) to detect incipient slippage; another is based on measuring the friction coefficient between the object and the fingers.
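As an illustration, a minimal sketch of vibration-based incipient slip detection: band-pass the trace of a single dynamic sensor to isolate the micro-vibrations that precede macroscopic slip, then threshold their RMS energy. The band limits, sampling rate, and threshold below are illustrative assumptions, not values taken from the studies discussed here.

```python
import numpy as np

def detect_incipient_slip(force_signal, fs, band=(50.0, 400.0), threshold=0.02):
    """Flag incipient slip from a dynamic tactile sensor trace.

    Band-pass the force signal (via FFT masking) to isolate the
    micro-vibrations that precede macroscopic slip, then compare the
    RMS of the filtered signal against an empirically tuned threshold.
    Band and threshold are assumed values for the sketch.
    """
    n = len(force_signal)
    spectrum = np.fft.rfft(force_signal)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    # Zero out components outside the vibration band of interest.
    spectrum[(freqs < band[0]) | (freqs > band[1])] = 0.0
    vibrations = np.fft.irfft(spectrum, n)
    rms = np.sqrt(np.mean(vibrations ** 2))
    return rms > threshold
```

A slowly varying grip-force trace stays below the threshold, while a trace carrying a high-frequency vibration burst trips it.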
Slip detection may also be based on frequency-content analysis of the outputs of a sensor array, or, for example, on the combined use of strain-sensitive sensors and artificial neural networks. Another proposed solution is a fuzzy-based controller: its inputs are the relative velocity and acceleration between object and fingers, together with the frequency content of the force distribution on a capacitive sensor array obtained through fast Fourier transform (FFT) analysis, and its outputs are the closing speeds of the fingers. The main drawback of these methods is their high computational cost, which makes them unsuitable for real-time hand control and rapid reactions when the number of sensors is high.
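The flavour of such a fuzzy controller can be sketched with a minimal two-input, one-output rule base. The membership ranges, rule set, and output speeds below are assumptions chosen for illustration; the published controller uses more inputs (including acceleration) and a richer rule base.

```python
def ramp(x, lo, hi):
    # Saturating ramp membership: 0 below lo, 1 above hi, linear between.
    return min(max((x - lo) / (hi - lo), 0.0), 1.0)

def fuzzy_closing_speed(rel_velocity, vib_energy):
    """Map slip indicators to a finger closing speed (mm/s).

    Two-rule Mamdani-style sketch with singleton outputs and
    weighted-average defuzzification. All ranges are assumed.
    """
    slip_from_vel = ramp(rel_velocity, 0.0, 10.0)  # mm/s, assumed range
    slip_from_vib = ramp(vib_energy, 0.0, 1.0)     # a.u., assumed range
    # Rule activations: "slipping" if either indicator fires (max),
    # "stable" as its complement.
    slipping = max(slip_from_vel, slip_from_vib)
    stable = 1.0 - slipping
    FAST, SLOW = 20.0, 0.0                         # closing speeds, mm/s
    return (slipping * FAST + stable * SLOW) / (slipping + stable)
```

With no slip evidence the fingers stay still; as either indicator grows, the commanded closing speed rises smoothly toward its maximum, which is exactly the graded reaction a crisp threshold cannot provide.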
Recently, incipient slip has been detected through vision-based analysis of fingertip deformation. On the side of dexterous manipulation control, it was proposed to use dynamic sensing (PVDF strips) to detect tactile events, such as the onset or offset of contact between finger and object, micro-vibrations at the onset of a slip event, or external perturbations, each associated with transitions between specific phases of the manipulation task, thus introducing an event-driven grasp controller.
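The event-driven idea can be sketched as a small state machine in which tactile events trigger phase transitions and corrective actions. The phase names, transition table, and force increments are hypothetical simplifications of such a controller, not its actual design.

```python
from enum import Enum, auto

class Phase(Enum):
    REACH = auto()      # moving toward the object
    LOAD = auto()       # building up grip and load forces
    LIFT_HOLD = auto()  # object lifted and held
    RELEASE = auto()    # contact lost, opening the hand

# Hypothetical transition table: (current phase, tactile event) -> next phase.
TRANSITIONS = {
    (Phase.REACH, "contact_onset"): Phase.LOAD,
    (Phase.LOAD, "load_complete"): Phase.LIFT_HOLD,
    (Phase.LIFT_HOLD, "slip_vibration"): Phase.LIFT_HOLD,
    (Phase.LIFT_HOLD, "contact_offset"): Phase.RELEASE,
}

class EventDrivenGraspController:
    def __init__(self):
        self.phase = Phase.REACH
        self.grip_force = 0.0

    def on_event(self, event):
        key = (self.phase, event)
        if key == (Phase.REACH, "contact_onset"):
            self.grip_force = 1.0    # assumed initial grip force (N)
        elif key == (Phase.LIFT_HOLD, "slip_vibration"):
            self.grip_force *= 1.2   # assumed corrective grip increase
        self.phase = TRANSITIONS.get(key, self.phase)
        return self.phase
```

Unrecognized (phase, event) pairs leave the phase unchanged, so spurious sensor events outside their expected phase are ignored rather than derailing the task.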
For an artificial tactile system to be useful not only in robotic grasping but, more generally, for manipulation and human-robot interaction, localization of the stimulus is essential, which in turn calls for a dense, distributed array of tactile sensors.
While significant progress has been made in recent years in the fabrication and miniaturization of tactile sensors and sensor arrays, deploying them in robots for manipulation and/or human-robot interaction, with all the challenges that entails, remains largely unsolved. Slip detection is only one example of the sensor-array problems that still need to be addressed.