Sense of Touch in Robots with Self-Organizing Maps

Research output: Contribution to journal › Article

Abstract

We review a number of self-organizing robot systems that are able to extract features from haptic sensory information. They are all based on self-organizing maps (SOMs). First, we describe a number of systems based on a three-fingered robot hand, the Lund University Cognitive Science (LUCS) Haptic Hand II, that successfully extract the shapes of objects. These systems explore each object with a sequence of grasps and superimpose the information from the individual grasps after cross-coding the proprioceptive information for different parts of the hand with the registrations of the tactile sensors. The cross-coding is done by employing either the tensor-product operation or a novel self-organizing neural network called the tensor multiple peak SOM (T-MPSOM). Second, we present a system based on proprioception that uses an anthropomorphic robot hand, the LUCS Haptic Hand III. This system is able to distinguish objects according to both shape and size. Third, we present systems that are able to extract and combine the texture and hardness properties of explored materials.
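The combination of tensor-product cross-coding and SOM clustering mentioned in the abstract can be illustrated with a short sketch. The code below is a minimal, illustrative reconstruction, not the authors' implementation: it cross-codes a proprioceptive vector and a tactile vector with an outer (tensor) product and feeds the flattened result to a basic SOM. All array sizes, parameter values, and the toy grasp data are assumptions made for the example.

```python
# Minimal sketch (not the paper's implementation): tensor-product cross-coding
# of proprioceptive and tactile readings, followed by clustering with a basic
# self-organizing map (SOM). Sizes and hyperparameters are illustrative.
import numpy as np

rng = np.random.default_rng(0)

def cross_code(proprio, tactile):
    """Tensor-product cross-coding: outer product of the two sensor vectors,
    flattened into a single input pattern for the SOM."""
    return np.outer(proprio, tactile).ravel()

class SOM:
    def __init__(self, rows, cols, dim, lr=0.5, sigma=2.0):
        self.rows, self.cols = rows, cols
        self.weights = rng.random((rows * cols, dim))            # codebook vectors
        self.grid = np.array([(r, c) for r in range(rows) for c in range(cols)])
        self.lr, self.sigma = lr, sigma

    def bmu(self, x):
        """Index of the best-matching unit (closest codebook vector)."""
        return np.argmin(np.linalg.norm(self.weights - x, axis=1))

    def train(self, data, epochs=50):
        for t in range(epochs):
            lr = self.lr * np.exp(-t / epochs)                   # decaying learning rate
            sigma = self.sigma * np.exp(-t / epochs)             # shrinking neighbourhood
            for x in rng.permutation(data):
                b = self.bmu(x)
                d = np.linalg.norm(self.grid - self.grid[b], axis=1)
                h = np.exp(-(d ** 2) / (2 * sigma ** 2))         # neighbourhood function
                self.weights += lr * h[:, None] * (x - self.weights)

# Toy usage: twenty simulated "grasps", each with 4 proprioceptive and 6 tactile channels.
grasps = [cross_code(rng.random(4), rng.random(6)) for _ in range(20)]
som = SOM(rows=5, cols=5, dim=4 * 6)
som.train(np.array(grasps))
print("BMU of first grasp:", som.bmu(grasps[0]))
```

After training, grasps with similar cross-coded patterns map to nearby units on the grid, which is the property the reviewed systems exploit when superimposing information from a sequence of grasps.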

Details

Authors
Organisations
Research areas and keywords

Subject classification (UKÄ)

  • Computer Vision and Robotics (Autonomous Systems)

Keywords

  • Cognitive robotics, manipulators, self-organizing feature maps, tactile sensors, unsupervised learning
Original language: English
Pages (from-to): 498-507
Journal: IEEE Transactions on Robotics
Volume: 27
Issue number: 3
Publication status: Published - 2011
Publication category: Research
Peer-reviewed: Yes

Related projects

Birger Johansson, Christian Balkenius, Magnus Johnsson, Rasmus Bååth, Stefan Winberg, Zahra Gharaee & Trond Arild Tjøstheim

2001/01/01 → 2021/12/31

Project: Research
