Sense of Touch in Robots with Self-Organizing Maps

Magnus Johnsson, Christian Balkenius

Research output: Contribution to journal › Article › peer-review



We review a number of self-organizing robot systems that extract features from haptic sensory information. All of them are based on self-organizing maps (SOMs). First, we describe systems built on a three-fingered robot hand, the Lund University Cognitive Science (LUCS) Haptic Hand II, that successfully extract the shapes of objects. These systems explore each object with a sequence of grasps, superimposing the information from individual grasps after cross-coding the proprioceptive information from different parts of the hand with the registrations of the tactile sensors. The cross-coding is done either with the tensor-product operation or with a novel self-organizing neural network called the tensor multiple-peak SOM (T-MPSOM). Second, we present a system based on proprioception that uses an anthropomorphic robot hand, the LUCS Haptic Hand III. This system distinguishes objects by both shape and size. Third, we present systems that extract and combine the texture and hardness properties of explored materials.
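To make the two core operations named in the abstract concrete, the sketch below shows a toy 1-D SOM trained with a shrinking Gaussian neighborhood, plus tensor-product cross-coding as a flattened outer product of two activity vectors. This is a minimal illustration only, not the paper's implementation: the function names, grid size, and learning schedule are our own assumptions, and the actual systems use 2-D maps over haptic and proprioceptive inputs.

```python
import math
import random

def train_som(data, n_units=10, epochs=50, lr0=0.5, sigma0=3.0, seed=0):
    """Train a toy 1-D SOM (hypothetical parameters, not the paper's)."""
    rng = random.Random(seed)
    dim = len(data[0])
    weights = [[rng.random() for _ in range(dim)] for _ in range(n_units)]
    for epoch in range(epochs):
        # Learning rate and neighborhood width both decay over training.
        lr = lr0 * (1 - epoch / epochs)
        sigma = max(sigma0 * (1 - epoch / epochs), 0.5)
        for x in data:
            # Best-matching unit: smallest squared Euclidean distance to input.
            bmu = min(range(n_units),
                      key=lambda i: sum((w - v) ** 2
                                        for w, v in zip(weights[i], x)))
            for i in range(n_units):
                # Gaussian neighborhood around the BMU on the 1-D grid.
                h = math.exp(-((i - bmu) ** 2) / (2 * sigma ** 2))
                weights[i] = [w + lr * h * (v - w)
                              for w, v in zip(weights[i], x)]
    return weights

def cross_code(a, b):
    """Tensor-product cross-coding: flattened outer product of two
    activity vectors, yielding one joint code (cf. the abstract's
    cross-coding of proprioceptive and tactile registrations)."""
    return [ai * bj for ai in a for bj in b]
```

For example, `cross_code([1, 2], [3, 4])` yields `[3, 4, 6, 8]`: every pairwise product of the two inputs, so the joint code preserves which combinations of activities co-occurred, which is what makes superimposing information from separate channels possible.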
Original language: English
Pages (from-to): 498-507
Journal: IEEE Transactions on Robotics
Issue number: 3
Publication status: Published - 2011

Subject classification (UKÄ)

  • Computer Vision and Robotics (Autonomous Systems)


Keywords

  • Cognitive robotics
  • manipulators
  • self-organizing feature maps
  • tactile sensors
  • unsupervised learning


