Using GNG to improve 3D feature extraction - Application to 6DoF Egomotion

Research output: Contribution to journal › Article


Abstract
Several recent works deal with 3D data in mobile robotics problems, e.g. mapping or egomotion. The data may come from any kind of sensor, such as stereo vision systems, time-of-flight cameras or 3D lasers, which provide a huge amount of unorganized 3D data. In this paper, we describe an efficient method to build complete 3D models using a Growing Neural Gas (GNG). The GNG is applied to the raw 3D data and reduces both the underlying error and the number of points while preserving the topology of the 3D data. The GNG output is then used as input to a 3D feature extraction method. We have performed a thorough study in which we quantitatively show that the use of GNG improves the 3D feature extraction method, and we show that our method can be applied to any kind of 3D data. The extracted 3D features are used as input to an Iterative Closest Point (ICP)-like method to compute the 6DoF movement performed by a mobile robot. A comparison with standard ICP shows that the use of GNG improves the results. Final 3D mapping results computed from the estimated egomotion are also shown. (C) 2012 Elsevier Ltd. All rights reserved.
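The GNG stage described above can be sketched with a generic, minimal implementation of Fritzke's Growing Neural Gas fitted to an unorganized point cloud. This is a simplified illustration, not the authors' implementation: all hyperparameter values (`max_nodes`, `eps_b`, `age_max`, etc.) are assumptions for demonstration, and node deletion is omitted for brevity.

```python
import numpy as np

def grow_neural_gas(points, max_nodes=50, iters=5000, eps_b=0.05, eps_n=0.005,
                    age_max=50, lam=100, alpha=0.5, d=0.995, seed=0):
    """Minimal GNG sketch: fits a small graph of nodes to a point cloud,
    reducing the number of points while preserving topology via edges.
    Hyperparameters are illustrative, not taken from the paper."""
    rng = np.random.default_rng(seed)
    # Start with two nodes drawn from the data.
    nodes = [points[rng.integers(len(points))].astype(float),
             points[rng.integers(len(points))].astype(float)]
    error = [0.0, 0.0]
    edges = {}  # (i, j) with i < j -> edge age

    def edge_key(i, j):
        return (min(i, j), max(i, j))

    for t in range(1, iters + 1):
        x = points[rng.integers(len(points))]  # random input sample
        # Find the two nearest nodes (winner s1, runner-up s2).
        dists = [np.sum((x - w) ** 2) for w in nodes]
        s1, s2 = np.argsort(dists)[:2]
        error[s1] += dists[s1]
        # Move the winner and its topological neighbours toward x,
        # and age the edges incident to the winner.
        nodes[s1] += eps_b * (x - nodes[s1])
        for (i, j) in list(edges):
            if s1 in (i, j):
                other = j if i == s1 else i
                nodes[other] += eps_n * (x - nodes[other])
                edges[(i, j)] += 1
        edges[edge_key(s1, s2)] = 0  # create/refresh winner edge
        # Drop edges that are too old (isolated-node removal omitted).
        edges = {k: a for k, a in edges.items() if a <= age_max}
        # Every lam steps, insert a node between the highest-error node
        # and its highest-error neighbour.
        if t % lam == 0 and len(nodes) < max_nodes:
            q = int(np.argmax(error))
            nbrs = [j if i == q else i for (i, j) in edges if q in (i, j)]
            if nbrs:
                f = max(nbrs, key=lambda n: error[n])
                nodes.append(0.5 * (nodes[q] + nodes[f]))
                error[q] *= alpha
                error[f] *= alpha
                error.append(error[q])
                r = len(nodes) - 1
                edges.pop(edge_key(q, f), None)
                edges[edge_key(q, r)] = 0
                edges[edge_key(f, r)] = 0
        error = [e * d for e in error]  # global error decay
    return np.array(nodes), list(edges)
```

The returned nodes form a reduced, noise-filtered representation of the cloud, and the edge list encodes its topology; in the paper's pipeline, a representation of this kind feeds the 3D feature extraction and the ICP-like 6DoF registration.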


Authors
  • Diego Viejo
  • Jose Garcia
  • Miguel Cazorla
  • David Gil
  • Magnus Johnsson
External organisations
  • University of Alicante
Research areas and keywords

Subject classification (UKÄ)

  • Medical Biotechnology
  • Computer Vision and Robotics (Autonomous Systems)


Keywords
  • GNG, Egomotion, 3D feature extraction, 6DoF registration
Original language: English
Pages (from-to): 138-146
Journal: Neural Networks
Publication status: Published - 2012
Publication category: Research
