Template based human pose and shape estimation from a single RGB-D image

Research output: Chapter in Book/Report/Conference proceeding › Conference paper in proceeding › Peer-reviewed

Abstract

Estimating a 3D model of the human body is needed for many applications. It is, however, a challenging problem, since the human body is inherently complex because of self-occlusion and articulation. We present a method to reconstruct a 3D human body model from a single RGB-D image. 2D joint points are first predicted by a CNN-based model, the convolutional pose machine, and the 3D joint points are then computed from the depth image. We propose to use both the 2D and the 3D joint points, which together provide more information, to fit a parametric body model (SMPL). The fit is obtained by minimizing an objective function that measures the difference between the observed joint points and those of the parametric model. The pose and shape parameters of the body are obtained through this optimization, yielding the final 3D model. Experiments on synthetic and real data demonstrate that our method estimates the 3D human body model correctly.
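As an illustration of the fitting step described in the abstract, the sketch below shows one way such a joint-based objective could be set up: it minimizes the squared differences between observed 2D/3D joints and the joints produced by a parametric body model, plus a shape regularizer. This is a minimal sketch, not the authors' implementation; `smpl_joints`, the camera intrinsics `K`, the loss weights, and the synthetic observations are all hypothetical placeholders for the SMPL model and the measurements coming from the convolutional pose machine and the depth image.

```python
# Minimal sketch of a joint-based fitting objective (not the authors' code).
# `smpl_joints` is a hypothetical stand-in for the SMPL joint regressor;
# K, the loss weights, and the toy observations are assumptions.
import numpy as np
from scipy.optimize import minimize

N_JOINTS, N_POSE, N_SHAPE = 24, 72, 10   # SMPL: 24 joints, 72 pose params, 10 shape params

def smpl_joints(pose, shape):
    """Placeholder for the SMPL joint function J(pose, shape) -> (24, 3)."""
    rng = np.random.default_rng(0)                              # fixed toy bases
    rest = rng.normal(size=(N_JOINTS, 3)) * 0.5
    rest[:, 2] += 3.0                                           # keep joints in front of the camera
    pose_basis = rng.normal(size=(N_POSE, N_JOINTS, 3)) * 0.01
    shape_basis = rng.normal(size=(N_SHAPE, N_JOINTS, 3)) * 0.01
    return rest + np.tensordot(pose, pose_basis, 1) + np.tensordot(shape, shape_basis, 1)

def project(points, K):
    """Pinhole projection of (N, 3) camera-frame points to (N, 2) pixel coordinates."""
    uvw = points @ K.T
    return uvw[:, :2] / uvw[:, 2:3]

def objective(params, joints_2d, joints_3d, K, w2d=1.0, w3d=1.0, w_reg=1e-3):
    """Sum of squared 2D and 3D joint residuals plus a shape regularizer."""
    pose, shape = params[:N_POSE], params[N_POSE:]
    model_joints = smpl_joints(pose, shape)
    e2d = project(model_joints, K) - joints_2d                  # 2D joints from the CNN
    e3d = model_joints - joints_3d                              # 3D joints from the depth image
    return w2d * np.sum(e2d**2) + w3d * np.sum(e3d**2) + w_reg * np.sum(shape**2)

# Toy usage: observations are generated from the placeholder model itself.
K = np.array([[500.0, 0.0, 320.0], [0.0, 500.0, 240.0], [0.0, 0.0, 1.0]])
joints_3d_obs = smpl_joints(np.zeros(N_POSE), np.zeros(N_SHAPE))
joints_2d_obs = project(joints_3d_obs, K)
x0 = 0.1 * np.ones(N_POSE + N_SHAPE)                            # perturbed initial pose/shape
res = minimize(objective, x0, args=(joints_2d_obs, joints_3d_obs, K), method="L-BFGS-B")
print("final objective:", res.fun)
```

In this toy setup the optimum is the zero pose and shape used to generate the observations; with real data the two residual terms would be weighted and regularized according to the noise in the 2D detections and the depth measurements.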

Original language: English
Title of host publication: ICPRAM 2019 - Proceedings of the 8th International Conference on Pattern Recognition Applications and Methods
Editors: Ana Fred, Maria De Marsico, Gabriella Sanniti di Baja
Publisher: SciTePress
Pages: 574-581
Number of pages: 8
ISBN (electronic): 9789897583513
DOI
Status: Published - 2019
Event: 8th International Conference on Pattern Recognition Applications and Methods, ICPRAM 2019 - Prague, Czech Republic
Duration: 19 Feb 2019 - 21 Feb 2019

Conference

Conference: 8th International Conference on Pattern Recognition Applications and Methods, ICPRAM 2019
Country/Territory: Czech Republic
City: Prague
Period: 2019/02/19 - 2019/02/21

Subject classification (UKÄ)

  • Computer Vision and Robotics (Autonomous Systems)
