
Corresponding dynamic appearances

Gong, Shaogang and Psarrou, Alexandra and Romdhani, Sami (2002) Corresponding dynamic appearances. Image and Vision Computing, 20 (4). pp. 307-318. ISSN 0262-8856

Full text not available from this repository.

Official URL: http://dx.doi.org/10.1016/S0262-8856(02)00025-2


Modelling the appearance of 3D objects undergoing large pose variation relies on recovering the correspondence of both shape and texture across views. The problem is hard because changes in pose not only introduce self-occlusions, and hence inconsistent 2D features between views, but also cause non-linear variations in both the shape and texture of object appearance. In this paper, we present an approach for establishing structured sparse correspondence between face images across views using non-linear shape models. We extend the non-linear shape models to dynamic appearance models of both shape and texture across views. For non-linear model transformation, we adopt Kernel PCA. For bootstrapping appearance alignment at different views, we introduce a generic-view shape template. We show that Kernel PCA constrains the dynamic appearance model and eases model fitting to novel images.
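The abstract's use of Kernel PCA as a non-linear transformation of shape vectors can be illustrated with a minimal sketch. This is not the authors' implementation: it assumes synthetic landmark-based shape vectors with a rotation-induced (pose-like) non-linear variation, and uses scikit-learn's KernelPCA with an RBF kernel purely for illustration.

```python
# Illustrative sketch (not the paper's implementation): Kernel PCA as a
# non-linear transformation of flattened 2D shape vectors.
# Assumptions: synthetic data, RBF kernel, scikit-learn's KernelPCA.
import numpy as np
from sklearn.decomposition import KernelPCA

rng = np.random.default_rng(0)

# Synthetic "shape" data: 200 samples of 10 landmark points (x, y),
# flattened to 20-dimensional shape vectors, with a non-linear,
# pose-like variation injected via a rotation angle per sample.
theta = rng.uniform(-np.pi / 3, np.pi / 3, size=200)
base = rng.normal(size=(10, 2))
shapes = np.stack([
    (base @ np.array([[np.cos(t), -np.sin(t)],
                      [np.sin(t),  np.cos(t)]])).ravel()
    for t in theta
])
shapes += rng.normal(scale=0.01, size=shapes.shape)

# RBF-kernel PCA projects the shapes into a low-dimensional non-linear
# subspace; fit_inverse_transform=True enables an approximate pre-image
# reconstruction, which a model-fitting loop could use as a shape prior.
kpca = KernelPCA(n_components=2, kernel="rbf", gamma=1.0,
                 fit_inverse_transform=True)
z = kpca.fit_transform(shapes)       # (200, 2) non-linear components
recon = kpca.inverse_transform(z)    # (200, 20) reconstructed shapes

print(z.shape)
print(recon.shape)
```

Constraining a fitted shape to the span of these kernel principal components is what, in the paper's terms, regularises the dynamic appearance model during fitting to novel images.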

Item Type: Article
Uncontrolled Keywords: View-based representation, Appearance models, The correspondence problem, Active shape models, Support vector machines, Kernel principal components analysis
Research Community: University of Westminster > Electronics and Computer Science, School of
ID Code: 532
Deposited On: 26 Sep 2005
Last Modified: 14 Oct 2009 12:45
