Please use this identifier to cite or link to this item: http://repositorio.unicamp.br/jspui/handle/REPOSIP/105436
Type: Journal article
Title: Facial Movement Analysis in ASL
Author: Vogler, C.
Goldenstein, S.
Abstract: In the age of speech and voice recognition technologies, sign language recognition is an essential part of ensuring equal access for deaf people. To date, sign language recognition research has mostly ignored facial expressions that arise as part of a natural sign language discourse, even though they carry important grammatical and prosodic information. One reason is that tracking the motion and dynamics of expressions in human faces from video is a hard task, especially with the high number of occlusions from the signers' hands. This paper presents a 3D deformable model tracking system to address this problem, and applies it to sequences of native signers, taken from the National Center for Sign Language and Gesture Resources (NCSLGR), with a special emphasis on outlier rejection methods to handle occlusions. The experiments conducted in this paper validate the output of the face tracker against expert human annotations of the NCSLGR corpus, demonstrate the promise of the proposed face tracking framework for sign language data, and reveal that the tracking framework picks up properties that ideally complement human annotations for linguistic research. © Springer-Verlag 2007.
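The abstract's emphasis on outlier rejection for handling hand-over-face occlusions can be illustrated with a generic robust-statistics sketch. The Python below is only an assumption-laden illustration, not the method described in the paper: the function name, the median-absolute-deviation test, and the threshold k are hypothetical. It shows how 2D feature correspondences with unusually large residuals, as occur when the hands occlude the face, might be flagged before updating a 3D deformable model.

import numpy as np

def reject_outliers(predicted, observed, k=3.0):
    # predicted, observed: (N, 2) arrays of 2D feature positions
    # (model projection vs. tracker measurement); k is a hypothetical
    # cutoff in units of the median absolute deviation (MAD).
    residuals = np.linalg.norm(predicted - observed, axis=1)
    med = np.median(residuals)
    mad = np.median(np.abs(residuals - med)) + 1e-9  # guard against zero MAD
    return np.abs(residuals - med) / mad <= k        # boolean inlier mask

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pred = rng.uniform(0.0, 100.0, size=(50, 2))
    obs = pred + rng.normal(0.0, 0.5, size=(50, 2))  # well-tracked features
    obs[:5] += 30.0                                  # simulate features lost to occlusion
    mask = reject_outliers(pred, obs)
    print(f"kept {mask.sum()} of {len(mask)} correspondences")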
Rights: closed access
Identifier DOI: 10.1007/s10209-007-0096-6
Address: http://www.scopus.com/inward/record.url?eid=2-s2.0-38949213461&partnerID=40&md5=fc29d87ee7a93002096539f5a9d82583
Issue Date: 2008
Appears in Collections: Unicamp - Artigos e Outros Documentos

Files in This Item:
File                     Size       Format
2-s2.0-38949213461.pdf   494.1 kB   Adobe PDF


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.