Copyright notice
The copyright of the original papers published on this site belongs to IEICE. Unauthorized use of the original or translated papers is prohibited. See IEICE Provisions on Copyright for details.
Dadet PRAMADIHANTO, Yoshio IWAI, Masahiko YACHIDA, "Integrated Person Identification and Expression Recognition from Facial Images," in IEICE TRANSACTIONS on Information,
vol. E84-D, no. 7, pp. 856-866, July 2001.
Abstract: In this paper, we propose an integration of face identification and facial expression recognition. A face is modeled as a graph where the nodes represent facial feature points. This model is used for automatic face and facial feature point detection, and facial feature points are tracked by applying flexible feature matching. Face identification is performed by comparing the graph representing the input face image with individual face models. Facial expression is modeled by finding the relationship between the motion of facial feature points and expression change. Individual and average expression models are generated and then used to identify facial expressions under appropriate categories and the degree of expression change. The expression model used for facial expression recognition is chosen according to the results of face identification.
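The abstract outlines a graph-based pipeline: facial feature points are detected, tracked by flexible feature matching, and the resulting face graph is compared against stored individual models for identification. The Python sketch below illustrates only the graph-comparison step under assumed simplifications (corresponding nodes are already in the same order, and each node carries a 2D position plus a hypothetical local descriptor); it is an illustration, not the authors' implementation.

import numpy as np

class FaceGraph:
    # A face as a graph: node i has a 2D position and a local feature vector.
    def __init__(self, positions, features):
        self.positions = np.asarray(positions, dtype=float)  # shape (N, 2)
        self.features = np.asarray(features, dtype=float)    # shape (N, D)

def graph_similarity(model, observed, alpha=0.5):
    # Score how well an observed face graph matches an individual face model:
    # mean cosine similarity of node descriptors minus a penalty for
    # geometric distortion of the node layout (overall translation removed).
    f1, f2 = model.features, observed.features
    cos = np.sum(f1 * f2, axis=1) / (
        np.linalg.norm(f1, axis=1) * np.linalg.norm(f2, axis=1) + 1e-12)
    d1 = model.positions - model.positions.mean(axis=0)
    d2 = observed.positions - observed.positions.mean(axis=0)
    distortion = np.linalg.norm(d1 - d2, axis=1).mean()
    return cos.mean() - alpha * distortion

def identify(observed, individual_models):
    # Face identification: return the key of the individual model whose
    # graph best matches the observed face graph.
    return max(individual_models,
               key=lambda name: graph_similarity(individual_models[name], observed))

Expression recognition would then relate the tracked displacements of these nodes to expression categories using the individual (or average) expression model selected by the identification result, as described in the abstract; that step is omitted from the sketch.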
URL: https://global.ieice.org/en_transactions/information/10.1587/e84-d_7_856/_p
@ARTICLE{e84-d_7_856,
author={Dadet PRAMADIHANTO and Yoshio IWAI and Masahiko YACHIDA},
journal={IEICE TRANSACTIONS on Information},
title={Integrated Person Identification and Expression Recognition from Facial Images},
year={2001},
volume={E84-D},
number={7},
pages={856-866},
abstract={In this paper, we propose an integration of face identification and facial expression recognition. A face is modeled as a graph where the nodes represent facial feature points. This model is used for automatic face and facial feature point detection, and facial feature points are tracked by applying flexible feature matching. Face identification is performed by comparing the graph representing the input face image with individual face models. Facial expression is modeled by finding the relationship between the motion of facial feature points and expression change. Individual and average expression models are generated and then used to identify facial expressions under appropriate categories and the degree of expression change. The expression model used for facial expression recognition is chosen according to the results of face identification.},
keywords={},
doi={},
ISSN={},
month={July},}
TY - JOUR
TI - Integrated Person Identification and Expression Recognition from Facial Images
T2 - IEICE TRANSACTIONS on Information
SP - 856
EP - 866
AU - Dadet PRAMADIHANTO
AU - Yoshio IWAI
AU - Masahiko YACHIDA
PY - 2001
DO -
JO - IEICE TRANSACTIONS on Information
SN -
VL - E84-D
IS - 7
JA - IEICE TRANSACTIONS on Information
Y1 - July 2001
AB - In this paper, we propose an integration of face identification and facial expression recognition. A face is modeled as a graph where the nodes represent facial feature points. This model is used for automatic face and facial feature point detection, and facial feature points are tracked by applying flexible feature matching. Face identification is performed by comparing the graph representing the input face image with individual face models. Facial expression is modeled by finding the relationship between the motion of facial feature points and expression change. Individual and average expression models are generated and then used to identify facial expressions under appropriate categories and the degree of expression change. The expression model used for facial expression recognition is chosen according to the results of face identification.
ER -