A new context-based feature for classification of emotions in photographs

Krishnani, Divya and Shivakumara, Palaiahnakote and Lu, Tong and Pal, Umapada and Lopresti, Daniel and Kumar, Govindaraju Hemantha (2021) A new context-based feature for classification of emotions in photographs. Multimedia Tools and Applications, 80 (10). pp. 15589-15618. ISSN 1380-7501, DOI https://doi.org/10.1007/s11042-020-10404-8.

Full text not available from this repository.
Official URL: https://doi.org/10.1007/s11042-020-10404-8

Abstract

A high volume of images is shared on the public Internet each day. Many of these are photographs of people whose facial expressions and actions display various emotions. In this work, we examine the problem of classifying broad categories of emotions in such images, including Bullying, Mildly Aggressive, Very Aggressive, Unhappy, Disdain and Happy. To address this, we propose Context-based Features for Classification of Emotions in Photographs (CFCEP). The proposed method first detects faces as foreground components and treats the remaining (non-face) information as background components in order to extract context features. Next, for each foreground and background component, we explore the Hanman transform to study local variations within the components. The method combines the Hanman transform (H) values of the foreground and background components according to their merits, which results in two feature vectors. These two feature vectors are then fused by deriving weights, yielding a single feature vector. This feature vector is fed to a CNN classifier to classify images of different emotions uploaded to social media and the public Internet. Experimental results on our dataset of different emotion classes and on benchmark datasets show that the proposed method is effective in terms of average classification rate: it achieves 91.7% on our 10-class dataset, 92.3% on the 5-class standard dataset, and 81.4% on the FERPlus dataset. In addition, a comparative study with existing methods on the 5-class benchmark dataset, the standard facial expression dataset (FERPlus), and another 10-class dataset shows that the proposed method is superior in terms of scalability and robustness.
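The abstract describes the CFCEP pipeline only at a high level. The sketch below (Python with OpenCV and NumPy) illustrates that flow under stated assumptions: `hanman_like_energy` is a simplified, entropy-style stand-in for the Hanman transform (not the paper's formulation), the Haar cascade face detector, block layout, and energy-proportional fusion weights are illustrative choices, and the fused vector would then be passed to a CNN classifier as in the paper.

```python
# Hypothetical sketch of a CFCEP-style feature pipeline.
# The Hanman-transform surrogate, cascade file, block grid, and fusion
# weights are assumptions for illustration, not the authors' implementation.
import cv2
import numpy as np


def hanman_like_energy(pixels, alpha=0.5):
    """Exponential information-gain measure over a pixel region.
    Simplified stand-in for the Hanman transform: sum of p * exp(-alpha * p)
    over the normalised intensity histogram (assumption)."""
    hist, _ = np.histogram(pixels, bins=256, range=(0, 256))
    p = hist / (hist.sum() + 1e-8)
    return float(np.sum(p * np.exp(-alpha * p)))


def cfcep_features(image_bgr, n_blocks=4):
    """Split the image into face (foreground) and non-face (background)
    regions, compute block-wise energies for each, and fuse the two
    vectors with weights derived from their relative total energy."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

    # Foreground mask covers detected face boxes; background is the rest.
    fg_mask = np.zeros(gray.shape, dtype=bool)
    for (x, y, w, h) in faces:
        fg_mask[y:y + h, x:x + w] = True

    def block_energies(mask):
        h, w = gray.shape
        feats = []
        for i in range(n_blocks):
            for j in range(n_blocks):
                block = gray[i * h // n_blocks:(i + 1) * h // n_blocks,
                             j * w // n_blocks:(j + 1) * w // n_blocks]
                bmask = mask[i * h // n_blocks:(i + 1) * h // n_blocks,
                             j * w // n_blocks:(j + 1) * w // n_blocks]
                region = block[bmask]
                feats.append(hanman_like_energy(region) if region.size else 0.0)
        return np.array(feats)

    fg = block_energies(fg_mask)    # foreground (face) feature vector
    bg = block_energies(~fg_mask)   # background (non-face) feature vector

    # Weighted fusion: weight proportional to each vector's total energy
    # (illustrative; the paper derives its own fusion weights).
    w_fg = fg.sum() / (fg.sum() + bg.sum() + 1e-8)
    return w_fg * fg + (1.0 - w_fg) * bg  # fused vector, input to a CNN classifier
```

In this sketch the fused vector has `n_blocks * n_blocks` dimensions; in practice it would be stacked over a labelled image set and used to train the CNN classifier mentioned in the abstract.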

Item Type: Article
Funders: UNSPECIFIED
Uncontrolled Keywords: Social networking; Face detection; Hanman transform; Person emotions; Personality behavior
Subjects: T Technology > T Technology (General)
Divisions: Faculty of Computer Science & Information Technology > Department of Computer System & Technology
Depositing User: Ms. Juhaida Abd Rahim
Date Deposited: 16 Feb 2022 03:53
Last Modified: 16 Feb 2022 07:05
URI: http://eprints.um.edu.my/id/eprint/26170
