Autism classification and monitoring from predicted categorical and dimensional emotions of video features

Khor, Stephen Wen Hwooi and Md Sabri, Aznul Qalid and Othmani, Alice (2024) Autism classification and monitoring from predicted categorical and dimensional emotions of video features. Signal, Image and Video Processing, 18 (1). 191–198. ISSN 1863-1703, DOI https://doi.org/10.1007/s11760-023-02699-5.

Full text not available from this repository.

Abstract

Autism in children has been increasing at an alarming rate over the years, and currently 1% of children struggle with this disorder. It can be better managed via early diagnosis and treatment. Autistic children are characterised by deficiencies in communicative and social capabilities and are most commonly identified by their stimming behaviours. It is therefore helpful to understand their emotions when they exhibit this type of behaviour. However, most current affect recognition approaches focus exclusively on predicting either basic categories of emotion or continuous emotions. We propose an approach that maps basic categories of emotion to continuous dimensional emotions, opening more avenues for understanding the emotions of autistic children. In our approach, we first predict the basic emotion category with a convolutional neural network, followed by continuous emotion prediction by a deep regression model. Moreover, our method is deployed as a web application for visual video monitoring. For autism analysis, we performed image-based and video-based classification of stimming behaviours using the extracted behavioural and emotional features. Our emotion classifier achieved a competitive F1-score, while our regression model performed excellently in terms of CCC and RMSE compared with existing methods. Image-based analysis of autism did not yield meaningful classification when using emotional features, but it provided useful cues when dealing with textural features. In video-based autism analysis, our chosen clustering algorithm was able to classify stimming behaviours into different clusters, each demonstrating a dominant emotion category. © 2023, The Author(s), under exclusive licence to Springer-Verlag London Ltd., part of Springer Nature.
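The two-stage pipeline described in the abstract (a CNN predicting a basic emotion category, followed by a deep regression model predicting continuous dimensional emotions) could be sketched roughly as below. This is a minimal illustration under assumed choices; the layer sizes, the seven-class emotion set, and the valence/arousal outputs are placeholders, not details taken from the paper.

```python
# Minimal sketch (assumption): a CNN classifies a face crop into a basic emotion
# category, and a small regression head maps its features plus the predicted
# category distribution to continuous valence/arousal values.
import torch
import torch.nn as nn

NUM_EMOTIONS = 7  # hypothetical set of basic emotion categories


class EmotionCNN(nn.Module):
    """Categorical emotion classifier over face crops (e.g. 64x64 grayscale)."""
    def __init__(self, num_classes=NUM_EMOTIONS):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(64, num_classes)

    def forward(self, x):
        f = self.features(x).flatten(1)   # (batch, 64) feature vector
        return self.classifier(f), f      # logits plus features for the next stage


class DimensionalRegressor(nn.Module):
    """Maps CNN features and the category distribution to valence/arousal."""
    def __init__(self, feat_dim=64, num_classes=NUM_EMOTIONS):
        super().__init__()
        self.head = nn.Sequential(
            nn.Linear(feat_dim + num_classes, 32), nn.ReLU(),
            nn.Linear(32, 2),  # two continuous outputs: valence, arousal
        )

    def forward(self, features, class_probs):
        return self.head(torch.cat([features, class_probs], dim=1))


# Example inference on a dummy batch of frames
cnn, reg = EmotionCNN(), DimensionalRegressor()
frames = torch.randn(4, 1, 64, 64)      # 4 dummy face crops
logits, feats = cnn(frames)
probs = logits.softmax(dim=1)           # categorical emotion distribution
valence_arousal = reg(feats, probs)     # continuous (dimensional) emotions
print(valence_arousal.shape)            # torch.Size([4, 2])
```

Per-frame outputs like these (categorical distribution plus dimensional values) could then serve as the emotional features the abstract mentions for downstream image- and video-based analysis of stimming behaviours, e.g. clustering over video segments.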

Item Type: Article
Funders: Ministry of Higher Education, Malaysia [Grant no. 1000/016/018/25], Universiti Malaya [Grant no. KKP002-2021]
Uncontrolled Keywords: Affect representation; ASD children monitoring; ASD diagnosis; Autism spectrum disorder; Deep learning; Emotion recognition; Facial expression quantification
Subjects: Q Science > QA Mathematics > QA75 Electronic computers. Computer science
Divisions: Faculty of Computer Science & Information Technology > Department of Artificial Intelligence
Depositing User: Ms. Juhaida Abd Rahim
Date Deposited: 18 Jun 2024 07:52
Last Modified: 18 Jun 2024 07:52
URI: http://eprints.um.edu.my/id/eprint/44856
