Explainable deep learning ensemble for food image analysis on edge devices

Tahir, Ghalib Ahmed and Loo, Chu Kiong (2021) Explainable deep learning ensemble for food image analysis on edge devices. Computers in Biology and Medicine, 139. ISSN 0010-4825, DOI https://doi.org/10.1016/j.compbiomed.2021.104972.

Full text not available from this repository.


Food recognition systems have recently garnered much research attention owing to their ability to obtain objective measurements of dietary intake, a capability that supports the management of various chronic conditions. Challenges such as inter- and intra-class variation, together with the practical constraints of smart glasses, wearable cameras, and mobile devices, call for resource-efficient food recognition models with high classification performance. Furthermore, explainable AI is crucial in health-related domains, as it characterizes model behaviour and enhances transparency and objectivity. Our proposed architecture addresses these challenges by drawing on transfer learning, initializing MobileNetV3 with weights pre-trained on ImageNet. MobileNetV3 achieves superior performance through its squeeze-and-excitation strategy, which assigns unequal weights to different input channels, in contrast to the equal weighting used in other variants. Although fast and efficient, it can, like other deep neural networks, become stuck in local optima, reducing the desired classification performance of the model. We overcome this issue with the snapshot ensemble approach, which yields M models in a single training process without any increase in the required training time: each snapshot in the ensemble visits a different local minimum before converging to its final solution, enhancing recognition performance. To address the challenge of explainability, we argue that explanations cannot be monolithic, since each stakeholder perceives the results and their explanations according to different objectives and aims. We therefore propose a user-centred explainable artificial intelligence (AI) framework that increases the trust of the involved parties by inferring and rationalizing results according to user needs and profiles.
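The squeeze-and-excitation mechanism described above can be sketched in a few lines of plain Python. This is a minimal illustration, not the paper's implementation: the weight matrices `w1` and `w2` are illustrative stand-ins for the learned excitation parameters, and feature maps are represented as flat lists of activations per channel.

```python
import math

def se_block(feature_maps, w1, w2):
    """Squeeze-and-excitation sketch: squeeze each channel to a scalar
    by global average pooling, pass the scalars through a small
    bottleneck (w1 -> ReLU -> w2 -> sigmoid), and rescale every channel
    by its learned gate -- so channels contribute unequal weight."""
    # Squeeze: one scalar per channel (global average pooling).
    z = [sum(ch) / len(ch) for ch in feature_maps]
    # Excitation: bottleneck FC layer with ReLU...
    h = [max(0.0, sum(zi * w for zi, w in zip(z, row))) for row in w1]
    # ...then an expanding FC layer with sigmoid gating.
    s = [1.0 / (1.0 + math.exp(-sum(hi * w for hi, w in zip(h, row))))
         for row in w2]
    # Rescale: each channel is multiplied by its gate in [0, 1].
    return [[x * si for x in ch] for ch, si in zip(feature_maps, s)]
```

With a gate driven toward 1 for one channel and 0 for another, the block passes the first channel through almost unchanged while suppressing the second, which is the "unequal weighting" the abstract contrasts with earlier MobileNet variants.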
Our framework is comprehensive as a dietary assessment app, detecting Food/Non-Food, food categories, and ingredients. Experimental results on standard food benchmarks and a newly contributed Malaysian food dataset for ingredient detection demonstrate superior performance on an integrated set of measures compared with other methodologies.
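The snapshot ensemble training mentioned above is commonly implemented with a cyclic cosine-annealed learning rate (as in Huang et al.'s snapshot-ensembles formulation): the rate restarts at its maximum at the start of each cycle and decays towards zero, and the weights are saved at the end of every cycle so each snapshot settles into a different local minimum. A minimal sketch of that schedule, with illustrative parameter values:

```python
import math

def snapshot_lr(step, total_steps, n_cycles, lr_max):
    """Cyclic cosine-annealing learning-rate schedule used for
    snapshot ensembling: within each of n_cycles cycles the rate
    falls from lr_max to ~0, then restarts at lr_max."""
    cycle_len = math.ceil(total_steps / n_cycles)
    t = step % cycle_len  # position within the current cycle
    return (lr_max / 2.0) * (math.cos(math.pi * t / cycle_len) + 1.0)

# One snapshot (a copy of the weights) would be saved at the end of
# each cycle; at inference the M snapshots' predictions are averaged.
schedule = [snapshot_lr(s, total_steps=300, n_cycles=3, lr_max=0.1)
            for s in range(300)]
```

Because only one training run is performed and snapshots are free by-products of the restarts, the ensemble of M models costs no more training time than a single model, which is what makes the approach attractive on edge devices.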

Item Type: Article
Funders: Universiti Malaya [RK012-2019], University Malaya, Malaysia [IIRG002C-19HWB], Ministry of Energy, Science, Technology, Environment and Climate Change (MESTECC), Malaysia [IF0318M1006], ONRG, United States [IF017-2018]
Uncontrolled Keywords: User-centred explainable AI; Food recognition; Deep learning; Neural network; Ensemble learning; Mobile application; Data augmentation; Explainable AI
Subjects: Q Science > Q Science (General)
R Medicine > R Medicine (General)
T Technology > T Technology (General)
Divisions: Faculty of Computer Science & Information Technology
Depositing User: Ms Zaharah Ramly
Date Deposited: 14 Apr 2022 01:53
Last Modified: 14 Apr 2022 01:53
URI: http://eprints.um.edu.my/id/eprint/26898
