Automated detection of pain levels using deep feature extraction from shutter blinds‑based dynamic‑sized horizontal patches with facial images
Article Title | Automated detection of pain levels using deep feature extraction from shutter blinds‑based dynamic‑sized horizontal patches with facial images |
---|---|
ERA Journal ID | 201487 |
Article Category | Article |
Authors | Barua, Prabal Datta, Baygin, Nursena, Dogan, Sengul, Baygin, Mehmet, Arunkumar, N., Fujita, Hamido, Tuncer, Turker, Tan, Ru‑San, Palmer, Elizabeth, Azizan, Muhammad Mokhzaini Bin, Kadri, Nahrizul Adib and Acharya, U. Rajendra |
Journal Title | Scientific Reports |
Journal Citation | 12 (1) |
Article Number | 17297 |
Number of Pages | 13 |
Year | 2022 |
Publisher | Nature Publishing Group |
Place of Publication | United Kingdom |
ISSN | 2045-2322 |
Digital Object Identifier (DOI) | https://doi.org/10.1038/s41598-022-21380-4 |
Web Address (URL) | https://www.nature.com/articles/s41598-022-21380-4 |
Abstract | Pain intensity classification using facial images is a challenging problem in computer vision research. This work proposed a patch and transfer learning-based model to classify various pain intensities using facial images. The input facial images were segmented into dynamic-sized horizontal patches or “shutter blinds”. A lightweight deep network DarkNet19 pre-trained on ImageNet1K was used to generate deep features from the shutter blinds and the undivided resized segmented input facial image. The most discriminative features were selected from these deep features using iterative neighborhood component analysis, which were then fed to a standard shallow fine k-nearest neighbor classifier for classification using tenfold cross-validation. The proposed shutter blinds-based model was trained and tested on datasets derived from two public databases—University of Northern British Columbia-McMaster Shoulder Pain Expression Archive Database and Denver Intensity of Spontaneous Facial Action Database—which both comprised four pain intensity classes that had been labeled by human experts using validated facial action coding system methodology. Our shutter blinds-based classification model attained more than 95% overall accuracy rates on both datasets. The excellent performance suggests that the automated pain intensity classification model can be deployed to assist doctors in the non-verbal detection of pain using facial images in various situations (e.g., non-communicative patients or during surgery). This system can facilitate timely detection and management of pain. |
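The abstract's core preprocessing step is splitting each facial image into dynamic-sized horizontal strips ("shutter blinds") before deep feature extraction. The exact sizing rule is not stated in the abstract, so the sketch below is a minimal illustration only: it splits an image (represented as a list of pixel rows) into a chosen number of horizontal strips, with the remainder of the height absorbed by the last strip. The function name and sizing scheme are assumptions, not the authors' implementation.

```python
def shutter_blind_patches(image_rows, num_blinds=4):
    """Split an image (a list of pixel rows) into `num_blinds` horizontal
    strips ("shutter blinds"). The paper uses dynamic-sized patches; the
    near-equal split used here is a stand-in for illustration, since the
    abstract does not specify the sizing rule."""
    h = len(image_rows)
    base = h // num_blinds
    patches = []
    start = 0
    for i in range(num_blinds):
        # Last blind takes any leftover rows so the whole height is covered.
        end = start + base if i < num_blinds - 1 else h
        patches.append(image_rows[start:end])
        start = end
    return patches

# Toy 10-row "image": each row is a list of 8 pixel values.
img = [[r] * 8 for r in range(10)]
blinds = shutter_blind_patches(img, num_blinds=4)
print([len(p) for p in blinds])  # heights of each blind: [2, 2, 2, 4]
```

In the full pipeline described above, each such strip (plus the undivided resized face image) would then be passed through a pre-trained DarkNet19 network for feature extraction, followed by iterative neighborhood component analysis feature selection and a fine k-nearest neighbor classifier.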
Contains Sensitive Content | Does not contain sensitive content |
ANZSRC Field of Research 2020 | 400306. Computational physiology |
Byline Affiliations | Ngee Ann Polytechnic, Singapore; Singapore University of Social Sciences (SUSS), Singapore; Asia University, Taiwan; School of Business; University of Technology Sydney; Kafkas University, Turkey; Firat University, Turkey; Ardahan University, Turkey; Rathinam College of Engineering, India; HUTECH University of Technology, Vietnam; University of Granada, Spain; Iwate Prefectural University, Japan; National Heart Centre, Singapore; Duke-NUS Medical School, Singapore; Department of Health, New South Wales; University of New South Wales; Islamic Science University of Malaysia, Malaysia; University of Malaya, Malaysia |
https://research.usq.edu.au/item/yyq0x/automated-detection-of-pain-levels-using-deep-feature-extraction-from-shutter-blinds-based-dynamic-sized-horizontal-patches-with-facial-images