COMPARATIVE TRANSFER LEARNING FOR SIT AND STAND CLASSIFICATION: TOWARD SUSTAINABLE WORK-HEALTH SOLUTIONS

Kongfa Wamasing, Jermphiphut Jaruenpunyasak

Abstract


The development of sustainable health monitoring technologies is essential for improving patient care while reducing resource demands in healthcare systems. Such technologies are also crucial for addressing occupational health concerns among healthcare workers, who are at high risk of developing musculoskeletal disorders due to improper posture, prolonged standing, and incorrect sitting positions during work. Deep learning models have emerged as a promising tool for automatic posture classification and real-time monitoring of healthcare workers' activities. However, traditional deep learning approaches based on convolutional neural networks often require substantial computational resources. To address this challenge, our research explores the potential of transfer learning techniques for efficient and accurate classification of sitting and standing postures among healthcare workers. We compiled an online dataset of 4,000 annotated sitting and standing images covering a variety of hospital scenarios, and evaluated pre-trained deep learning architectures on this dataset to assess their transferability for posture classification. Our results demonstrated that transfer learning provided an efficient and sustainable solution, with all models exceeding 0.96 in accuracy, precision, recall, and F1-score. Notably, MobileNetV2 stood out as a highly efficient option, with only 3.5 million parameters and a weight size of 9 MB. In contrast, VGG16 and VGG19, despite achieving the highest classification performance, were impractical for real-time applications due to model sizes exceeding 500 MB. Our findings contribute to the development of sustainable work-health solutions by enabling the integration of efficient deep learning models into monitoring systems for real-world hospital environments.
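For illustration, the transfer learning setup described above can be sketched in Keras as follows. This is a minimal, hypothetical example assuming ImageNet-pretrained MobileNetV2 with a frozen backbone and a binary sit/stand head; the input size, hyperparameters, and dataset objects (train_ds, val_ds) are assumptions for illustration, not details taken from the paper.

    import tensorflow as tf
    from tensorflow.keras import layers, models

    # Pre-trained MobileNetV2 backbone (ImageNet weights), without its classifier head.
    base = tf.keras.applications.MobileNetV2(
        input_shape=(224, 224, 3), include_top=False, weights="imagenet")
    base.trainable = False  # freeze the feature extractor for transfer learning

    # Small classification head for the binary sit-vs-stand task.
    model = models.Sequential([
        base,
        layers.GlobalAveragePooling2D(),
        layers.Dropout(0.2),
        layers.Dense(1, activation="sigmoid"),
    ])

    model.compile(
        optimizer=tf.keras.optimizers.Adam(1e-4),
        loss="binary_crossentropy",
        metrics=["accuracy",
                 tf.keras.metrics.Precision(),
                 tf.keras.metrics.Recall()])

    # Images should be preprocessed with
    # tf.keras.applications.mobilenet_v2.preprocess_input before training.
    # train_ds and val_ds would be tf.data.Dataset objects of labeled posture images:
    # model.fit(train_ds, validation_data=val_ds, epochs=10)

The same head can be attached to VGG16 or VGG19 backbones for comparison; the trade-off noted in the abstract is that those backbones carry far more parameters and much larger weight files than MobileNetV2.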
