Identification of Fatigue from Facial Expressions Using Transfer Learning
DOI: https://doi.org/10.69616/mcs.v1i1.180

Keywords: Disengagement, Facial Expressions, FER2013, OpenCV, Transfer Learning

Abstract
Teaching and learning activities were initially carried out face-to-face in a classroom but have since moved online. Online learning contributes to student disengagement, which can be identified through indicators of emotional exhaustion, physical fatigue, cognitive fatigue, and loss of motivation. Besides delivering the course material, the teacher must also pay attention to every student participating in the online session. This burden can be eased by a system that detects student disengagement using a camera. The system scans the direction of students' faces and gaze using OpenCV and a Transfer Learning method. Context, facial expressions, and heart rate can all be used to recognize student disengagement; however, given the widespread availability of cameras, facial expressions are the most practical signal. The facial expression recognition system in this study uses the FER2013 dataset and a Transfer Learning method, achieving a recognition accuracy of 68% after 25 epochs. When implemented as an expression parameter in the disengagement identification system and evaluated over 7 scenarios, the system achieves 83.33% accuracy, 100% precision, 75% recall, and an f1-score of 85.71%.
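As a quick sanity check, the reported f1-score follows directly from the stated precision and recall, since f1 is their harmonic mean. The short sketch below only recomputes that relationship from the figures quoted in the abstract; it uses no data or code from the study itself:

```python
# F1 is the harmonic mean of precision and recall:
# f1 = 2 * P * R / (P + R)
precision = 1.00   # reported: 100%
recall = 0.75      # reported: 75%

f1 = 2 * precision * recall / (precision + recall)
print(round(f1 * 100, 2))  # 85.71, matching the reported f1-score
```

This confirms the reported metrics are internally consistent: with perfect precision and 75% recall, an f1-score of 85.71% is exactly what the formula yields.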
References
A. H. Rustaman, “Efektivitas penggunaan aplikasi daring, video conference dan sosial media pada mata kuliah komputer grafis 1 di masa pandemi covid- 19,” JISIP (Jurnal Ilmu Sosial dan Pendidikan), vol. 4, no. 3, 2020.
R. Pawicara and M. Conilie, “Analisis pembelajaran daring terhadap kejenuhan belajar mahasiswa Tadris Biologi IAIN Jember di tengah pandemi Covid-19,” ALVEOLI: Jurnal Pendidikan Biologi, vol. 1, no. 1, pp. 29–38, 2020.
N. Alyuz et al., “Semi-supervised model personalization for improved detection of learner’s emotional engagement,” in Proceedings of the 18th ACM International Conference on Multimodal Interaction, 2016, pp. 100–107.
J. Whitehill, Z. Serpell, Y.-C. Lin, A. Foster, and J. R. Movellan, “The faces of engagement: Automatic recognition of student engagement from facial expressions,” IEEE Transactions on Affective Computing, vol. 5, no. 1, pp. 86–98, 2014.
H. Monkaresi, N. Bosch, R. A. Calvo, and S. K. D’Mello, “Automated detection of engagement using video-based estimation of facial expressions and heart rate,” IEEE Transactions on Affective Computing, vol. 8, no. 1, pp. 15–28, 2016.
D. L. Z. Astuti and S. Samsuryadi, “Kajian Pengenalan Ekspresi Wajah menggunakan Metode PCA dan CNN,” in Annual Research Seminar (ARS), 2019, vol. 4, no. 1, pp. 293–297.
D. Prasetyawan, “Penentuan Emosi pada Video dengan Convolutional Neural Network,” JISKA (Jurnal Informatika Sunan Kalijaga), vol. 5, no. 1, pp. 23–35, 2020.
J. Chen, F. Lécué, J. Z. Pan, I. Horrocks, and H. Chen, “Knowledge-based transfer learning explanation,” arXiv preprint arXiv:1807.08372, 2018.
O. M. Nezami, M. Dras, L. Hamey, D. Richards, S. Wan, and C. Paris, “Automatic recognition of student engagement using deep learning and facial expression,” in Joint European Conference on Machine Learning and Knowledge Discovery in Databases, 2019, pp. 273–289.
A. Zein, “Pendeteksian Kantuk Secara Real Time Menggunakan Pustaka Opencv Dan Dlib Python,” Sainstech: Jurnal Penelitian dan Pengkajian Sains dan Teknologi, vol. 28, no. 2, 2018.
K. Simonyan and A. Zisserman, “Very deep convolutional networks for large-scale image recognition,” arXiv preprint arXiv:1409.1556, 2014.
D. Triasanti, “Konsep Dasar Python,” Surabaya: Sulita Jaya, 2002.
I. Taufiq, “Deep Learning Untuk Deteksi Tanda Nomor Kendaraan Bermotor Menggunakan Algoritma Convolutional Neural Network Dengan Python Dan Tensorflow.” Skripsi. Program Studi Sistem Informasi Sekolah Tinggi Manajemen Informatika …, 2018.
F. Chollet, “Keras documentation,” keras.io, vol. 33, 2015.
K. D. P. Novianti, N. A. Setiawan, and S. S. Kusumawardani, “Peningkatan Nilai Recall dan Precision pada Penelusuran Informasi Pustaka Berbasis Semantik (Studi Kasus: Sistem Informasi Ruang Referensi Jurusan Teknik Elektro dan Teknologi Informasi UGM),” Proceedings Konferensi Nasional Sistem dan Informatika (KNS&I), 2015.
J. Grafsgaard, J. B. Wiggins, K. E. Boyer, E. N. Wiebe, and J. Lester, “Automatically recognizing facial expression: Predicting engagement and frustration,” 2013.
S. Aslan, N. Alyuz, E. Okur, E. M. Sinem, E. Oktay, and A. A. Esme, “Effect of emotion-aware interventions on students’ behavioral and emotional states,” Educational Technology, Research and Development, vol. 66, no. 6, pp. 1399–1413, 2018.
I. J. Goodfellow et al., “Challenges in representation learning: A report on three machine learning contests,” in International conference on neural information processing, 2013, pp. 117–124.
M. Wardhana and M. Hariadi, “Ekspresi emosi pada model wajah tiga dimensi menggunakan naive bayes dan logika fuzzy,” 2019.
S. A. H. Alrubaie and A. H. Hameed, “Dynamic weights equations for converting grayscale image to RGB image,” Journal of University of Babylon for Pure and Applied Sciences, vol. 26, no. 8, pp. 122–129, 2018.
T. Nakano, M. Kato, Y. Morito, S. Itoi, and S. Kitazawa, “Blink-related momentary activation of the default mode network while viewing videos,” Proceedings of the National Academy of Sciences, vol. 110, no. 2, pp. 702–706, 2013.