Evaluation of Eye-Blinking Dynamics in Human Emotion Recognition using Weighted Visibility Graph
Abstract
Purpose: Designing automated emotion recognition systems from biosignals has become an active and challenging research topic in many fields, including human-computer interaction, robotics, and affective computing. Several algorithms have been proposed to characterize the internal and external behaviors of subjects confronting emotional events/stimuli. Eye movements, as an external behavior, are typically analyzed in multi-modality systems using classic statistical measures, and the evaluation of their dynamics has been largely neglected so far.
Materials and Methods: This study proposes an innovative single-modality scheme for emotion classification using eye-blinking data. The dynamics of the eye-blinking data were characterized by weighted visibility graph-based indices. The extracted measures were then fed to several classifiers, including support vector machine, decision tree, k-nearest neighbor, adaptive boosting, and random subspace, to classify sad, happy, neutral, and fearful affective states. The scheme was evaluated using the signals available in the SEED-IV database.
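The core of the pipeline above is the visibility graph mapping from a time series to a network. As a minimal sketch (not the authors' exact implementation), each sample becomes a node, two samples are linked when every intermediate sample lies below their line of sight (the standard natural visibility criterion), and, for the weighted variant, each edge can carry a weight such as the arctangent of the line-of-sight slope; the specific weighting used in the paper is not stated in the abstract and may differ.

```python
import math

def weighted_visibility_graph(series):
    """Map a 1-D time series to a weighted visibility graph.

    Nodes are sample indices. An edge (a, b) exists when every
    intermediate sample c lies strictly below the straight line
    joining samples a and b (natural visibility criterion).
    The edge weight used here is |arctan(slope)| of that line,
    one common choice for weighted visibility graphs.
    """
    n = len(series)
    edges = {}
    for a in range(n):
        for b in range(a + 1, n):
            # Visibility: every point between a and b stays under the sightline.
            visible = all(
                series[c] < series[b] + (series[a] - series[b]) * (b - c) / (b - a)
                for c in range(a + 1, b)
            )
            if visible:
                edges[(a, b)] = abs(math.atan((series[b] - series[a]) / (b - a)))
    return edges

def degrees(edges, n):
    """Node degrees -- one of the simplest graph indices usable as a feature."""
    deg = [0] * n
    for a, b in edges:
        deg[a] += 1
        deg[b] += 1
    return deg
```

Graph-theoretic indices computed on this network (degree statistics, path lengths, and so on) then serve as the feature vector passed to the classifiers.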
Results: The proposed framework achieved high recognition rates. The highest average recognition rates, exceeding 90%, were obtained with the decision tree classifier.
Conclusion: In brief, our results showed that eye-blinking data have potential for emotion recognition. The present system can be extended to the design of future affect recognition systems.
Issue: Vol 11 No 2 (2024)
Section: Original Article(s)
DOI: https://doi.org/10.18502/fbt.v11i2.15344
Keywords: Dynamics; Emotion Recognition; Eye-Blinking; Weighted Visibility Graph

Rights and permissions
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.