Research Article

DEVELOPMENT OF A TERNARY LEVELS EMOTION CLASSIFICATION MODEL UTILIZING ELECTROENCEPHALOGRAPHY DATA SET

Year 2025, Volume: 13, Issue: 2, 607–623, 01.06.2025
https://doi.org/10.36306/konjes.1649691

Abstract

Electroencephalogram (EEG)-based emotion recognition has gained increasing attention due to its potential for objectively assessing affective states. However, many existing studies rely on limited datasets and focus on binary classification or narrow feature sets, limiting the granularity and generalizability of their findings. To address these challenges, this study explores a ternary classification framework for both the valence and arousal dimensions, dividing each into low, medium, and high levels to capture a broader spectrum of emotional responses. EEG recordings from ten randomly selected participants in the DEAP dataset were used. Each 60-second EEG segment was divided into six non-overlapping 10-second windows to preserve temporal stability and extract reliable features. The Hilbert Transform was applied to compute instantaneous amplitude and phase information, enabling the detection of subtle variations in emotional states. These features were then classified using a feed-forward neural network. The proposed approach achieved classification accuracies of 99.13% for arousal and 99.50% for valence, demonstrating its effectiveness in recognizing multi-level emotional states. By moving beyond binary labels and leveraging time-frequency domain features, this study contributes to the development of more refined and responsive emotion recognition systems. These findings offer promising insights for real-world applications in affective computing, mental health monitoring, and adaptive human-computer interaction, where precise emotion modeling plays a critical role.
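
For readers who want a concrete picture of the pipeline the abstract describes, the sketch below walks through its main steps in Python: splitting a 60-second trial into six non-overlapping 10-second windows, extracting instantaneous amplitude and phase via the Hilbert Transform, and classifying the resulting features with a small feed-forward network. This is a minimal illustration only; the synthetic data, the particular amplitude/phase statistics, the network size (hidden_layer_sizes=(64, 32)), and the 128 Hz sampling rate are assumptions for demonstration and do not reproduce the authors' exact feature set, architecture, or evaluation protocol.

```python
import numpy as np
from scipy.signal import hilbert
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

FS = 128        # assumed sampling rate (Hz); DEAP's preprocessed EEG is commonly distributed at 128 Hz
WIN_SEC = 10    # non-overlapping window length described in the abstract
N_WIN = 6       # six windows per 60-second trial

def hilbert_features(window):
    """Summary statistics of instantaneous amplitude/phase for one (channels, samples) window."""
    analytic = hilbert(window, axis=-1)                      # analytic signal per channel
    amp = np.abs(analytic)                                   # instantaneous amplitude (envelope)
    phase = np.unwrap(np.angle(analytic), axis=-1)           # instantaneous phase
    inst_freq = np.diff(phase, axis=-1) * FS / (2 * np.pi)   # instantaneous frequency (Hz)
    return np.concatenate([amp.mean(axis=-1), amp.std(axis=-1),
                           inst_freq.mean(axis=-1), inst_freq.std(axis=-1)])

# Synthetic stand-in for real DEAP trials: (trials, channels, 60 s of samples) with ternary labels.
rng = np.random.default_rng(0)
n_trials, n_channels = 120, 32
trials = rng.standard_normal((n_trials, n_channels, 60 * FS))
labels = rng.integers(0, 3, size=n_trials)                   # 0 = low, 1 = medium, 2 = high

X, y = [], []
for trial, label in zip(trials, labels):
    for w in range(N_WIN):
        seg = trial[:, w * WIN_SEC * FS:(w + 1) * WIN_SEC * FS]
        X.append(hilbert_features(seg))
        y.append(label)
X, y = np.asarray(X), np.asarray(y)

# Feed-forward classifier over the window-level feature vectors.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, stratify=y, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500, random_state=0)
clf.fit(X_tr, y_tr)
print("window-level accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```

On real DEAP recordings, `trials` and `labels` would be replaced by the participants' preprocessed EEG and their valence/arousal ratings mapped to low/medium/high levels; accuracies comparable to those reported above would only be expected under the authors' exact feature extraction and evaluation setup.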


Details

Primary Language: English
Subjects: Biomedical Engineering (Other), Electrical Engineering (Other), Electronics, Sensors and Digital Hardware (Other)
Section: Research Article
Authors

Hatice Okumuş (ORCID: 0000-0003-4074-2503)

Ebru Ergün (ORCID: 0000-0002-5371-7238)

Publication Date: 1 June 2025
Submission Date: 2 March 2025
Acceptance Date: 23 May 2025
Published Issue: Year 2025, Volume: 13, Issue: 2

How to Cite

IEEE: H. Okumuş and E. Ergün, "DEVELOPMENT OF A TERNARY LEVELS EMOTION CLASSIFICATION MODEL UTILIZING ELECTROENCEPHALOGRAPHY DATA SET," KONJES, vol. 13, no. 2, pp. 607–623, 2025, doi: 10.36306/konjes.1649691.