Research Article

COMPARATIVE FEATURE SELECTION APPROACHES FOR ALZHEIMER'S DISEASE USING GENETIC ALGORITHMS AND PARTICLE SWARM OPTIMIZATION

Year 2025, Volume: 11, Issue: 1, 55 - 73, 29.06.2025
https://doi.org/10.51477/mejs.1605943

Abstract

Alzheimer’s disease (AD) is the most prevalent form of dementia, significantly impairing cognitive abilities such as memory and judgment. The number of dementia cases is expected to rise dramatically in the coming decades, with Alzheimer’s disease accounting for 60–80% of these cases. Early detection is crucial for improving patient outcomes, yet diagnosing Alzheimer’s at its early stages remains challenging due to various clinical and perceptual obstacles. This study addresses whether Alzheimer’s can be detected in advance and which methods can be used for early diagnosis. Using an Alzheimer’s disease dataset sourced from Kaggle, comprising 2,150 samples with 32 independent variables and 1 dependent variable, various classification algorithms were applied to assess performance. Feature selection techniques, including both classical and metaheuristic methods (Genetic Algorithm and Particle Swarm Optimization), were then applied to the dataset. These methods reduced the dataset’s dimensionality while maintaining high diagnostic performance. The results showed that both metaheuristic algorithms selected 14 variables and achieved the same high performance rate of 95.57% as the initial 32 variables. The findings suggest that Alzheimer’s disease can be detected more efficiently with fewer variables, reducing analysis time and increasing diagnostic speed. Metaheuristic algorithms, particularly Particle Swarm Optimization, proved the most effective: PSO enhanced the performance of 33 classifiers, while the Genetic Algorithm improved the performance of 28. This study demonstrates that Alzheimer’s can be detected with fewer variables, in less time, and with a higher accuracy rate. As a result, improved patient outcomes can potentially be achieved through reduced computational complexity and enhanced diagnostic efficiency.
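To make the wrapper-based selection idea concrete, the sketch below evolves binary feature masks with a simple genetic algorithm in Python/scikit-learn. It is only an illustration under stated assumptions: the study itself worked with WEKA classifiers, and the "Diagnosis" target column, the numeric-only predictor filter, the random-forest scorer, and all GA hyperparameters here are hypothetical choices, not the paper's configuration. A PSO variant would reuse the same wrapper loop, replacing crossover and mutation with velocity updates over the mask probabilities.

```python
# Minimal sketch of GA wrapper feature selection on the Kaggle Alzheimer's CSV.
# Assumptions: "Diagnosis" is the class column, numeric columns are the candidate
# features, and a random forest with 10-fold CV accuracy serves as the fitness.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)

df = pd.read_csv("alzheimers_disease_data.csv")                 # file name from the Kaggle page
y = df["Diagnosis"].to_numpy()                                   # assumed target column
X = df.drop(columns=["Diagnosis"]).select_dtypes("number").to_numpy()  # keep numeric predictors only
n_features = X.shape[1]

def fitness(mask):
    """10-fold CV accuracy of the classifier trained on the selected feature subset."""
    if mask.sum() == 0:
        return 0.0
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    return cross_val_score(clf, X[:, mask.astype(bool)], y, cv=10).mean()

def genetic_feature_selection(pop_size=20, generations=30, p_mut=0.05):
    # Each individual is a 0/1 mask over the candidate features.
    pop = rng.integers(0, 2, size=(pop_size, n_features))
    best_mask, best_score = None, -1.0
    for _ in range(generations):
        scores = np.array([fitness(ind) for ind in pop])
        if scores.max() > best_score:                            # keep the best mask seen so far
            best_score, best_mask = scores.max(), pop[scores.argmax()].copy()
        parents = pop[np.argsort(scores)[-(pop_size // 2):]]     # truncation selection: top half
        children = []
        while len(children) < pop_size:
            a, b = parents[rng.integers(len(parents), size=2)]
            cut = int(rng.integers(1, n_features))               # one-point crossover
            child = np.concatenate([a[:cut], b[cut:]])
            flip = rng.random(n_features) < p_mut                # bit-flip mutation
            child[flip] = 1 - child[flip]
            children.append(child)
        pop = np.array(children)
    return best_mask, best_score

best_mask, best_acc = genetic_feature_selection()
print(f"{best_mask.sum()} features selected, CV accuracy ~ {best_acc:.4f}")
```

The key design point of the wrapper approach is that a subset's fitness is simply the cross-validated accuracy of the downstream classifier on the selected columns, so the same loop works unchanged with any of the classifiers compared in the study.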

Ethics Statement

The data is sourced from an open-access database, so there is no need for an ethics committee’s evaluation.

References

  • Kumar, A., Sidhu, J., Goyal, A., Tsao, J. W., “Alzheimer disease”, StatPearls, StatPearls Publishing, 2023.
  • T.C. Sağlık Bakanlığı, “Dünya Alzheimer Günü”, 21 Eylül 2024 [Online]. Available: https://hsgm.saglik.gov.tr/tr/haberler-16/21-eylul-2024-dunya-alzheimer-gunu.html
  • Porsteinsson, A. P., Isaacson, R. S., Knox, S., Sabbagh, M. N., Rubino, I., “Diagnosis of early Alzheimer’s disease: clinical practice in 2021”, The Journal of Prevention of Alzheimer’s Disease, 8, 371–386, 2021.
  • Ding, Y., Zhou, K., Bi, W., “Feature selection based on hybridization of genetic algorithm and competitive swarm optimizer”, Soft Computing, 24, 11663–11672, 2020.
  • Anirudha, R. C., Kannan, R., Patil, N., “Genetic algorithm-based wrapper feature selection on hybrid prediction model for analysis of high-dimensional data”, Proceeding of 9th International Conference on Industrial and Information Systems (ICIIS), IEEE, pp. 1–6, 2014.
  • Mirzaei, S., et al., “Two-stage feature selection of voice parameters for early Alzheimer’s disease prediction”, IRBM, 39(6), 430–435, 2018.
  • Neelaveni, J., Devasana, M. S. G., “Alzheimer disease prediction using machine learning algorithms”, Proceeding of 6th International Conference on Advanced Computing and Communication Systems (ICACCS), IEEE, pp. 101–104, 2020.
  • Saputra, R. A., et al., “Detecting Alzheimer’s disease by the decision tree methods based on particle swarm optimization”, Journal of Physics: Conference Series, IOP Publishing, 012025, 2020.
  • Ramaswamy, R., Kandhasamy, P., Palaniswamy, S., “Feature selection for Alzheimer’s gene expression data using modified binary particle swarm optimization”, IETE Journal of Research, 69(1), 9–20, 2023.
  • Divya, R., Shantha Selva Kumari, R., Alzheimer’s Disease Neuroimaging Initiative, “Genetic algorithm with logistic regression feature selection for Alzheimer’s disease classification”, Neural Computing and Applications, 33(14), 8435–8444, 2021.
  • Buyrukoğlu, S., “Early detection of Alzheimer’s disease using data mining: Comparison of ensemble feature selection approaches”, Konya Journal of Engineering Sciences, 9(1), 50–61, 2021.
  • Noroozi, Z., Orooji, A., Erfannia, L., “Analyzing the impact of feature selection methods on machine learning algorithms for heart disease prediction”, Scientific Reports, 13(1), 22588, 2023.
  • Hassouneh, A., et al., “Feature importance analysis and machine learning for Alzheimer’s disease early detection: Feature fusion of the hippocampus, entorhinal cortex, and standardized uptake value ratio”, Digital Biomarkers, 8(1), 59–74, 2024.
  • Kaggle, “Alzheimer’s Disease Dataset” [Online]. Available: https://www.kaggle.com/datasets/rabieelkharoua/alzheimers-disease-dataset?select=alzheimers_disease_data.csv
  • Soofi, A. A., Awan, A., “Classification techniques in machine learning: applications and issues”, Journal of Basic & Applied Sciences, 13, 459–465, 2017.
  • Sahoo, G., Kumar, Y., “Analysis of parametric & non parametric classifiers for classification technique using WEKA”, International Journal of Information Technology and Computer Science (IJITCS), 4(7), 43, 2012.
  • Surya, P. P., Seetha, L. V., Subbulakshmi, B., “Analysis of user emotions and opinion using multinomial naive Bayes classifier”, Proceeding of 3rd International Conference on Electronics, Communication and Aerospace Technology (ICECA), IEEE, pp. 410–415, 2019.
  • Abdiansah, A., Wardoyo, R., “Time complexity analysis of support vector machines (SVM) in LibSVM”, International Journal of Computer Applications, 128(3), 28–34, 2015.
  • Bharati, S., Rahman, M. A., Podder, P., “Breast cancer prediction applying different classification algorithms with comparative analysis using WEKA”, Proceeding of 4th International Conference on Electrical Engineering and Information & Communication Technology (iCEEiCT), IEEE, pp. 581–584, 2018.
  • Alaoui, S. S., Farhaoui, Y., Aksasse, B., “Classification algorithms in data mining”, International Journal of Tomography & Simulations, 31(4), 34–44, 2018.
  • Stupka, R. J., et al., “Exploring attribute selection and classification methods for predicting heart disease”, Proceeding of International Conference on Computational Science and Computational Intelligence (CSCI), IEEE, pp. 701–706, 2022.
  • Mahmood, D. Y., Hussein, M. A., “Intrusion detection system based on K-star classifier and feature set reduction”, IOSR Journal of Computer Engineering (IOSR-JCE), 15(5), 107–112, 2013.
  • Dou, Y., Meng, W., “Comparative analysis of WEKA-based classification algorithms on medical diagnosis datasets”, Technology and Health Care, 31(S1), 397–408, 2023.
  • Khosravi, K., et al., “Predictive modeling of selected trace elements in groundwater using hybrid algorithms of iterative classifier optimizer”, Journal of Contaminant Hydrology, 242, 103849, 2021.
  • Thilagaraj, T., Sengottaiyan, N., “Implementation of classification algorithms in educational data using WEKA tool”, International Journal of Computer Sciences and Engineering, 7(5), 1254–1257, 2019.
  • WEKA SourceForge, “AttributeSelectedClassifier” [Online]. Available: https://weka.sourceforge.io/doc.dev/weka/classifiers/meta/AttributeSelectedClassifier.html
  • WEKA SourceForge, “ClassificationViaRegression” [Online]. Available: https://weka.sourceforge.io/doc.dev/weka/classifiers/meta/ClassificationViaRegression.html
  • WEKA SourceForge, “CVParameterSelection” [Online]. Available: https://weka.sourceforge.io/doc.dev/weka/classifiers/meta/CVParameterSelection.html
  • WEKA SourceForge, “FilteredClassifier” [Online]. Available: https://weka.sourceforge.io/doc.dev/weka/classifiers/meta/FilteredClassifier.html
  • WEKA SourceForge, “LogitBoost” [Online]. Available: https://weka.sourceforge.io/doc.dev/weka/classifiers/meta/LogitBoost.html
  • WEKA SourceForge, “MultiClassClassifier” [Online]. Available: https://weka.sourceforge.io/doc.dev/weka/classifiers/meta/MultiClassClassifier.html
  • WEKA SourceForge, “MultiClassClassifierUpdateable” [Online]. Available: https://weka.sourceforge.io/doc.dev/weka/classifiers/meta/MultiClassClassifierUpdateable.html
  • WEKA SourceForge, “MultiScheme” [Online]. Available: https://weka.sourceforge.io/doc.dev/weka/classifiers/meta/MultiScheme.html
  • Niranjan, A., et al., “ERCR TV: Ensemble of random committee and random tree for efficient anomaly classification using voting”, Proceeding of 3rd International Conference for Convergence in Technology (I2CT), IEEE, pp. 1–5, 2018.
  • Asaju, L. B., et al., “Intrusion detection system on a computer network using an ensemble of randomizable filtered classifier, K-nearest neighbor algorithm”, FUW Trends in Science & Technology Journal, 2(1), 550–553, 2017.
  • Lasota, T., Łuczak, T., Trawiński, B., “Investigation of random subspace and random forest methods applied to property valuation data”, in: Computational Collective Intelligence. Technologies and Applications (Ed. N.T. Nguyen et al.), Berlin, Springer Berlin Heidelberg, 2011, ch.3, pp. 142–151.
  • Baskin, I. I., Marcou, G., Horvath, D., Varnek, A., “Cross-Validation and the Variable Selection Bias”, in: Tutorials in Chemoinformatics (Ed. A. Varnek), Chichester, John Wiley & Sons, 2017, ch.10, pp. 163–173.
  • WEKA SourceForge, “Vote” [Online]. Available: https://weka.sourceforge.io/doc.dev/weka/classifiers/meta/Vote.html
  • Kargar, K., Safari, M. J. S., Khosravi, K., “Weighted instances handler wrapper and rotation forest-based hybrid algorithms for sediment transport modeling”, Journal of Hydrology, 598, 126452, 2021.
  • WEKA SourceForge, “InputMappedClassifier” [Online]. Available: https://weka.sourceforge.io/doc.dev/weka/classifiers/misc/InputMappedClassifier.html
  • Jankovic, R., “Classifying cultural heritage images by using decision tree classifiers in WEKA”, Proceeding of 1st International Workshop on Visual Pattern Extraction and Recognition for Cultural Heritage Understanding Co-located with 15th Italian Research Conference on Digital Libraries (IRCDL 2019), Pisa, Italy, pp. 119–127, 2019.
  • WEKA SourceForge, “JRip” [Online]. Available: https://weka.sourceforge.io/doc.dev/weka/classifiers/rules/JRip.html
  • Nasa, C., Suman, S., “Evaluation of different classification techniques for web data”, International Journal of Computer Applications, 52(9), 34–40, 2012.
  • Aher, S. B., Lobo, L. M. R. J., “Data mining in educational system using WEKA”, Proceeding of International Conference on Emerging Technology Trends (ICETT), Foundation of Computer Science, pp. 20–25, 2011.
  • Kawelah, W., Abdala, A., “A comparative study on machine learning tools using WEKA and RapidMiner with classifier algorithms C4.5 and decision stump for network intrusion detection”, European Academic Research, 7(2), 852–861, 2019.
  • Lavanya, K., et al., “Handwritten digit recognition using Hoeffding tree, decision tree and random forests—a comparative approach”, Proceeding of International Conference on Computational Intelligence in Data Science (ICCIDS), IEEE, pp. 1–6, 2017.
  • Rajalakshmi, A., Vinodhini, R., Bibi, K. F., “Data discretization technique using WEKA tool”, International Journal of Science, Engineering and Computer Technology, 6(8), 293, 2016.
  • Kumar, V., Minz, S., “Feature selection”, SmartCR, 4(3), 211–229, 2014.
  • Velmurugan, T., Anuradha, C., “Performance evaluation of feature selection algorithms in educational data mining”, Performance Evaluation, 5(2), 92–106, 2016.
  • Rostami, M., Berahmand, K., Forouzanadeh, S., “A novel community detection-based genetic algorithm for feature selection”, Journal of Big Data, 8(1), 2, 2021.
  • Xue, B., Zhang, M., Browne, W. N., “Particle swarm optimization for feature selection in classification: A multi-objective approach”, IEEE Transactions on Cybernetics, 43(6), 1656–1671, 2012.
  • Suñé, F. G., “Machine learning techniques for Alzheimer prediction”, Master’s thesis, Universitat de Barcelona, 2025.
  • Patel, P., Patel, K., Thakkar, K., Goel, P., Valani, P., Vadhavana, V., “Alzheimer’s disease detection using various machine learning algorithms”, Proceeding of 6th International Conference on Mobile Computing and Sustainable Informatics (ICMCSI-2025), IEEE, pp. 1048–1054, 2025.
  • Sendhil, R., Patibandha, N., Shah, P., Mandal, A., “Comparative Evaluation of Machine Learning Techniques for Early Alzheimer’s Detection: Challenges and Insights”, Proceeding of International Conference on Multi-Agent Systems for Collaborative Intelligence (ICMSCI), IEEE, pp. 1050–1058, 2025.
  • Jevin, J. A., Umamageswari, A., “Alzheimer’s Disease Prediction using an Artificial Butterfly Optimizer with a Two-Layer CNN Approach”, Proceeding of 4th International Conference on Mobile Networks and Wireless Communications (ICMNWC), IEEE, pp. 1–6, December 2024.
  • Airlangga, G., “Advancing Alzheimer’s Diagnosis: A Comparative Analysis of Deep Learning Architectures on Multidimensional Health Data”, Jurnal Informatika Ekonomi Bisnis, pp. 810–814, 2024.
There are 56 references in total.

Details

Primary Language English
Subjects Industrial Engineering
Section Articles
Authors

Simay Türküsay 0009-0007-8499-3888

Murat Oturakçı 0000-0001-5946-3964

Esra Ekinci 0000-0003-2609-7763

Deniz Türsel Eliiyi 0000-0001-7693-3980

Early View Date June 26, 2025
Publication Date June 29, 2025
Submission Date December 23, 2024
Acceptance Date April 21, 2025
Published in Issue Year 2025 Volume: 11 Issue: 1

Cite

IEEE S. Türküsay, M. Oturakçı, E. Ekinci, and D. Türsel Eliiyi, “COMPARATIVE FEATURE SELECTION APPROACHES FOR ALZHEIMER’S DISEASE USING GENETIC ALGORITHMS AND PARTICLE SWARM OPTIMIZATION”, MEJS, vol. 11, no. 1, pp. 55–73, 2025, doi: 10.51477/mejs.1605943.

Creative Commons License

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License
