Research Article

Investigation of differential item functioning of the PISA 2015 science literacy items with the latent class approach

Year 2025, Volume: 8, Issue: 1, 178-204, 31.05.2025
https://doi.org/10.33400/kuje.1158799

Abstract

The aim of this study is to examine 11 items selected from the science literacy items of the Programme for International Student Assessment (PISA) 2015 for Differential Item Functioning (DIF) using the latent class approach. Initial checks showed that the response patterns of the Türkiye study group of 506 students met the Item Response Theory (IRT) assumptions of unidimensionality, monotonicity, and local independence. IRT estimation indicated that the items were best represented by a mixture 2PL model (Mixture IRT 2PLM) with two latent classes. Analyses based on this model placed approximately 91% of the respondents in the first latent class and 9% in the second. In the DIF analysis based on the two-class mixture 2PLM, Item 4 showed both non-uniform and uniform DIF, while Items 5, 6, 9, and 11 showed only uniform DIF; these items thus functioned differently across the two latent classes. To examine how well the DIF items explain the latent classes, variables reported in the literature as sources of DIF were added to the model one at a time as covariates. Adding gender, information and communication technology (ICT) resources, enjoyment of science learning, and sense of belonging to school each increased the classification accuracy. The study therefore concluded that a covariate can serve as a predictor of, or a source of prior information about, the latent variable. This indicates that, in their responses to the science literacy items, individuals are partitioned into latent classes by a dimension distinct from these variables.
The study also showed that when a DIF source based on an observed variable is only weakly related to the latent variable, it does not substantially affect the validity of the test results.
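The classification step described above can be sketched with Bayes' rule: in a mixture 2PLM each latent class has its own item parameters, and an examinee's response pattern is scored against each class's parameter set, so class-specific parameters are exactly what expresses latent-class DIF. The sketch below is illustrative only; the item parameters, ability value, and response pattern are invented for demonstration, and only the roughly 91%/9% class proportions come from this study's abstract.

```python
import math

def p_correct(theta, a, b):
    """2PL item response function: P(correct | theta, a, b)."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def class_posteriors(responses, theta, item_params, priors):
    """Posterior class probabilities for one examinee under a mixture 2PLM.

    item_params: one (a_list, b_list) pair per latent class. Letting the
    item parameters differ across classes is what lets the model express
    latent-class DIF.
    """
    joint = []
    for (a_list, b_list), prior in zip(item_params, priors):
        lik = 1.0
        for u, a, b in zip(responses, a_list, b_list):
            p = p_correct(theta, a, b)
            # Bernoulli likelihood of the observed 0/1 response
            lik *= p if u == 1 else (1.0 - p)
        joint.append(lik * prior)
    total = sum(joint)
    return [j / total for j in joint]

# Illustrative call: two classes whose difficulties differ on items 1 and 3
# (latent DIF); all parameter values are hypothetical.
post = class_posteriors(
    responses=[1, 1, 0],
    theta=0.0,
    item_params=[([1.0, 1.0, 1.0], [-1.0, 0.0, 1.0]),   # class 1 (a, b)
                 ([1.0, 1.0, 1.0], [1.0, 0.0, -1.0])],  # class 2 (a, b)
    priors=[0.91, 0.09],  # class proportions reported in the abstract
)
# The pattern fits class 1's parameters, so post[0] is close to 1.
```

Because the class-specific difficulties differ on items 1 and 3, a response pattern that matches the first class's parameters pulls the posterior sharply toward class 1; a covariate that predicts class membership would enter this computation by shifting the priors per examinee.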

Ethics Statement

All the guidelines set forth in the "Higher Education Institutions Scientific Research and Publication Ethics Directive" were followed throughout this research, from planning and implementation through data collection and analysis. None of the actions listed in the Directive's second section, "Actions Contrary to Scientific Research and Publication Ethics," were carried out. The participants formally agreed to take part in the study, no changes were made to the collected data, and this work was not submitted to any other academic publication venue for review. Scientific, ethical, and citation norms were observed throughout the writing process. Research ethics committee approval: Ankara University Ethical Board, decision dated 12/07/2019, document number 274.

Acknowledgments

This study is based on the master's thesis entitled "Investigation of differential item functioning of the PISA 2015 science literacy items with the latent class approach," written by Onurcan Ceyhan (2020) under the supervision of H. Deniz Gülleroğlu.

References

  • Ackerman, T. A. (1992). A didactic explanation of item bias, item impact, and item validity from a multidimensional perspective. Journal of educational measurement, 29(1), 67-91.
  • Adedoyin, O. O. (2010). Using IRT approach to detect gender biased items in public examinations: A case study from the Botswana junior certificate examination in mathematics. Educational Research and Reviews, 5(7), 385-399.
  • Akaike, H. (1973). Maximum likelihood identification of Gaussian autoregressive moving average models. Biometrika, 60(2), 255-265.
  • Akcan, R., & Atalay-Kabasakal, K. (2023). The impact of missing data on the performances of DIF detection methods. Journal of Measurement and Evaluation in Education and Psychology, 14(1), 95-105. https://doi.org/10.21031/epod.1183617
  • Allipour Birgani, S., & Shehni Yailagh, M. (2016). The causal relationship between teacher affective support with performance in English using the mediation of sense of belonging, enjoyment, hopelessness, self-efficacy, and academic effort. Development Strategies in Medical Education, 3(1), 49-59.
  • Anderman, E. M. (2002). School effects on psychological outcomes during adolescence. Journal of educational psychology, 94(4), 795.
  • Anil, D. (2009). Factors Effecting Science Achievement of Science Students in Programme for International Students’ Achievement (PISA) in Türkiye. Education and Science, 34 (152), 87-100.
  • Arıcı, Ö. (2019). Investigating the factors related to Turkish students' collaborative problem solving skills with mediating models according to PISA 2015 results [Unpublished doctoral dissertation]. Ankara University, Ankara, Türkiye.
  • Aybek, E. C., Yaşar, M., & Kartal, S. (2021). Öğretmen Yapımı Bir Testteki Maddelerin Değişen Madde Fonksiyonu Bağlamında İncelenmesi. Pamukkale Üniversitesi Eğitim Fakültesi Dergisi(52), 281-300. https://doi.org/10.9779/pauefd.825631
  • Aydemir, F. (2023). PISA 2018 matematik ve fen bilimleri alt testlerinde değişen madde fonksiyonunun Rasch Ağacı, Mantel–Haenszel ve Lojistik Regresyon yöntemleriyle incelenmesi [Unpublished master’s thesis]. Gazi University, Ankara.
  • Bakan Kalaycıoğlu (2022). Gender-based differential item functioning analysis of the medical specialization education entrance examination. Journal of Measurement and Evaluation in Education and Psychology, 13(1), 1-13. https://doi.org/10.21031/epod.998592
  • Bayram, Ö. (2024). Bir tutum ölçeği üzerinden Mantel–Haenszel ve sıralı lojistik regresyon yöntemlerine göre değişen madde fonksiyonu incelenmesi [Unpublished master’s thesis]. Kocaeli University, Kocaeli.
  • Berberoğlu, G. (1995). Differential item functioning (DIF) Analysis of computation, word problem and geometry questions across gender and SES groups. Studies in Educational Evaluation, 21(4), 439-56.
  • Bernstein, B. B. (2000). Pedagogy, symbolic control, and identity: theory, research, critique. Lanham, MD: Rowman & Littlefield Publishers.
  • Booker, K. C. (2006). School belonging and the African American adolescent: What do we know and where should we go?. The High School Journal, 89(4), 1-7.
  • Chiu, M. M., & Xihua, Z. (2008). Family and motivation effects on mathematics achievement: Analyses of students in 41 countries. Learning and Instruction, 18(4), 321-336.
  • Cho, S.-J. (2007). A multilevel mixture IRT model for DIF analysis [Unpublished doctoral dissertation]. University of Georgia, Athens, GA, USA. https://getd.libs.uga.edu/pdfs/cho_sun-joo_200712_phd.pdf. Retrieved December 18, 2019.
  • Cho, S.-J., & Cohen, A. S. (2010). A multilevel mixture IRT model with an application to DIF. Journal of Educational and Behavioral Statistics, 35(3), 336–370. https://doi.org/10.3102/1076998609353111
  • Cho, S. J., Suh, Y., & Lee, W. Y. (2016). An NCME instructional module on latent DIF analysis using mixture item response models. Educational Measurement: Issues and Practice, 35(1), 48-61.
  • Clark, S. L., & Muthén, B. (2009). Relating latent class analysis results to variables not included in the analysis. https://www.statmodel.com/download/relatinglca.pdf
  • Clauser, B. E., & Mazor, K. M. (1998). Using statistical procedures to identify differentially functioning test items. Educational Measurement: Issues and Practice, 17(1), 31-44.
  • Cohen, A. S., & Bolt, D. M. (2005). A mixture model analysis of differential item functioning. Journal of Educational Measurement, 42(2), 133-148.
  • Çepni, Z. & Kelecioğlu, H. (2021). Detecting Differential Item Functioning Using SIBTEST, MH, LR and IRT Methods. Journal of Measurement and Evaluation in Education and Psychology, 12(3), 267-285. https://doi.org/10.21031/epod.988879
  • Çelik, M., & Özer Özkan, Y. (2020). PISA 2015 matematik alt testinin cinsiyet ve bölgelere göre değişen madde fonksiyonunun incelenmesi. Journal of Measurement and Evaluation in Education and Psychology, 11(3), 283-301. https://doi.org/10.21031/epod.715020
  • Dai, Y. (2013). A mixture Rasch model with a covariate: A simulation study via Bayesian Markov chain Monte Carlo estimation. Applied Psychological Measurement, 37(5), 375-396.
  • Daşçıoğlu, S., & Öğretmen, T. (2024). Detection of differential item functioning with latent class analysis: PISA 2018 mathematical literacy test. International Journal of Assessment Tools in Education, 11(2), 249–269. https://doi.org/10.21449/ijate.1387041
  • De Ayala, R. J., Kim, S. H., Stapleton, L. M., & Dayton, C. M. (2002). Differential item functioning: A mixture distribution conceptualization. International Journal of Testing, 2(3-4), 243-276.
  • DeMars, C. E. (1998). Gender differences in mathematics and science on a high school proficiency exam: The role of response format. Applied Measurement in Education, 11(3), 279-299.
  • DeMars, C. (2010). Item response theory (Understanding Statistics: Measurement). Oxford University Press.
  • Doğan, C. D. & Aybek, E. C. (2021). R-Shiny ile Psikometri ve İstatistik Uygulamaları. Pegem Akademi.
  • Entorf, H., & Tatsi, E. (2009). Migrants at school: educational inequality and social interaction in the UK and Germany, IZA DP No. 4175, Bonn, Germany.
  • Finch, W. H., & French, B. F. (2019). Educational and Psychological Measurement. New York: Routledge.
  • George, D., & Mallery, M. (2010). SPSS for Windows step by step: A simple guide and reference, 17.0 update (10th ed.). Boston: Pearson.
  • Gökdal, Ö. A. (2012). Examining Adolescents' Sense of School Belonging and Coping Strategies. [Unpublished master’s thesis]. Sakarya University, Sakarya, Türkiye.
  • Hambleton, R. K., & Swaminathan, H. (1985). Item response theory: Principles and applications. Boston: Springer Science & Business Media.
  • Harrington, D. (2009). Confirmatory factor analysis. Oxford university press, Oxford.
  • Hays, R. D., Morales, L. S., & Reise, S. P. (2000). Item response theory and health outcomes measurement in the 21st century. Medical Care, 38(9), 11-28.
  • Holland, P. W., & Wainer, H. (Eds.). (1993). Differential item functioning. Hillsdale, NJ: Lawrence Erlbaum Associates.
  • Kan, A., Sünbül, Ö., & Ömür, S. (2013). 6. - 8. Sınıf Seviye Belirleme Sınavları Alt Testlerinin Çeşitli Yöntemlere Göre Değişen Madde Fonksiyonlarının İncelenmesi. Mersin Üniversitesi Eğitim Fakültesi Dergisi, 9(2), 207-222. https://doi.org/10.17860/efd.55452
  • Kayrı, M. (2007). Two-Step Clustering Analysis in Researches: A Case Study. Eurasian Journal of Educational Research (EJER), (28),36-42.
  • Kucam, E., & Gülleroğlu, H. D. (2023). Examination of differential item functioning in PISA 2018 mathematics literacy test with different methods. Journal of Measurement and Evaluation in Education and Psychology, 14(2), 128-153. https://doi.org/10.21031/epod.1122857
  • Lord, F. M. (1980). Applications of item response theory to practical testing problems. Hillsdale, NJ: Lawrence Erlbaum Associates.
  • Lord, F. M., & Novick, M. R. (1968). Statistical theories of mental test scores. Menlo Park, CA: Addison-Wesley.
  • Maij-de Meij, A. M., Kelderman, H., & van der Flier, H. (2010). Improvement in detection of differential item functioning using a mixture item response theory model. Multivariate Behavioral Research, 45(6), 975-999.
  • Mantel, N., & Haenszel, W. (1959). Statistical aspects of the analysis of data from retrospective studies of disease. Journal of the National Cancer Institute, 22, 719-748.
  • M.E.B. (2016). PISA 2015 National report. Ankara: Ministry of National Education, General Directorate of Measurement, Evaluation and Examination Services.
  • McAllister, P. (1993). Testing, DIF, and public policy. In P. Holland & H. Wainer (Eds.), Differential item functioning (pp. 389-396). Hillsdale, NJ: Lawrence Erlbaum Associates.
  • Nagin, D. S., (2005). Group-based modeling of development. Harvard University Press.
  • OECD. (2015). International large-scale assessments: Origins, growth and why countries participate in PISA. Paris: OECD Publishing.
  • OECD. (2017). PISA 2015 Results (Volume III): Students’ Well-Being. Paris: OECD Publishing.
  • Öğretmen, T. (1995). Differential Item Functioning Analysis of the Verbal Ability Section of the First Stage of the University Entrance Examination in Türkiye [Unpublished master’s thesis]. Middle East Technical University, Ankara, Türkiye.
  • Rost, J. (1990). Rasch models in latent classes: An integration of two approaches to item analysis. Applied Psychological Measurement, 14(3), 271-282.
  • Saatçioğlu, F. M. (2022). Differential item functioning across gender with MIMIC modeling: PISA 2018 financial literacy items. International Journal of Assessment Tools in Education, 9(3), 631–653. https://doi.org/10.21449/ijate.1076464
  • Samuelsen, K. M. (2005). Examining differential item functioning from a latent class perspective [Unpublished doctoral dissertation]. University of Maryland, Maryland, USA.
  • Sadak, M. (2022). TIMSS 2015 ve 2019 Matematik Sorularının Türkiye’de Cinsiyete Göre Madde Yanlılığının İncelenmesi: SIBTEST Prosedürü ile Değişen Madde Fonksiyonu Analizi. Ahi Evran Üniversitesi Kırşehir Eğitim Fakültesi Dergisi, 23(Özel Sayı), 211-258. https://doi.org/10.29299/kefad.961858
  • Schwarz, G. (1978). Estimating the dimension of a model. The Annals of Statistics, 6(2), 461-464.
  • Sırgancı, G. (2019). The Effect of Covariant Variable on Determination of Differential Item Functioning Using Mixture Rasch Model [Unpublished doctoral dissertation/master’s thesis]. Ankara University, Ankara, Türkiye.
  • Spiegelhalter, D. J., Best, N. G., Carlin, B. P., & Van Der Linde, A. (2002). Bayesian measures of model complexity and fit. Journal of The Royal Statistical Society: Series B (Statistical Methodology), 64(4), 583-639.
  • Tabachnick, B. G., & Fidell, L. S. (2013). Using multivariate statistics (6th ed.). Boston, MA: Pearson.
  • Thissen, D., Steinberg, L., & Wainer, H. (1988). Use of item response theory in the study of group differences in trace lines. In H. Wainer & H. I. Braun (Eds.), Test validity (pp. 147-169). Hillsdale, NJ: Erlbaum.
  • Toker, T., & Green, K. (2021). A comparison of latent class analysis and the mixture Rasch model using 8th grade mathematics data in the Fourth International Mathematics and Science Study (TIMSS-2011). International Journal of Assessment Tools in Education, 8(4), 959–974.
  • Toprak, E., & Yakar, L. (2017). SBS 2011 Türkçe Alt Testindeki Maddelerin Değişen Madde Fonksiyonu Açısından Farklı Yöntemlerle İncelenmesi. International Journal of Eurasia Social Sciences, 8(26), 220-231.
  • Unal, F. (2023). Farklı oranlardaki kayıp verilere farklı atama yöntemleriyle veri atamanın madde tepki kuramına dayalı yöntemlerle değişen madde fonksiyonuna etkisinin incelenmesi [Unpublished master’s thesis]. Akdeniz University, Antalya.
  • Usta, G. (2020). Sınav Kaygı Ölçeği Maddelerinin Çeşitli Yöntemlere Göre Değişen Madde Fonksiyonlarının İncelenmesi. Cumhuriyet Uluslararası Eğitim Dergisi, 9(4), 1225-1242. https://doi.org/10.30703/cije.703337
  • Uyar, Ş. (2015). Gözlenen gruplara ve örtük sınıflara göre belirlenen değişen madde fonksiyonunun karşılaştırılması [Unpublished doctoral dissertation]. Hacettepe University, Ankara.
  • Van Ryzin, M. J., Gravely, A. A., & Roseth, C. J. (2009). Autonomy, belongingness, and engagement in school as contributors to adolescent psychological well-being. Journal of Youth and Adolescence, 38(1), 1-12.
  • Visser, M., Juan, A., & Feza, N. (2015). Home and school resources as predictors of mathematics performance in South Africa. South African Journal of Education, 35(1), 28-36.
  • Waltz, C. F., Strickland, O. L., & Lenz, E. R. (2010). Measurement reliability. Measurement in Nursing and Health Research, 82(12), 145-162.
  • Webb, N. L. (2006). Identifying content for student achievement tests. Handbook of test development, 155-180.
  • Yalçın, S. (2018). Data fit comparison of mixture item response theory models and traditional models. International Journal of Assessment Tools in Education, 5(2), 301-313.
  • Zimowski, M. F., Muraki, E., Mislevy, R. J., & Bock, R. D. (1996). BILOG-MG: Multiple-group IRT analysis and test maintenance for binary items. Chicago: Scientific Software.
  • Zitzmann, S., Lüdtke, O., & Robitzsch, A. (2015). A Bayesian approach to more stable estimates of group-level effects in contextual studies. Multivariate Behavioral Research, 50(6), 688-705.
  • Zumbo, B. D., & Gelin, M. N. (2005). A Matter of Test Bias in Educational Policy Research: Bringing the Context into Picture by Investigating Sociological/Community Moderated (or Mediated) Test and Item Bias. Journal of Educational Research & Policy Studies, 5(1), 1-23.


Details

Primary Language: English
Subjects: Measurement and Evaluation in Education (Other), Field Education
Section: Research Articles
Authors

Onurcan Ceyhan 0000-0002-5077-6806

Hamide Deniz Gülleroğlu 0000-0001-6995-8223

Publication Date: May 31, 2025
Submission Date: August 7, 2022
Published Issue: Year 2025, Volume: 8, Issue: 1

Cite

APA Ceyhan, O., & Gülleroğlu, H. D. (2025). Investigation of differential item functioning of the PISA 2015 science literacy items with the latent class approach. Kocaeli Üniversitesi Eğitim Dergisi, 8(1), 178-204. https://doi.org/10.33400/kuje.1158799




Kocaeli Üniversitesi Eğitim Dergisi has been indexed by TR-Dizin since 2020.