
Bibliometric Analysis of Simulation-Based Studies in Item Response Theory

Year 2025, Volume: 45 Issue: 1, 229 - 278, 30.04.2025
https://doi.org/10.17152/gefad.1506639

Abstract

This research aims to determine the bibliometric profiles of simulation-based studies in Item Response Theory. The study is descriptive in design. Data were obtained from the Web of Science (WoS) and Scopus databases, which were searched using the concepts "item response theory" and "simulation" in the topic fields. Applying the inclusion criteria, 1246 research articles were retrieved from the WoS database and 1524 from the Scopus database. The 923 articles indexed in both databases were excluded, and the research continued with 1635 articles. The Bibliometrix package was used to analyze the data. Accordingly, 1635 articles from 314 sources, published between 01/01/1982 and 31/03/2024 and written by 2613 researchers, were examined. It was observed that articles in this field were most often published in Applied Psychological Measurement; the most cited source was Psychometrika; the most prolific researcher was Wang, Wen-Chung; the most cited researcher was Cai, Li; the most prolific institution was Northeast Normal University (China); the country with the most publications was the USA; the most cited article was "A Bayesian random effects model for testlets"; the most frequently used keyword was "item response theory"; and the most collaborative articles were between the USA and China. Researchers are recommended to examine the bibliometric profiles revealed in this research.
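The abstract describes retrieving records from WoS and Scopus and excluding the articles indexed in both databases before analysis. The authors' actual pipeline used the Bibliometrix package in R; the sketch below only illustrates the deduplication idea with hypothetical record structures, keyed on a normalized DOI:

```python
# Illustrative sketch (not the authors' pipeline): merge article records
# exported from two databases and drop the overlap. Field names such as
# "doi" and "title" are hypothetical.

def normalize_doi(doi):
    """Lowercase and strip a DOI so the same article matches across exports."""
    return doi.strip().lower() if doi else ""

def merge_records(wos_records, scopus_records):
    """Combine two record lists, keeping the first copy of each DOI."""
    seen, merged = set(), []
    for record in wos_records + scopus_records:
        key = normalize_doi(record.get("doi"))
        if key and key in seen:
            continue  # article indexed in both databases -> skip duplicate
        seen.add(key)
        merged.append(record)
    return merged

wos = [{"doi": "10.1/abc", "title": "Study A"},
       {"doi": "10.1/def", "title": "Study B"}]
scopus = [{"doi": " 10.1/ABC", "title": "Study A"},  # same article, different casing
          {"doi": "10.1/ghi", "title": "Study C"}]

print(len(merge_records(wos, scopus)))  # 3 unique articles
```

In practice matching on DOI alone misses records without one; bibliometric tools typically also compare normalized titles.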

References

  • Aksu, G. & Guzeller, C.O. (2019). Analysis of scientific studies on item response theory by bibliometric analysis method. International Journal of Progressive Education, 15(2), 44-64. doi: 10.29329/ijpe.2019.189.4
  • Al, U. (2012). Avrupa birliği ülkeleri ve Türkiye’nin yayın ve atıf performansı. Bilig, 62, 1-20.
  • Alonso, S., Cabrerizo, F.J., Herrera-Viedma, E., & Herrera, F. (2009). h-Index: A review focused in its variants, computation and standardization for different scientific fields. Journal of Informetrics, 3, 273–289. doi: https://doi.org/10.1016/j.joi.2009.04.001
  • Anastasi, A. (1988). Psychological testing (5th ed.). New York: MacMillan Pub. Co. Inc.
  • Aria, M., & Cuccurullo, C. (2017). Bibliometrix: An R-tool for comprehensive science mapping analysis. Journal of Informetrics, 11(4), 959–975. doi: https://doi.org/10.1016/j.joi.2017.08.007
  • Baker, F.B. (2001). The basics of item response theory (2nd ed.). College Park, (MD): ERIC Clearinghouse on Assessment and Evaluation.
  • Baker, F.B., & Kim, S. (2004). Item response theory: Parameter estimation techniques (2nd ed.). New York: Marcel Dekker.
  • Bornmann, L. & Daniel, H.D. (2007). What do we know about the h index?. Journal of the American Society for Information Science and Technology, 58(9), 1381-1385. doi: https://doi.org/10.1002/asi.20609
  • Bulut, O., & Sünbül, Ö. (2017). R programlama dili ile madde tepki kuramında monte carlo simülasyon çalışmaları. Journal of Measurement and Evaluation in Education and Psychology, 8(3), 266-287. doi: https://doi.org/10.21031/epod.305821
  • Büyükkıdık, S. (2022). A bibliometric analysis: A tutorial for the bibliometrix package in R using IRT literature. Journal of Measurement and Evaluation in Education and Psychology, 13(3), 164-193. doi: https://doi.org/10.21031/epod.1069307
  • Camilli, G., & Shepard, L.A. (1994). Methods for identifying biased test items. California: Sage.
  • Chadegani, A.A., Salehi, H., Yunus, M.M., Farhadi, H., Fooladi, M., Farhadi, M., & Ebrahim, N.A. (2013). A comparison between two main academic literature collections: Web of Science and Scopus databases. Asian Social Science, 9(5), 18–26.
  • Chalmers, R.P. (2012). mirt: A multidimensional item response theory package for the R environment. Journal of Statistical Software, 48(6), 1–29. doi: https://doi.org/10.18637/jss.v048.i06
  • Crocker, L., & Algina, J. (1986). Introduction to classical and modern test theory. New York: Holt, Rinehart and Winston.
  • Çakıcı-Eser, D. (2015). Çok Boyutlu Madde Tepki Kuramının Farklı Modellerinden Çeşitli Koşullar Altında Kestirilen Parametrelerin İncelenmesi [Yayımlanmamış doktora tezi]. Hacettepe Üniversitesi Eğitim Bilimleri Enstitüsü, Ankara.
  • Davis, J.P., Eisenhardt, K.M., & Bingham, C.B. (2007). Developing theory through simulation methods. The Academy of Management Review, 32(2), 480–499. doi: https://doi.org/10.2307/20159312
  • Dean, J., & Ghemawat, S. (2008). MapReduce: Simplified data processing on large clusters. Communications of the ACM, 51(1), 107-113.
  • De Ayala, R.J. (2009). The theory and practice of item response theory. New York, (NY): The Guilford Press.
  • Egghe, L. (2013). Note on a possible decomposition of the h-Index. Journal of the American Society for Information Science and Technology, 64(4), 871-871. doi: https://doi.org/10.1002/asi.22855
  • Embretson, S.E., & Reise, S.P. (2000). Item response theory for psychologists. Mahwah, NJ: Lawrence Erlbaum Associates.
  • Erkuş, A. (2021). Davranış bilimleri için bilimsel araştırma süreci (7. Baskı). Ankara: Seçkin.
  • Fraenkel, J.R., & Wallen, N.E. (2009). How to design and evaluate research in education. New York, NY: McGraw-Hill Companies.
  • Garfield, E. (1980). Bradford’s Law and related statistical patterns. Essays of an Information Scientist, 4, 476-483. http://www.garfield.library.upenn.edu/essays/v4p476y1979-80.pdf
  • Glanzel, W. (2003). Bibliometrics as a research field: A course on theory and application of bibliometric indicators (Course handout). Retrieved from: https://www.researchgate.net/publication/378543284_Modelo_para_o_desenvolvimento_de_estrategia_de_busca_para_a_recuperacao_da_informacao_cientifica_e_tecnologica_o_caso_da_impressao_3D
  • Han, K.T. (2007). WinGen2: Windows software that generates IRT parameters and item responses [Computer program]. Amherst, MA: University of Massachusetts, Center for Educational Assessment.
  • Hambleton, R.K., & Jones R.W. (1993). An NCME instructional module on comparison of classical test theory and item response theory and their applications to test development. Educational Measurement: Issues and Practice, 12(3), 38-47. doi: https://doi.org/10.1111/j.1745-3992.1993.tb00543.x
  • Hambleton, R.K., & Swaminathan, H. (1985). Item response theory: Principles and applications. Norwell, MA: Kluwer Academic Publishers.
  • Hambleton, R.K., Swaminathan, H., & Rogers, H.J. (1991). Fundamentals of item response theory. California: Sage Publications Inc.
  • Harwell, M., Stone, C.A., Hsu, T.-C., & Kirisci, L. (1996). Monte Carlo studies in item response theory. Applied Psychological Measurement, 20(2), 101-125. doi: https://doi.org/10.1177/014662169602000201
  • Harwell, M., Kohli, N., & Peralta, Y. (2017). Experimental design and data analysis in computer simulation studies in the behavioral sciences. Journal of Modern Applied Statistical Methods, 16(2), 3-28.
  • Hirsch, J.E. (2005). An index to quantify an individual's scientific research output. Proceedings of the National Academy of Sciences, 102(46), 16569–16572. doi: https://doi.org/10.1073/pnas.0507655102
  • Hoaglin, D.C., & Andrews, D.F. (1975). The reporting of computation-based results in statistics. The American Statistician, 29, 122-126.
  • Holland, P.W., & Wainer, H. (1993). Differential item functioning. Educational Testing Service.
  • Horn, J.L. & McArdle, J.J. (1992). A practical and theoretical guide to measurement invariance in aging research. Experimental Aging Research, 18(3), 117–144.
  • Ihaka, R., & Gentleman, R. (1996). R: A language for data analysis and graphics. Journal of Computational and Graphical Statistics, 5(3), 299-314.
  • İnceoğlu, Ç. (2014). Türkiye’de sinemayı konu alan doktora tezleri üzerine bibliyometrik bir çözümleme. Galatasaray Üniversitesi İletişim Dergisi, (21), 31-50. doi: https://doi.org/10.16878/gsuilet.96674
  • Jacso, P. (2005). As we may search: Comparison of major features of the Web of Science, Scopus, and Google Scholar citation-based and citation-enhanced databases. Current Science, 89(9), 1537–1547.
  • Karadavut, T. (2019). The uniform prior for bayesian estimation of ability in item response theory models. International Journal of Assessment Tools in Education, 6(4), 568-579.
  • Kelecioğlu, H. (2024). Test eşitleme. Ankara: EPODDER.
  • Kothari, C.R. (2004). Research methodology: methods and techniques (2nd. ed.). New Delhi: New Age International Publishers.
  • Kousha, K., & Thelwall, M. (2007). Google Scholar citations and Google Web/URL citations: A multi-discipline exploratory analysis. Journal of the American Society for Information Science and Technology, 58(7), 1055-1065.
  • Köroğlu, S.A. (2015). Literatür taraması üzerine notlar ve bir tarama tekniği. GİDB Dergi (01), 61-69.
  • Li, K., Rollins, J., & Yan, E. (2018). Web of Science use in published research and review papers 1997-2017: A selective, dynamic, cross-domain, content-based analysis. Scientometrics, 115, 1–20. doi: https://doi.org/10.1007/s11192-017-2622-5
  • Livingston, S.A. (2004). Equating test scores (without IRT). Educational Testing Service.
  • Medeiros, E.L.L., de Sá, G., Ambrosio, P.G., Guimarães Filho, L.P., Bristot, V.M., Madeira, K., Yamaguchi, C.K., Ferreira, F.C.S., & Stefenon, S.F. (2018). Three-parameter logistic model (ML3): A bibliometrics analysis. International Journal of Advanced Engineering Research and Science, 5(4), 128–134. doi: https://doi.org/10.22161/ijaers.5.4.18
  • Meredith, W. (1993). Measurement invariance, factor analysis, and factorial invariance. Psychometrika, 58, 525–543.
  • Moher, D., Liberati, A., Tetzlaff, J., Altman, D.G., & Prisma Group. (2009). Reprint-preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. Physical Therapy, 89(9), 873-880. doi: https://doi.org/10.1093/ptj/89.9.873
  • Morris, T.P., White, I.R., & Crowther, M.J. (2019). Using simulation studies to evaluate statistical methods. Statistics in Medicine, 38(11), 2074–2102.
  • Ndwandwe, M., Bishop, D.G., Wise, R., & Rodseth, R. (2021). Bibliometrics to assess the productivity and impact of medical research. South African Journal of Higher Education 35(4), 224-36. doi: https://doi.org/10.20853/35-4-4341.
  • Nydick, S. (2022). catIrt: Simulate IRT-based computerized adaptive tests [Computer software]. doi: https://doi.org/10.32614/cran.package.catirt
  • Partchev, I. (2022). irtoys: A collection of functions related to item response theory (IRT) [Computer software]. Available from https://CRAN.R-project.org/package=irtoys.
  • Ranganathan, K., & Foster, I. (2003). Simulation studies of computation and data scheduling algorithms for data grids. Journal of Grid Computing 1, 53–62. doi: https://doi.org/10.1023/A:1024035627870
  • Revelle, W. (2025). psych: Procedures for psychological, psychometric, and personality research [Computer software]. doi: https://doi.org/10.32614/cran.package.psych
  • Rizopoulos, D. (2006). ltm: An R package for latent variable modeling and item response analysis. Journal of Statistical Software, 17(5), 1–25. doi: https://doi.org/10.18637/jss.v017.i05
  • Selçuk, E. & Demir, E. (2023a). R yazılım dili paketleriyle üretilen simülatif verilerin madde tepki kuramı varsayımları açısından incelenmesi. 10. International Eurasian Educational Research Congress içinde (s. 177-195). Anı yayıncılık. https://drive.google.com/file/d/19JpqAb9eQU-SXHfSNhecGfplJmFmfiR5/view
  • Selçuk, E., & Demir, E. (2023b). Eğitimde ölçme ve değerlendirme alanında veri simülasyonuyla yapılan lisansüstü tezlerin tematik ve metodolojik açıdan incelenmesi. International Education Congress içinde (s. 709-719). Edu yayıncılık. https://educongress.org/wp-content/uploads/2023/10/EDUCONGRESS-2023CONFERENCE-PROCEEDING.pdf
  • Selçuk, E., & Demir, E. (2024). Comparison of item response theory ability and item parameters according to classical and Bayesian estimation methods. International Journal of Assessment Tools in Education, 11(2), 213-248. doi: https://doi.org/10.21449/ijate.1290831
  • van Rossum, G., & Drake, F.L. (2009). Python 3 Reference Manual. Python Software Foundation.
  • von Davier, A.A. (2011). Statistical models for test equating, scaling, and linking. USA: Springer.
  • Weiss, D.J. & Kingsbury, G.G. (1984). Application of computer adaptive testing to educational problems. Journal of Educational Measurement, 21(4), 361–375. https://doi.org/10.1111/j.1745-3984.1984.tb01040.x
  • Yurtçu, M., & Güzeller, C. (2021). Bibliometric analysis of articles on computerized adaptive testing. Participatory Educational Research, 8(4), 426-438. https://doi.org/10.17275/per.21.98.8.4
  • Zafrullah, Z., Bakti, A.A., Riantoro, E.S., Kastara, R., Prasetyo, Y.B.A., Rosidah, R., Fitriani, A., Fitria, R.L., Ramadhani, A.M., & Ulwiyah, S. (2023). Item response theory in education: A biblioshiny analysis (1987-2023). Journal of Education Global, 1(1), 101–114.
  • Zaharia, M., Chowdhury, M., Franklin, M.J., Shenker, S., & Stoica, I. (2010). Spark: Cluster computing with working sets. Proceedings of the 2nd USENIX Conference on Hot Topics in Cloud Computing (HotCloud).
  • Zupic, I., & Cater, T. (2015). Bibliometric methods in management and organization. Organizational Research Methods, 18(3), 429-472. doi: https://doi.org/10.1177/1094428114562629

Madde Tepki Kuramı Alanında Yapılan Simülasyon Tabanlı Çalışmaların Bibliyometrik Analizi


Abstract

This study aimed to examine the bibliometric profiles of simulation-based studies in the field of Item Response Theory (IRT). The study is descriptive in design. Data were obtained from the Web of Science (WoS) and Scopus databases, which were searched using the concepts "item response theory" and "simulation" in the topic fields (title, abstract, and keywords). Applying the inclusion criteria, 1246 research articles were retrieved from the WoS database and 1524 from the Scopus database. The 923 studies indexed in both databases were removed, and the study continued with 1635 articles. The Bibliometrix package, developed in R, was used to analyze the data. Accordingly, 1635 studies from 314 sources, published between 01/01/1982 and 31/03/2024, were examined. It was observed that studies in this field were most often published in Applied Psychological Measurement; the most cited source was Psychometrika; the most prolific researcher was Wang, Wen-Chung; the most cited researcher was Cai, Li; the most prolific university was Northeast Normal University (China); the country with the most publications was the USA; the most cited article was "A Bayesian random effects model for testlets"; the most frequently used keyword was "item response theory"; and the most collaboration occurred between the USA and China. Accordingly, researchers planning simulation studies in item response theory are recommended to examine the bibliometric profiles revealed in this study.
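The literature surveyed here centers on Monte Carlo simulation in IRT. As a purely illustrative sketch, not drawn from any package in the reference list, the following generates dichotomous responses under a two-parameter logistic (2PL) model; the item parameters and sample size are arbitrary:

```python
# Minimal Monte Carlo sketch of the kind of IRT data generation the surveyed
# studies perform (2PL model). Item parameters (a, b) are hypothetical.
import math
import random

random.seed(42)  # fixed seed for a reproducible replication

def p_correct(theta, a, b):
    """2PL item response function: P(X = 1 | theta, a, b)."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def simulate_responses(n_persons=500, items=((1.0, 0.0), (1.5, -0.5), (0.8, 1.0))):
    """Draw abilities from N(0, 1) and sample dichotomous item responses."""
    data = []
    for _ in range(n_persons):
        theta = random.gauss(0.0, 1.0)
        data.append([1 if random.random() < p_correct(theta, a, b) else 0
                     for a, b in items])
    return data

data = simulate_responses()
print(len(data), len(data[0]))  # 500 persons x 3 items
```

Simulation studies of the kind counted in this review typically repeat such a draw over many replications and design cells (sample size, test length, model) and then compare parameter recovery across estimation methods.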

Ethical Statement

This study was not conducted on any group of living subjects. The data were obtained from the Web of Science and Scopus databases; the study is therefore a document analysis. For this reason, ethics committee approval was not required.


Details

Primary Language Turkish
Subjects Measurement and Evaluation in Education (Other)
Journal Section Articles
Authors

Eray Selçuk 0000-0003-4033-4219

Ergül Demir 0000-0002-3708-8013

Publication Date April 30, 2025
Submission Date June 28, 2024
Acceptance Date April 10, 2025
Published in Issue Year 2025 Volume: 45 Issue: 1

Cite

APA Selçuk, E., & Demir, E. (2025). Madde Tepki Kuramı Alanında Yapılan Simülasyon Tabanlı Çalışmaların Bibliyometrik Analizi. Gazi Üniversitesi Gazi Eğitim Fakültesi Dergisi, 45(1), 229-278. https://doi.org/10.17152/gefad.1506639
AMA Selçuk E, Demir E. Madde Tepki Kuramı Alanında Yapılan Simülasyon Tabanlı Çalışmaların Bibliyometrik Analizi. GUJGEF. April 2025;45(1):229-278. doi:10.17152/gefad.1506639
Chicago Selçuk, Eray, and Ergül Demir. “Madde Tepki Kuramı Alanında Yapılan Simülasyon Tabanlı Çalışmaların Bibliyometrik Analizi”. Gazi Üniversitesi Gazi Eğitim Fakültesi Dergisi 45, no. 1 (April 2025): 229-78. https://doi.org/10.17152/gefad.1506639.
EndNote Selçuk E, Demir E (April 1, 2025) Madde Tepki Kuramı Alanında Yapılan Simülasyon Tabanlı Çalışmaların Bibliyometrik Analizi. Gazi Üniversitesi Gazi Eğitim Fakültesi Dergisi 45 1 229–278.
IEEE E. Selçuk and E. Demir, “Madde Tepki Kuramı Alanında Yapılan Simülasyon Tabanlı Çalışmaların Bibliyometrik Analizi”, GUJGEF, vol. 45, no. 1, pp. 229–278, 2025, doi: 10.17152/gefad.1506639.
ISNAD Selçuk, Eray - Demir, Ergül. “Madde Tepki Kuramı Alanında Yapılan Simülasyon Tabanlı Çalışmaların Bibliyometrik Analizi”. Gazi Üniversitesi Gazi Eğitim Fakültesi Dergisi 45/1 (April 2025), 229-278. https://doi.org/10.17152/gefad.1506639.
JAMA Selçuk E, Demir E. Madde Tepki Kuramı Alanında Yapılan Simülasyon Tabanlı Çalışmaların Bibliyometrik Analizi. GUJGEF. 2025;45:229–278.
MLA Selçuk, Eray and Ergül Demir. “Madde Tepki Kuramı Alanında Yapılan Simülasyon Tabanlı Çalışmaların Bibliyometrik Analizi”. Gazi Üniversitesi Gazi Eğitim Fakültesi Dergisi, vol. 45, no. 1, 2025, pp. 229-78, doi:10.17152/gefad.1506639.
Vancouver Selçuk E, Demir E. Madde Tepki Kuramı Alanında Yapılan Simülasyon Tabanlı Çalışmaların Bibliyometrik Analizi. GUJGEF. 2025;45(1):229-78.