Research Article

Examining Group Differences in Mathematics Achievement: Explanatory Item Response Model Application

Year 2023, Volume: 20, Issue: 53, 385-395, 30.05.2023
https://doi.org/10.26466/opusjsr.1226914

Abstract

Students take many different exams throughout their educational lives. In these exams, various person and item characteristics can affect how students respond to the items. This study aimed to examine, with explanatory item response models, the effects of person and item predictors on the common mathematics exam results of 365 ninth-grade students. Because of their widespread use in the literature, gender and school type were included in the models as person variables, and cognitive domain, content domain, and booklet type as item variables. When the estimated item parameters were examined, the Rasch model yielded the smallest parameter values for all items. A comparison of the model-data fit of the four models showed that the latent regression and the latent regression linear logistic test models fit better than the Rasch model. Person and item predictors were then added to the model, the parameters obtained for each group of variables were compared, and differences between groups were observed for the school type, cognitive domain, and content domain variables, whereas the item parameters did not differ by gender or booklet type. Because these models provide more detailed information about the sources of differences in the estimated parameters, their wider use in educational and psychological research is considered worthwhile.
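To make the modeling approach concrete, the display below is a minimal sketch of the four models compared, written in the generalized linear mixed model notation common in the explanatory item response modeling literature (De Boeck & Wilson, 2004). The mapping of the study's predictors onto the covariate sets Z and X is an assumption inferred from the abstract, not the authors' exact model specification.

\[ \text{Rasch:} \quad \operatorname{logit} P(Y_{pi}=1) = \theta_p - \beta_i, \qquad \theta_p \sim N(0,\sigma_\theta^2) \]
\[ \text{Latent regression:} \quad \operatorname{logit} P(Y_{pi}=1) = \sum_{j} \gamma_j Z_{pj} + \varepsilon_p - \beta_i \]
\[ \text{LLTM:} \quad \operatorname{logit} P(Y_{pi}=1) = \theta_p - \sum_{k} \eta_k X_{ik} \]
\[ \text{Latent regression LLTM:} \quad \operatorname{logit} P(Y_{pi}=1) = \sum_{j} \gamma_j Z_{pj} + \varepsilon_p - \sum_{k} \eta_k X_{ik} \]

Here \(Y_{pi}\) is the scored response of person \(p\) to item \(i\); \(Z_{pj}\) denotes person covariates (here, presumably dummy codes for gender and school type); \(X_{ik}\) denotes item covariates (cognitive domain, content domain, booklet type); and \(\varepsilon_p \sim N(0,\sigma^2)\) is a residual person effect. In this framing, group differences correspond to non-zero \(\gamma_j\) or \(\eta_k\) coefficients, whereas the Rasch model estimates a separate difficulty \(\beta_i\) for every item and leaves ability unexplained.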

Details

Primary Language: English
Section: Research Articles
Authors

Erdem Boduroğlu

Duygu Anıl (ORCID: 0000-0002-1745-4071)

Early View Date: May 31, 2023
Publication Date: May 30, 2023
Published Issue: Year 2023, Volume: 20, Issue: 53

How to Cite

APA Boduroğlu, E., & Anıl, D. (2023). Examining Group Differences in Mathematics Achievement: Explanatory Item Response Model Application. OPUS Journal of Society Research, 20(53), 385-395. https://doi.org/10.26466/opusjsr.1226914