Research Article

Examining Students' Item Response Times in eTIMSS According to their Proficiency Levels, Self-Confidence and Item Characteristics

Year 2022, Volume: 13 Issue: 1, 23 - 39, 29.03.2022
https://doi.org/10.21031/epod.999545

Abstract

The aim of this study was to examine whether the time that Turkish eighth-grade students participating in the Trends in International Mathematics and Science Study (TIMSS) spent answering science and mathematics items differed significantly according to their proficiency levels, self-confidence, and the characteristics of the items. The study was designed as correlational research exploring the relationships among these variables. The participants were 577 students who took part in TIMSS 2019 at the eighth-grade level in Turkey and answered the 24 items (11 mathematics and 13 science) common to Booklets 1 and 2. The Kruskal-Wallis H test, the Mann-Whitney U test, and latent class analysis were used in the data analysis. The results indicated that item type and cognitive level were significantly related to students' item response times. Students spent more time on open-ended items than on multiple-choice items, and the time spent on items at the applying level was significantly longer than the time spent on items at the knowing level; however, there was no significant difference between the time spent on items at the applying level and at the reasoning level. Students with high self-confidence in science had a higher rate of correct answers and answered the items in a shorter amount of time, whereas students who were somewhat self-confident in mathematics were more successful on difficult mathematics items and spent less time answering them.
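The group comparisons named above (Kruskal-Wallis H and Mann-Whitney U tests on item response times grouped by item characteristics) can be illustrated with a minimal sketch in Python using SciPy. This is not the study's own code; the file name and the column names (response_time, cognitive_domain, item_type) are assumptions made for illustration only.

    # Minimal sketch (not the authors' code) of the nonparametric comparisons
    # described in the abstract. The data file and column names are hypothetical.
    import pandas as pd
    from scipy.stats import kruskal, mannwhitneyu

    # One row per student-item pair, with the logged response time in seconds.
    df = pd.read_csv("etimss_response_times.csv")

    # Kruskal-Wallis H test: do response times differ across the three
    # TIMSS cognitive domains (knowing, applying, reasoning)?
    domain_groups = [g["response_time"].dropna()
                     for _, g in df.groupby("cognitive_domain")]
    h_stat, p_kw = kruskal(*domain_groups)
    print(f"Kruskal-Wallis H = {h_stat:.2f}, p = {p_kw:.4f}")

    # Mann-Whitney U test: multiple-choice vs. open-ended items.
    mc = df.loc[df["item_type"] == "multiple_choice", "response_time"].dropna()
    oe = df.loc[df["item_type"] == "open_ended", "response_time"].dropna()
    u_stat, p_mw = mannwhitneyu(mc, oe, alternative="two-sided")
    print(f"Mann-Whitney U = {u_stat:.2f}, p = {p_mw:.4f}")

When the Kruskal-Wallis test is significant, pairwise Mann-Whitney U comparisons with a corrected alpha are the usual follow-up, which corresponds to the applying-versus-knowing and applying-versus-reasoning comparisons reported in the abstract.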

References

  • Altuner, F. (2019). Examining the relationship between item statistics and item response time [Master’s thesis, Mersin University]. http://tez2.yok.gov.tr/
  • Birgili, B. (2014). Open ended questions as an alternative to multiple choice: Dilemma in Turkish examination system [Master’s thesis, Middle East Technical University]. http://tez2.yok.gov.tr/
  • Davison, M. L., Semmes, R., Huang, L., & Close, C. N. (2012). On the reliability and validity of a numerical reasoning speed dimension derived from response times collected in computerized testing. Educational and Psychological Measurement, 72(2), 245-263. https://doi.org/10.1177/0013164411408412
  • Fishbein, B., Foy, P., & Yin, L. (2021). TIMSS 2019 user guide for the international database. TIMSS & PIRLS International Study Center, Boston College.
  • Fraenkel, J. R., Wallen, N. E., & Hyun, H. H. (2012). How to design and evaluate research in education (8th ed.). McGraw-Hill.
  • Goldhammer, F., & Klein Entink, R. H. (2011). Speed of reasoning and its relation to reasoning ability. Intelligence, 39(2-3), 108-119. https://doi.org/10.1016/j.intell.2011.02.001
  • Goldhammer, F., Naumann, J., Stelter, A., Tóth, K., Rölke, H., & Klieme, E. (2014). The time on task effect in reading and problem solving is moderated by task difficulty and skill: Insights from a computer-based large-scale assessment. Journal of Educational Psychology, 106(3), 608-626. https://doi.org/10.1037/a0034716
  • Halkitis, P. N., Jones, J. P., & Pradhan, J. (1996, April 8-12). Estimating testing time: The effects of item characteristics on response latency [Paper presentation]. Annual meeting of the American Educational Research Association, New York.
  • International Association for the Evaluation of Educational Achievement (IEA). (n.d.). Trends in international mathematics and science study: Data & tools. https://www.iea.nl/index.php/data-tools/repository/timss
  • İlgün-Dibek, M. (2020). Silent predictors of test disengagement in PIAAC 2012. Journal of Measurement and Evaluation in Education and Psychology, 11(4), 430-450. https://doi.org/10.21031/epod.796626
  • İlhan, M., Boztunç Öztürk, N., & Şahin, M. G. (2020). The effect of the item’s type and cognitive level on its difficulty index: The sample of TIMSS 2015. Participatory Educational Research, 7(2), 47-59. https://doi.org/10.17275/per.20.19.7.2
  • Kahraman, N., Cuddy, M. C., & Clauser, B. E. (2013). Modeling pacing behavior and test speededness using latent growth curve models. Applied Psychological Measurement, 37(5), 343–360. https://doi.org/10.1177/0146621613477236
  • Klein Entink, R. H., Fox, J.-P., & van der Linden, W. J. (2009). A multivariate multilevel approach to the modeling of accuracy and speed of test takers. Psychometrika, 74(1), 21–48. https://doi.org/10.1007/s11336-008-9075-y
  • Lasry, N., Watkins, J., Mazur, E., & Ibrahim, A. (2013). Response times to conceptual questions. American Journal of Physics, 81(9), 703-706. https://doi.org/10.1119/1.4812583
  • Lee, Y.-H., & Chen, H. (2011). A review of recent response-time analyses in educational testing. Psychological Test and Assessment Modeling, 53(3), 359–379.
  • Lee, Y.-H., & Haberman, S. J. (2016). Investigating test-taking behaviors using timing and process data. International Journal of Testing, 16(3), 240-267. https://doi.org/10.1080/15305058.2015.1085385
  • Lee, Y.-H., & Jia, Y. (2014). Using response time to investigate students’ test-taking behaviors in a NAEP computer-based study. Large-scale Assessments in Education, 2(1), 1–24. https://doi.org/10.1186/s40536-014-0008-1
  • Miller, M. D., Linn, R. L., & Gronlund, N. E. (2009). Measurement and assessment in teaching (10th ed.). Prentice Hall.
  • Ministry of National Education. (2020). TIMSS 2019 Türkiye ön raporu [TIMSS 2019 Turkey preliminary report]. Eğitim Analiz Değerlendirme Raporları Serisi, No: 15.
  • Molenaar, D., Tuerlinckx, F., & van der Maas, H. L. (2015). A bivariate generalized linear item response theory modeling framework to the analysis of responses and response times. Multivariate Behavioral Research, 50(1), 56–74. https://doi.org/10.1080/00273171.2014.962684
  • Mullis, I. V. S., Martin, M. O., Foy, P., Kelly, D., & Fishbein, B. (2020). TIMSS 2019 international results in mathematics and science. Boston College, TIMSS & PIRLS International Study Center.
  • Petscher, Y., Mitchell, A. M., & Foorman, B. R. (2015). Improving the reliability of student scores from speeded assessments: An illustration of conditional item response theory using a computer-administered measure of vocabulary. Reading and Writing, 28, 31–56. https://doi.org/10.1007/s11145-014-9518-z
  • Ruddock, G. J., O'Sullivan, C. Y., Arora, A., & Erberber, E. (2008). Developing the TIMSS 2007 mathematics and science assessments and scoring guides. In J. F. Olson, M. O. Martin, & I. V. S. Mullis (Eds.), TIMSS 2007 technical report (pp. 13-44). International Study Center, Boston College.
  • Schnipke, D. L., & Scrams, D. J. (2002). Exploring issues of examinee behavior: Insights gained from response-time analyses. In C. N. Mills, M. T. Potenza, J. J. Fremer, & W. C. Ward (Eds.), Computer-based testing: Building the foundation for future assessments (pp. 237-266). Psychology Press.
  • Su, S., & Davison, M. L. (2019). Improving the predictive validity of reading comprehension using response times of correct item responses. Applied Measurement in Education, 32(2), 166-182. https://doi.org/10.1080/08957347.2019.1577247
  • van der Linden, W. J., & Guo, F. (2008). Bayesian procedures for identifying aberrant response-time patterns in adaptive testing. Psychometrika, 73(3), 365–384. https://doi.org/10.1007/s11336-007-9045-8
  • Wise, S. L., & DeMars, C. E. (2010). Examinee noneffort and the validity of program assessment results. Educational Assessment, 15(1), 27-41. https://doi.org/10.1080/10627191003673216
  • Wise, S. L., & Kingsbury, G. G. (2016). Modeling student test-taking motivation in the context of an adaptive achievement test. Journal of Educational Measurement, 53(1), 86-105. https://doi.org/10.1111/jedm.12102
  • Yavuz, H. Ç. (2019). The effects of log data on students’ performance. Journal of Measurement and Evaluation in Education and Psychology, 10(4), 378-390. https://doi.org/10.21031/epod.564232

Details

Primary Language: English
Journal Section: Articles
Authors: Seher Yalçın (ORCID: 0000-0003-0177-6727)
Publication Date: March 29, 2022
Acceptance Date: December 27, 2021
Published in Issue: Year 2022, Volume 13, Issue 1

Cite

APA Yalçın, S. (2022). Examining Students’ Item Response Times in eTIMSS According to their Proficiency Levels, Self-Confidence and Item Characteristics. Journal of Measurement and Evaluation in Education and Psychology, 13(1), 23-39. https://doi.org/10.21031/epod.999545