Research Article

Examining Scale Items in Terms of Method Effects Based on the Bifactor Item Response Theory Model

Year 2021, Volume: 29 Issue: 1, 201 - 209, 16.01.2021

Abstract

The current study applies a unidimensional item response theory model (the graded response model) and a multidimensional one (the bifactor model) to evaluate the presence of method effects in data obtained from the administration of the scale measuring students' emotional school engagement as part of the 2015 Programme for International Student Assessment (PISA). The model-data fit of the unidimensional and bifactor conceptualizations was compared on data from a large sample of school children in Turkey. The item parameters and model-data fit statistics provided evidence that the PISA 2015 school engagement scale measures a primary construct (emotional school engagement) along with two nuisance factors (negative and positive wording effects). The results of the present study support the use of the bifactor model for evaluating the presence of method effects. Researchers using the PISA Emotional School Engagement Scale are therefore recommended to employ more sophisticated statistical techniques such as the bifactor item response theory model.
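
For readers unfamiliar with the two models compared above, the sketch below gives their standard logistic formulations from the IRT literature. This is a generic illustration, not the article's own notation: the symbols (item discriminations a, thresholds b and intercepts d, general factor θ_G, specific wording factors θ_S) are assumptions of the sketch.

% Unidimensional graded response model: boundary probability that a response
% to item j falls in category k or higher, given a single latent trait theta.
P^{*}_{jk}(\theta) = \frac{1}{1 + \exp\left[-a_j(\theta - b_{jk})\right]},
\qquad
P_{jk}(\theta) = P^{*}_{jk}(\theta) - P^{*}_{j,k+1}(\theta)

% Bifactor graded response model: each item loads on the general factor theta_G
% (emotional school engagement) and on exactly one specific factor theta_{S(j)}
% (a positive- or negative-wording method factor), with the factors orthogonal.
P^{*}_{jk}(\theta_G, \theta_{S(j)}) =
\frac{1}{1 + \exp\left[-\left(a_{jG}\,\theta_G + a_{jS}\,\theta_{S(j)} + d_{jk}\right)\right]}

Comparing the model-data fit of these two formulations (e.g., with information criteria or likelihood-based fit statistics) is what allows the wording factors to be treated as method effects rather than substantive dimensions.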

References

  • Appleton, J. J., Christenson, S. L., Kim, D., & Reschly, A. L. (2006). Measuring cognitive and psychological engagement: Validation of the Student Engagement Instrument. Journal of School Psychology, 44, 427-445.
  • Archambault, I., Janosz, M., Fallu, J. S., & Pagani, L. S. (2009). Student engagement and its relationship with early high school dropout. Journal of Adolescence, 32, 651-670.
  • Baker, F. (2001). The basics of item response theory. College Park, MD: ERIC Clearinghouse on Assessment and Evaluation, University of Maryland.
  • Barnette, J. J. (2000). Effects of stem and Likert response option reversals on survey internal consistency: If you feel the need, there is a better alternative to using those negatively worded stems. Educational and Psychological Measurement, 60(3), 361-370.
  • Brown, T. A. (2006). Confirmatory factor analysis for applied research. New York: The Guilford Press.
  • Chen, F. F., West, S. G., & Sousa, K. H. (2006). A comparison of bifactor and second-order models of quality of life. Multivariate Behavioral Research, 41(2), 189-225.
  • Cole, K. L. M., Turner, R. C., & Gitchel, W. D. (2018). A study of reverse-worded matched item pairs using the generalized partial credit and nominal response models. Educational and Psychological Measurement, 78(1), 103-127.
  • Craft, A. M., & Capraro, R. M. (2017). Science, technology, engineering, and mathematics project-based learning: Merging rigor and relevance to increase student engagement. Electronic International Journal of Education, Arts, and Science, 3(6), 140-158.
  • De Ayala, R. J. (2009). The theory and practice of item response theory. New York: The Guilford Press.
  • DeMars, C. (2010). Item response theory. New York: Oxford University Press, Inc.
  • DeMars, C. E. (2006). Application of the bi-factor multidimensional item response theory model to testlet-based tests. Journal of Educational Measurement, 43(2), 145-168.
  • DiStefano, C., & Motl, R. W. (2006). Further investigating method effects associated with negatively worded items on self-report surveys. Structural Equation Modeling, 13(3), 440-464.
  • DiStefano, C., & Motl, R. W. (2009). Personality correlates of method effects due to negatively worded items on the Rosenberg Self-Esteem scale. Personality and Individual Differences, 46, 309-313.
  • Embretson, S. E., & Reise, S. P. (2000). Item response theory for psychologists. New Jersey: Lawrence Erlbaum Associates, Inc.
  • Fletcher, D., & Hattie, J. A. (2004). An examination of the psychometric properties of the physical self-description questionnaire using a polytomous item response model. Psychology of Sport and Exercise, 5, 423-446.
  • Fredricks, J. A., Blumenfeld, P. C., & Paris, A. H. (2004). School engagement: Potential of the concept, state of the evidence. Review of Educational Research, 74(1), 59-109.
  • Gu, H., Wen, Z., & Fan, X. (2015). The impact of wording effect on reliability and validity of the Core Self-Evaluation Scale (CSES): A bi-factor perspective. Personality and Individual Differences, 83, 142-147.
  • Horan, P. M., DiStefano, C., & Motl, R. W. (2003). Wording effects in self-esteem scales: Methodological artifact or response style? Structural Equation Modeling, 10(3), 435-455.
  • Hyland, P., Boduszek, D., Dhingra, K., Shevlin, M., & Egan, A. (2014). A bifactor approach to modelling the Rosenberg Self Esteem Scale. Personality and Individual Differences, 66, 188-192.
  • Immekus, J., & Imbrie, P. K. (2008). Dimensionality assessment using the full-information item bifactor analysis for graded response data: An illustration with the State Metacognitive Inventory. Educational and Psychological Measurement, 68(4), 695-709.
  • Karasar, N. (2005). Bilimsel araştırma yöntemi: Kavramlar, ilkeler, teknikler [Scientific research method: Concepts, principles, techniques]. Ankara: Nobel Yayın Dağıtım.
  • Kumar, A., & Dillon, W. R. (1992). An integrative look at the use of additive and multiplicative covariance structure models in the analysis of MTMM data. Journal of Marketing Research, 29(1), 51-64.
  • Li, Y., Jiao, H., & Lissitz, R. W. (2012). Applying multidimensional item response theory models in validating test dimensionality: An example of K–12 large scale science assessment. Journal of Applied Testing Technology, 13(2), 2-27.
  • McKay, M. T., Boduszek, D., & Harvey, S. A. (2014). The Rosenberg Self-Esteem Scale: A bifactor answer to a two-factor question? Journal of Personality Assessment, 96(6), 654-660.
  • Min, H., Zickar, M., & Yankov, G. (2018). Understanding item parameters in personality scales: An explanatory item response modeling approach. Personality and Individual Differences, 128, 1-6.
  • Organisation for Economic Co-operation and Development (2013). PISA 2012 results: Ready to learn: Students’ engagement, drive and self-beliefs (Volume III). Retrieved from https://www.oecd.org/pisa/keyfindings/PISA-2012-results-volume-III.pdf
  • Paulhus, D. L. (1991). Measurement and control of response bias. In J. P. Robinson, P. R. Shaver, & L. S. Wrightsman (Eds.), Measures of personality and social psychological attitudes (Vol. 1, pp. 17-59). San Diego, CA: Academic Press.
  • Ray, J. V., Frick, P. J., Thornton, L. C., Steinberg, L., & Cauffman, E. (2016). Positive and negative item wording and its influence on the assessment of Callous Unemotional Traits. Psychological Assessment, 28(4), 394-404.
  • Reise, S. P. (2012). The rediscovery of bifactor measurement models. Multivariate Behavioral Research, 47, 667-696.
  • Reise, S. P., Ventura, J., Keefe, R. S. E., Baade, L. E., Gold, J. M., Green, M. F., … Bilder, R. (2011). Bifactor and item response theory analyses of interviewer report scales of cognitive impairment in schizophrenia. Psychological Assessment, 23(1), 245-261.
  • Suárez-Álvarez, J., Pedrosa, I., Lozano, L. M., García-Cueto, E., Cuesta, M., & Muñiz, J. (2018). Using reversed items in Likert scales: A questionable practice. Psicothema, 30(2), 149-158.
  • Supple, A. J., & Plunkett, S. W. (2011). Dimensionality and validity of the Rosenberg Self-Esteem Scale for use with Latino adolescents. Hispanic Journal of Behavioral Sciences, 33(1), 39-53.
  • Thissen, D., & Wainer, H. (2001). Test scoring. NJ: Lawrence Erlbaum Associates, Inc.
  • Tomas, J. M., & Oliver, A. (1999). Rosenberg's self-esteem scale: Two factors or method effects. Structural Equation Modeling: A Multidisciplinary Journal, 6(1), 84-98.
  • Wang, J., Siegal, H. A., Falck, R. S., & Carlson, R. G. (2001). Factorial structure of Rosenberg's Self-Esteem Scale among crack-cocaine drug users. Structural Equation Modeling, 8(2), 275-286.
  • Wang, Y., Kim, E. U., Dedrick, R. F., Ferron, J. M., & Tan, T. (2018). A multilevel bifactor approach to construct validation of mixed-format scales. Educational and Psychological Measurement, 78(2), 253-271.
  • Weems, G. H., Onwuegbuzie, A. J., Schreiber, J. B., & Eggers, S. J. (2003). Characteristics of respondents who respond differently to positively and negatively worded items on rating scales. Assessment & Evaluation in Higher Education, 28(6), 588-607.
  • Weijters, B., & Baumgartner, H. (2012). Misresponse to reversed and negated items in surveys: A review. Journal of Marketing Research, 49(5), 737-747.
  • Wells, C., & Wollack, J. A. (2003). An instructor’s guide to understanding test reliability. Madison: University of Wisconsin, Testing & Evaluation Services.
  • Wouters, E., Booysen, F. L. R., Ponnet, K., & Baron Van Loon, F. (2012). Wording effects and the factor structure of the Hospital Anxiety & Depression Scale in HIV/AIDS patients on antiretroviral treatment in South Africa. PLoS ONE, 7(4), 1-10.
  • Ye, S. (2009). Factor structure of the General Health Questionnaire (GHQ-12): The role of wording effects. Personality and Individual Differences, 46, 197-201.
There are 41 citations in total.

Details

Primary Language English
Subjects Studies on Education
Journal Section Research Article
Authors

Seval Kartal (ORCID: 0000-0002-3018-6972)

Publication Date January 16, 2021
Acceptance Date September 2, 2020
Published in Issue Year 2021 Volume: 29 Issue: 1

Cite

APA Kartal, S. (2021). Examining Scale Items in Terms of Method Effects Based on the Bifactor Item Response Theory Model. Kastamonu Education Journal, 29(1), 201-209.
