Revising the Metacognitive Awareness of Reading Strategies Inventory (MARSI) and testing for factorial invariance

Keywords

metacognition
reading comprehension
metacognitive awareness
reading strategies

How to Cite

Mokhtari, K., Dimitrov, D. M., & Reichard, C. A. (2018). Revising the Metacognitive Awareness of Reading Strategies Inventory (MARSI) and testing for factorial invariance. Studies in Second Language Learning and Teaching, 8(2), 219–246. https://doi.org/10.14746/ssllt.2018.8.2.3

Abstract

In this study, we revised the Metacognitive Awareness of Reading Strategies Inventory (MARSI), a self-report instrument designed to assess students’ awareness of reading strategies when reading school-related materials. We collected evidence of structural, generalizability, and external aspects of validity for the revised inventory (MARSI-R). We first conducted a confirmatory factor analysis of the MARSI instrument, which resulted in the reduction of the number of strategy statements from 30 to 15. We then tested MARSI-R for factorial invariance across gender and ethnic groups and found that there is a uniformity in student interpretation of the reading strategy statements across these groups, thus allowing for their comparison on levels of metacognitive processing skills. We found evidence of the external validity aspect of MARSI-R data through correlations of such data with a measure of the students’ perceived reading ability. Given that this journal is oriented to second language learning and teaching, our article also includes comments on the Survey of Reading Strategies (SORS), which was based on the original MARSI and was designed to assess adolescents’ and adults’ metacognitive awareness and perceived use of ESL reading strategies. We provide a copy of the MARSI-R instrument and discuss the implications of the study’s findings in light of new and emerging insights relative to assessing students’ metacognitive awareness and perceived use of reading strategies.
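The invariance-testing workflow described in the abstract (a confirmatory factor model fitted to the 15 retained items, then compared across gender and ethnic groups) can be illustrated in a few lines of code. The sketch below, written with the Python semopy package, shows only the first, configural step under assumed item names and factor assignments; the reference list suggests the authors' own analyses were conducted in Mplus or EQS, so this is an approximation of the general approach rather than the authors' procedure.

# Illustrative sketch only. Item names, the item-to-factor mapping, the
# grouping column, and the file name are all assumptions made for the example.
import pandas as pd
import semopy

ITEMS = [f"q{i}" for i in range(1, 16)]  # the 15 retained MARSI-R statements

# Three correlated factors mirroring the MARSI subscales
# (Global, Problem-Solving, Support); five assumed items each.
MODEL_DESC = """
GLOB =~ q1 + q2 + q3 + q4 + q5
PROB =~ q6 + q7 + q8 + q9 + q10
SUP  =~ q11 + q12 + q13 + q14 + q15
"""

def fit_group(items: pd.DataFrame) -> pd.DataFrame:
    """Fit the same-form (configural) model in one group and return fit indices."""
    model = semopy.Model(MODEL_DESC)
    model.fit(items)
    return semopy.calc_stats(model)  # chi-square, CFI, RMSEA, etc.

def configural_check(df: pd.DataFrame, group_col: str = "gender") -> pd.DataFrame:
    """Fit the model separately in each group.

    Stricter levels of invariance (equal loadings, then equal intercepts)
    would be tested by constraining those parameters across groups and
    comparing nested models, e.g., via the change in CFI
    (Cheung & Rensvold, 2002).
    """
    stats = {group: fit_group(sub[ITEMS]) for group, sub in df.groupby(group_col)}
    return pd.concat(stats)

# Hypothetical usage, assuming item-level responses in responses.csv:
# df = pd.read_csv("responses.csv")
# print(configural_check(df, group_col="gender"))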

References

American Educational Research Association, American Psychological Association, National Council on Measurement in Education, & Joint Committee on Standards for Educational and Psychological Testing (U.S.). (2014). Standards for educational and psychological testing. Washington, DC: American Educational Research Association.

Bentler, P. M. (2004). EQS 6: Structural equation program manual. Encino, CA: Multivariate Software.

Bråten, I., & Strømsø, H. L. (2011). Measuring strategic processing when students read multiple texts. Metacognition and Learning, 6, 111-130.

Byrne, B. M. (1988). The self-description questionnaire III: Testing for equivalent factorial validity across ability. Educational and Psychological Measurement, 48, 397-406.

Byrne, B. M., Shavelson, R. J., & Muthén, B. O. (1989). Testing for equivalence of factor covariance and mean structures: The issue of partial measurement invariance. Psychological Bulletin, 105, 456-466.

Cheung, G. W., & Rensvold, R. B. (2002). Evaluating goodness-of-fit indexes for testing measurement invariance. Structural Equation Modeling, 9(2), 233-255.

Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2nd ed.). Hillsdale, NJ: Lawrence Erlbaum.

Cromley, J. G., & Azevedo, R. (2006). Self-report of reading comprehension strategies: What are we measuring? Metacognition and Learning, 1, 229-247.

Desoete, A., & Özsoy, G. (Eds.). (2009). Metacognition [Special issue]. International Electronic Journal of Elementary Education, 2(1).

Dimitrov, D. M. (2012). Statistical methods for validation of assessment scale data in counseling and related fields. Alexandria, VA: American Counseling Association.

Garner, R. (1987). Metacognition and reading comprehension. Norwood, NJ: Ablex.

Gersten, R., Fuchs, L., Williams, J., & Baker, S. (2001). Teaching reading comprehension strategies to students with learning disabilities: A review of research. Review of Educational Research, 71, 279-320.

Guan, C. Q., Roehrig, A. D., Mason, R., & Meng, W. (2010). Psychometric properties of Meta-cognitive Awareness of Reading Strategy Inventory. Journal of Educational and Developmental Psychology, 1, 3-17.

Hacker, D. J., Dunlosky, J., & Graesser, A. C. (Eds.). (1998). Metacognition in educational theory and practice. Mahwah, NJ: Lawrence Erlbaum.

Hadwin, A. F., Winne, P. H., Stockley, D. B., Nesbit, J., & Woszczyna, C. (2001). Context moderates students’ self-reports about how they study. Journal of Educational Psychology, 93, 477-487.

Hancock, G. R. (2004). Experimental, quasi-experimental, and non-experimental design and analysis with latent variables. In D. Kaplan (Ed.), The SAGE handbook of quantitative methodology for the social sciences (pp. 317-334). Thousand Oaks, CA: Sage.

Hartman, H. J. (Ed.). (2001). Metacognition in learning and instruction: Theory, research and practice. Dordrecht, The Netherlands: Kluwer Academic Publishers.

Hu, L. T., & Bentler, P. M. (1999). Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Structural Equation Modeling, 6, 1-55.

Israel, S. E., Block, C. C., Bauserman, K. L., & Kinnucan-Welsch, K. (Eds.). (2005). Metacognition in literacy learning: Theory, assessment, instruction, and professional development. Mahwah, NJ: Lawrence Erlbaum.

Jöreskog, K. G., & Sörbom, D. (1979). Advances in factor analysis and structural equation models. Cambridge, MA: Abt Books.

Linacre, J. M. (2002). Optimizing rating scale category effectiveness. Journal of Applied Measurement, 3(1), 85-106.

McNamara, D. S. (2007). Reading comprehension strategies: Theories, interventions, and technologies. Mahwah, NJ: Lawrence Erlbaum.

McNamara, D. S. (2011). Measuring deep, reflective comprehension and learning strategies: Challenges and successes. Metacognition and Learning, 6, 195-203.

Marsh, H. W., Hau, K. T., & Wen, Z. (2004). In search of golden rules: Comment on hypothesis testing approaches to setting cut-off values for fit indexes and dangers in overgeneralizing findings. Structural Equation Modeling, 11, 320-341.

Messick, S. (1989). Validity. In R. L. Linn (Ed.), Educational measurement (3rd ed., pp. 13-103). New York, NY: American Council on Education.

Messick, S. (1995). Validity of psychological assessment: Validation of inferences from persons’ responses and performances as scientific inquiry into score meaning. American Psychologist, 50, 741-749.

Mokhtari, K., & Reichard, C. (2002). Assessing students’ metacognitive awareness of reading strategies. Journal of Educational Psychology, 94(2), 249-259.

Mokhtari, K., & Sheorey, R. (2002). Measuring ESL students’ awareness of reading strategies. Journal of Developmental Education, 25(3), 2-10.

Muthén, L. K., & Muthén, B. O. (1998-2012). Mplus user’s guide (7th ed.). Los Angeles, CA: Muthén & Muthén.

Pearson, P. D., & Gallagher, M. C. (1983). The instruction of reading comprehension. Contemporary Educational Psychology, 8, 317-344.

Pressley, M. (2000). What should comprehension instruction be the instruction of? In M. L. Kamil, P. B. Mosenthal, P. D. Pearson, & R. Barr (Eds.), Handbook of reading research (Vol. 3, pp. 545-561). Mahwah, NJ: Lawrence Erlbaum.

Schellings, G., & van Hout-Wolters, B. (2011). Measuring strategy use with self-report instruments [Special issue]. Metacognition and Learning, 6(1).

Snow, C. (2002). Reading for understanding: Toward an R&D program in reading comprehension. Santa Monica, CA: Rand Education.

Veenman, M. V. J. (2011). Alternative assessment of strategy use with self-report instruments: A discussion. Metacognition and Learning, 6, 205-211.

Veenman, M. V. J., Van Hout-Wolters, B. H. A. M., & Afflerbach, P. (2006). Metacognition and learning: Conceptual and methodological considerations. Metacognition and Learning, 1, 3-14.