Session Information
09 SES 03 A, Findings from PISA: Reading and Literacy Issues (Part 3)
Paper Session
Contribution
Rationale
Role of reading in solving mathematics problems
The important role of language, including reading, in student performance on assessments in school academic areas such as mathematics has been addressed in a number of studies (e.g., Treacy, 1944; Cummins, Kintsch, Reusser, & Weimer, 1988; Rothman & Cohen, 1989; Lepik, 1990; Hembree, 1992). Although positive relations between various aspects of reading and mathematics performance have been investigated (McIntosh & Bear, 1993; Pyke, 2003; Voyer & Sullivan, 2003), few studies have focused on the interaction between the context of a mathematics item and student reading proficiency, and most of these were concerned only with verbal loading (e.g., Bleistein & Wright, 1987). Recently, a low correlation between verbal loading and item difficulty was reported for PISA mathematics items (OECD, 2010).
Psychologically, according to Artzt and Armour-Thomas (1992), reading is one of six metacognitive categories in a framework for protocol analysis of mathematics problem solving: read, understand, analyse, explore, plan/implement, and verify. Within a given mathematics test, therefore, solving some items may require more reading ability, while solving other items may require less, depending on their constructs or formats.
PISA mathematics and research questions
The Programme for International Student Assessment (PISA) is an ongoing, periodic international comparative study of the proficiency of 15-year-old students in mathematics, reading, and science. The survey was first conducted in 2000 and has been repeated every three years since. Forty-one countries participated in PISA 2003, in which mathematics was the main focus. PISA has a literacy orientation in all three domains. The item formats in PISA include multiple-choice, complex multiple-choice, closed constructed-response, short-response, and open constructed-response.
In relation to item format and differential performance in international tests, Lapointe, Mead, and Askew (1992) noted that the various item formats used were not equally familiar to students from all participating countries. In another study using TIMSS data, O'Leary (2002) showed that the choice of item format (multiple-choice, short response, or open constructed-response) could be one of the factors influencing the rankings of countries. Klieme and Baumert (2001) investigated country DIF in TIMSS mathematics items for upper secondary students in six countries (Austria, France, Germany, Sweden, Switzerland, and the United States) and, combining this with analyses of the cognitive demands of the test items, identified relative strengths and weaknesses of students from each country by item content demands.
Specifically, this study implemented an Item Response Theory (IRT) method to detect differential item functioning (DIF) between lower and higher reading ability groups in each PISA country. Relative weaknesses and strengths of these student groups were identified and discussed by item format (a simplified sketch of this kind of DIF analysis follows the research questions below). The main research questions were as follows.
· Are there unexpected differences in the mathematics performance of students at different reading ability levels that are associated with item format?
· Is there a significant correlation between item difficulty and DIF?
· Is there a significant correlation between item discrimination and DIF?
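The abstract cites ConQuest (Wu, Adams, & Wilson, 1997) for IRT modelling, and the actual analysis is not reproduced here. The following is a minimal illustrative sketch of one standard Rasch-based DIF check on simulated dichotomous data: item difficulties are estimated separately for a lower- and a higher-reading group (via Choppin's pairwise method rather than ConQuest), and the between-group difference in centred difficulties serves as a rough DIF index. All group sizes, item counts, and effect sizes are hypothetical.

```python
"""Illustrative sketch only: Rasch-based DIF between reading ability groups.

This stand-in estimates Rasch item difficulties separately for a lower-
and a higher-reading group using Choppin's pairwise method and treats
the between-group difficulty difference as a DIF index. All data are
simulated and all parameters are hypothetical.
"""
import numpy as np

rng = np.random.default_rng(42)


def simulate_responses(abilities, difficulties):
    """Dichotomous Rasch responses: P(correct) = logistic(theta - b)."""
    p = 1.0 / (1.0 + np.exp(-(abilities[:, None] - difficulties[None, :])))
    return (rng.random(p.shape) < p).astype(int)


def pairwise_rasch_difficulties(x):
    """Centred Rasch difficulties via Choppin's pairwise method.

    For items i and j, log(n_ij / n_ji) estimates b_j - b_i, where n_ij
    counts persons answering i correctly and j incorrectly; averaging
    over i yields difficulties centred at zero.
    """
    k = x.shape[1]
    n = np.zeros((k, k))
    for i in range(k):
        for j in range(k):
            if i != j:
                n[i, j] = np.sum((x[:, i] == 1) & (x[:, j] == 0))
    n += 0.5  # smoothing so empty cells do not produce log(0)
    log_ratio = np.log(n / n.T)    # entry (i, j) estimates b_j - b_i
    return log_ratio.mean(axis=0)  # column mean estimates b_j - mean(b)


# Hypothetical setup: 12 items, the last 4 standing in for a reading-heavy
# format (e.g., open constructed-response), made 0.5 logits harder for the
# lower-reading group so the simulation has built-in DIF.
n_items = 12
true_b = rng.normal(0.0, 1.0, n_items)
heavy = np.arange(8, 12)

b_low = true_b.copy()
b_low[heavy] += 0.5

x_low = simulate_responses(rng.normal(-0.3, 1.0, 600), b_low)
x_high = simulate_responses(rng.normal(0.3, 1.0, 600), true_b)

dif = pairwise_rasch_difficulties(x_low) - pairwise_rasch_difficulties(x_high)
for i, d in enumerate(dif):
    label = "reading-heavy" if i in heavy else "other"
    print(f"item {i:2d} ({label:13s}): DIF index = {d:+.2f}")
```

Because each group's difficulties are centred on that group's own scale, overall ability differences between the groups cancel out; only relative shifts in item difficulty, such as reading-heavy items being disproportionately harder for weaker readers, show up in the index.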
Method
Expected Outcomes
References
Artzt, A. F., & Armour-Thomas, E. (1992). Development of a cognitive-metacognitive framework for protocol analysis of group problem solving in mathematics. Cognition and Instruction, 9(2), 137–175.
Cummins, D. D., Kintsch, W., Reusser, K., & Weimer, R. (1988). The role of understanding in solving word problems. Cognitive Psychology, 20, 405–438.
Hembree, R. (1992). Experiments and relational studies in problem solving: A meta-analysis. Journal for Research in Mathematics Education, 23, 242–273.
Klieme, E., & Baumert, J. (2001). Identifying national cultures of mathematics education: Analysis of cognitive demands and differential item functioning in TIMSS. European Journal of Psychology of Education, 19(3), 385–402.
Lapointe, A. E., Mead, N. A., & Askew, J. M. (1992). Learning mathematics. Princeton, NJ: Educational Testing Service.
Lepik, M. (1990). Algebraic word problems: Role of linguistic and structural variables. Educational Studies in Mathematics, 21, 83–90.
Masters, G. N. (1982). A Rasch model for partial credit scoring. Psychometrika, 47, 149–174.
McIntosh, M. E., & Bear, D. (1993). Directed reading-thinking activities to promote learning through reading in mathematics. The Clearing House, 67, 40–45.
OECD (2010). Learning mathematics for life: A perspective from PISA. Paris: OECD.
O'Leary, M. (2002). The stability of country rankings across item formats in TIMSS. Educational Measurement: Issues and Practice, 21(4), 27–38.
Pyke, C. L. (2003). The use of symbols, words, and diagrams as indicators of mathematical cognition: A causal model. Journal for Research in Mathematics Education, 34(5), 406–432.
Rothman, R. W., & Cohen, J. (1989). The language of math needs to be taught. Academic Therapy, 25, 133–142.
Treacy, J. P. (1944). The relationship of reading skills to the ability to solve arithmetic problems. Journal of Educational Research, 38, 86–96.
Wu, M. L., Adams, R. J., & Wilson, M. R. (1997). ConQuest: Generalised item response modelling software. Melbourne: Australian Council for Educational Research.