Session Information
09 SES 12 A, Findings from International Comparative Achievement Studies: Methodological Challenges
Paper Session
Contribution
International achievement studies (such as PISA, PIRLS, and TIMSS) are today conducted on a regular basis, and their results often serve as the basis for national educational decisions. In these studies, a common test is translated into the languages of the participating countries. When several different-language test versions are employed, it is important to ensure that the versions are equivalent, or comparable, to each other – that they measure the same construct at a comparable level of difficulty. If this is not the case, the validity of inferences made on the basis of the test is threatened.
Clearly, translating achievement tests and ensuring their equivalence is a highly demanding and consequential task. Rigorous translation procedures have therefore been developed to guarantee high-quality, equivalent translations. However, no agreement seems to have been reached on how best to translate these tests, judging from the widely differing translation procedures followed by the organizations administering them (e.g. the IEA and the OECD). Moreover, both research and experience suggest that problems have arisen in implementing the procedures and translating the tests (e.g. Hambleton, 2002, 2005) and that equivalence has not always been attained (e.g. Bechger et al., 1998; Bonnet, 2002; Ercikan & Koh, 2005; Guérin-Pace & Blum, 2000).
The purpose of this paper is to discuss problems and issues encountered when translating international achievement tests and to explore solutions to them. Ultimately, the paper aims to improve the translation procedures used in these tests and thereby to increase the validity of the tests.
Method
Expected Outcomes
References
Bechger, T., van Schooten, E., de Glopper, C., & Hox, J. (1998). The validity of international surveys of reading literacy: The case of the Reading Literacy Study. Studies in Educational Evaluation, 24, 99-125.
Bonnet, G. (2002). Reflections in a critical eye: On the pitfalls of international assessment [Review of the book Knowledge and skills for life: First results from PISA 2000]. Assessment in Education, 9, 387-399.
Ercikan, K., & Koh, K. (2005). Examining the construct comparability of the English and French versions of TIMSS. International Journal of Testing, 5(1), 23-25.
Guérin-Pace, F., & Blum, A. (2000). The comparative illusion: The International Adult Literacy Survey. Population: An English Selection, 12, 215-246.
Hambleton, R. (2002). Adapting achievement tests into multiple languages for international assessments. In A. Porter & A. Gamoran (Eds.), Methodological advances in cross-national surveys of educational achievement (pp. 58-79). Washington, DC: National Academy Press.
Hambleton, R. (2005). Issues, designs, and technical guidelines for adapting tests into multiple languages and cultures. In R. Hambleton, P. Merenda & C. Spielberger (Eds.), Adapting educational and psychological tests for cross-cultural assessment (pp. 3-38). Mahwah, NJ: Erlbaum.
Harkness, J. (2003). Questionnaire translation. In J. Harkness, F. van de Vijver & P. Mohler (Eds.), Cross-cultural survey methods (pp. 35-56). Hoboken, NJ: Wiley.
Larson, M. (1998). Meaning-based translation: A guide to cross-language equivalence (2nd rev. ed.). Lanham, MD: University Press of America.
Munday, J. (2001). Introducing translation studies: Theories and applications. London: Routledge.