Session Information
09 SES 12 C, Computer-Based Assessments
Paper Session
Contribution
Theoretical framework
Over the last ten years, the focus of discussions on complex problem solving has shifted toward more complex, real-life problems (e.g., Funke & Frensch, 2007). The ability to solve complex problems is considered a key competency and one that should be fostered in science education (Rocard Report, 2007).
The PISA 2003 problem solving framework postulated six problem solving steps that are necessary to solve a complex problem successfully (OECD, 2004). Combined with the German National Curriculum of Chemistry, these steps were differentiated into four statistically separable competencies (e.g., Tiemann, Koppelt & Scherer, 2010): (1) understanding and characterizing the problem (PUC), (2) representing the problem (PR), (3) solving the problem (PS), and (4) reflecting and communicating the solution (SRC). Each competency can be described at four proficiency levels. The tasks in the assessment tools of the present study were designed to evaluate students’ performance in complex problem solving with respect to each of the four competencies in the domain of chemistry.
Research Questions
Problem solving abilities are regarded as key competencies (e.g., OECD, 2004) and are related to inquiry-based approaches to science education in classrooms (e.g., Rocard Report, 2007).
Current discussions on complex problem solving (CPS) and scientific inquiry competencies have shifted from investigating differences between experts and novices to analyzing competence development across grades (e.g., AAAS, 1993). So-called “learning progressions” in science are considered tools that can lead to the “development of more focused standards, better designed curricula, better assessments, and ultimately more effective instruction and improved student learning of science” (Corcoran, Mosher & Rogat, 2009, p. 17). Because of this potential, there have been several calls for the development of competence models that can describe students’ ability pathways across grades (e.g., Koeppen, Hartig, Klieme & Leutner, 2008).
The project therefore addresses the question of how to model and assess complex problem solving competence in the domain of chemistry across different grades, and how this competence develops.
Objectives
As mentioned above, the development of learning progressions focused on aspects of scientific literacy, such as domain-specific problem solving abilities, has to take into account both the methods and results of empirical research in science education and the contextualized character of competencies. The aim of the project is to develop theory-driven, computer-based assessment tools that can be used to assess students’ complex problem solving competencies across different grades. These tools contain practically useful tasks referring to the German National Chemistry Curriculum. Furthermore, a vertical scale will be established to enable direct comparisons of students’ competencies, differentiated into the four subdimensions of CPS.
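The abstract does not specify how the vertical scale will be constructed. One common approach for placing item parameters from adjacent grades onto a single scale in a common-item design is mean-sigma linking (cf. Kolen & Brennan, 2004). The following is a minimal sketch of that general technique, using hypothetical item difficulties rather than project data:

```python
import statistics

def mean_sigma_link(common_base, common_new):
    """Compute linear linking constants (A, B) from the difficulties of
    anchor items estimated separately on a base scale and a new scale.
    A difficulty b on the new scale maps to A * b + B on the base scale."""
    a = statistics.pstdev(common_base) / statistics.pstdev(common_new)
    b = statistics.mean(common_base) - a * statistics.mean(common_new)
    return a, b

# Hypothetical difficulties of four anchor items, estimated separately
# in two adjacent grades (illustrative values only, not project data).
grade_lower = [-0.8, -0.2, 0.5, 1.1]   # base (reference) scale
grade_upper = [-1.6, -1.0, -0.3, 0.4]  # to be placed on the base scale

A, B = mean_sigma_link(grade_lower, grade_upper)
rescaled = [A * b + B for b in grade_upper]
```

By construction, the rescaled anchor difficulties reproduce the mean and standard deviation of the base-scale anchors, which is what allows proficiency estimates from different grades to be compared on one vertical scale.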
Method
Expected Outcomes
References
AAAS (1993). Benchmarks for science literacy. Project 2061. New York: Oxford University Press.
Corcoran, T., Mosher, F. A. & Rogat, A. (2009). Learning progressions in science. CPRE Research Report RR-63. Philadelphia, PA: Consortium for Policy Research in Education. Retrieved from http://www.cpre.org/images/stories/cpre_pdfs/lp_science_rr63.pdf [8/1/2011].
Funke, J. & Frensch, P. A. (2007). Complex problem solving: The European perspective – 10 years after. In D. H. Jonassen (Ed.), Learning to solve complex scientific problems (pp. 25-47). New York/London: Lawrence Erlbaum Associates.
Koeppen, K., Hartig, J., Klieme, E. & Leutner, D. (2008). Current issues in competence modeling and assessment. Journal of Psychology, 216(2), 61-73.
Kolen, M. J. & Brennan, R. L. (2004). Test equating, scaling, and linking. New York: Springer Science+Business Media.
OECD (2004). Problem solving for tomorrow’s world – First measures of cross-curricular skills from PISA 2003. Paris: OECD Publications.
Rocard Report (2007). Science education now: A renewed pedagogy for the future of Europe. Retrieved from http://ec.europa.eu/research/science-society/document_library/pdf_06/report-rocard-on-science-education_en.pdf [2/1/2011].
Tiemann, R., Koppelt, J. & Scherer, R. (2010, August). Computer based assessment of complex problem solving in chemistry. Paper presented at the European Conference on Educational Research (ECER), Helsinki, Finland.
Van der Linden, W. (2005). A comparison of item-selection methods for adaptive tests with content constraints. Journal of Educational Measurement, 42, 283-302.
Wirth, J. (2004). Selbstregulation von Lernprozessen [Self-regulation in learning processes]. Münster: Waxmann.