Data-Driven School Improvement: Great Expectations – Little Impact?
Conference:
ECER 2012
Format:
Paper

Session Information

09 SES 02 B, School Monitoring and School Improvement in Different National Settings

Parallel Paper Session

Time:
2012-09-18
15:15-16:45
Room:
FCT - Aula 15
Contribution

Current changes in the governance of the German school sector aim to foster the effectiveness and efficiency of organizational and instructional practice in schools. At the heart of the ongoing quest for school improvement are efforts to gather standardized and reliable facts concerning the process and outcome dimensions of individual and organizational quality. Scientific observers speak of evidence-based policy (cf. Davies 1999) at the political level or data-driven school improvement (cf. Honig & Coburn 2008) at the school/organizational level. There can be no doubt that educational policy and administration in Germany, as in many other countries, place high hopes on an evidence-based approach to organizational and individual development in schools. However, the success of data-driven school improvement crucially depends on two aspects.

First, requirements regarding information quality include, for example, the credibility of the data source, the timeliness of the feedback, adequate complexity and fair comparisons (cf. Visscher 2001; Latham 2007). Educational scientists and administrators have made remarkable progress with regard to these prerequisites of data-driven school improvement.

Second, data-driven school improvement cannot simply be mandated by authorities. Its realization depends on the acceptance, reception and actual use of the various data sources in schools. Theoretical concepts of information utilization distinguish between several steps and forms of data use (cf. Helmke & Hosenfeld 2005; Johnson 1998). Taking this into account, the project at hand offers a comprehensive and methodologically advanced view of the extent to which different types of information sources are received in schools and of how teachers and head teachers evaluate their actual use. A broad range of thirteen potentially helpful information sources is reviewed. These sources encompass different types of standardized tests and assessments, school inspections, and scientific information sources, as well as peer observation of teaching and students' feedback.

The central research questions put forward in our paper are:

1.)    To what extent are different information sources received by teachers and head teachers? Is it possible to identify certain patterns of reception?

2.)    How do school professionals evaluate the use of different information sources for the development of their actual work? How strong is the relation between reception and perceived use? Are there “disappointing” or “surprisingly valuable” information sources?

3.)    Which organizational and individual factors explain the extent of reception and the perceived use of different information sources?

A review of the interdisciplinary state of research (cf. van Ackeren et al. 2011) shows that potential organizational influences on data-driven school improvement are to be found in school culture and school climate, for example the climate for initiative (Baer & Frese 2003) or the climate of psychological safety (Edmondson 1999), as well as in structural organizational contexts such as size, school type and local competition for students. Independent variables to be tested on the individual level include relevant attitudes (for example towards the objective measurability of educational achievement), job satisfaction and perceived professional autonomy. Furthermore, biographical characteristics such as age, job experience, functions and personal networks are taken into account.

Method

Empirical data were collected by means of a standardized survey among teachers and school leadership teams in the German state (Land) of Rhineland-Palatinate. The responding sample comprises 1,281 teachers and 290 head teachers (or their deputies) from 114 schools. The mean response rate within the schools lies above 65% for members of school leadership teams and at about 45% among teachers. Large parts of the questionnaire were designed using broadly validated item sets, ensuring a high level of validity and comparability with the multidisciplinary discourse. Self-developed item sets were tested in a two-stage pretest. Finally, the survey data are complemented by official school statistics on structural school variables such as school type, the number of students and teachers, and the proportion of immigrant students. The collaborative research project is funded by the German Federal Ministry of Education and Research within the priority program "Research on Educational Governance (SteBis)". A partner project conducts in-depth case studies in eight schools, selected by means of theoretical sampling (Strübing 2009), which allows a cross-validation of the survey findings.

Expected Outcomes

Data collection took place in two survey periods (March to June 2011 and August to December 2011). First results show that major information sources of data-driven school improvement, such as school inspections and standardized tests, are rated as being of little use by most school practitioners. At the same time, other sources such as peer observation of teaching and students' feedback appear to offer valued insights. Reception of the information sources under review and the evaluation of their usefulness vary considerably among individual teachers. Furthermore, significant differences in the dependent variables among schools indicate the relevance of organizational context factors. All major item sets show satisfactory reliabilities (Cronbach's alpha ≥ 0.8). By the time of ECER 2012 it will be possible to present a detailed analysis of the (non-)impact of data-driven school improvement, as well as a fully developed analysis of the causal influence of organizational and individual characteristics using regression models and multilevel analysis.
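The reliability criterion reported above (Cronbach's alpha ≥ 0.8) follows the standard formula α = k/(k−1) · (1 − Σ item variances / variance of the sum score). As a minimal illustration of how this statistic is computed, here is a sketch with entirely hypothetical Likert-scale responses (the scores are invented for illustration, not data from the survey):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                          # number of items in the scale
    item_vars = items.var(axis=0, ddof=1)       # sample variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the respondents' sum scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical 5-point Likert responses: 4 respondents x 3 items
scores = np.array([
    [4, 5, 4],
    [2, 3, 2],
    [5, 5, 4],
    [3, 3, 3],
])
alpha = cronbach_alpha(scores)
```

Values above 0.8 indicate that the items of a scale covary strongly enough to be treated as measuring a common construct.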

References

Ackeren, I. van, Zlatkin-Troitschanskaia, O., Binnewies, C., Clausen, M., Dormann, C., Preisendörfer, P., Rosenbusch, C. & Schmidt, U. (2011): Evidenzbasierte Schulentwicklung. Ein Forschungsüberblick aus interdisziplinärer Perspektive. DDS – Die Deutsche Schule, 103 (2), 170-184.
Baer, M. & Frese, M. (2003): Innovation is not enough: Climates for initiative and psychological safety, process innovations, and firm performance. Journal of Organizational Behavior, 24, 45-68.
Davies, P. T. (1999): What is Evidence-Based Education? British Journal of Educational Studies, 47 (2), 108-121.
Edmondson, A. (1999): Psychological safety and learning behavior in work teams. Administrative Science Quarterly, 44, 350-383.
Helmke, A. & Hosenfeld, I. (2005): Standardbezogene Unterrichtsevaluation [Standard-based teaching evaluation]. In: G. Brägger, B. Bucher & N. Landwehr (Eds.), Schlüsselfragen zur externen Schulevaluation (pp. 127-151). Bern: h.e.p.
Honig, M. E. & Coburn, C. (2008): Evidence-based decision making in school district central offices: Toward a policy and research agenda. Educational Policy, 22 (4), 578-608.
Johnson, R. B. (1998): Toward a theoretical model of evaluation utilization. Evaluation and Program Planning, 21 (1), 93-110.
Latham, G. (2007): A speculative perspective on the transfer of behavioral science findings to the workplace: "The times they are a-changin'". Academy of Management Journal, 50, 1027-1032.
Strübing, J. (2009): Grounded Theory. Zur sozialtheoretischen und epistemologischen Fundierung des Verfahrens der empirisch begründeten Theoriebildung (2nd ed.). Wiesbaden: VS.
Visscher, A. J. (2001): Publieke schoolprestatie-indicatoren: de problemen op een rij. In: A. B. Dijkstra, S. Karsten, R. Veenstra & A. J. Visscher (Eds.), Het oog der natie: scholen op rapport. Standaarden voor de publicatie van schoolprestaties (pp. 54-61). Assen: Van Gorcum.

Author Information

Christoph Rosenbusch (presenting / submitting)
Johannes Gutenberg University, Center for Quality Assurance and Development, Mainz, Germany

Denise Demski (presenting)
University of Duisburg-Essen, Essen, Germany
