Session Information
22 SES 02 C, Academic Work and Professional Development
Paper Session
Contribution
Research performance is generally seen as an important factor for economic development and societal benefit, and considerable funds are therefore spent on university-based research. These increasing expenditures have, however, heightened concerns about quality and excellence in research, and about transparency, accountability, comparability and competition (see European Commission 2010, p. 9). Stakeholders have consequently become increasingly interested in performance indicators, assessments and rankings. Assessments of faculties, departments or whole universities are of interest not only for improving research performance and research quality, but also for strategic planning and international benchmarking. Such assessments have thus become widespread both for internal institutional use and for national and global comparisons. Moreover, the emergence of national and international initiatives for developing adequate measures and methods (e.g. the EERQI project) indicates the strong interest in, and the importance of, such evaluations.
To date, bibliometric indicators are an important element in the assessment of academic units and individuals, and they seem likely to remain so in the future. The popularity of these bibliometric measures rests on their supposedly objective evaluation methods and on the highly condensed information they provide. However, some experts question their validity on methodological grounds, especially when it comes to assessments of the social sciences and humanities (see Nederhof 2006, Neuhaus 2010). Findings suggest that publication practices, and thus the coverage of the literature in the most popular databases – notably Thomson Reuters’ Web of Science – vary considerably across (sub)fields, regions and cohorts (differences in publication type, publication source, publication language, citing habits, etc.). This is likely to result in biases.
The broad use of bibliometric measures despite these acknowledged methodological shortcomings makes profound knowledge of bibliometric methods and patterns all the more important. Such knowledge enables researchers to reflect on and evaluate bibliometric monitoring and to play a part in establishing new, appropriate measures. Yet only a few studies exist on bibliometric measures of research performance in the educational sciences, so evidence about bibliometric characteristics and relationships is very limited. With our paper, we want to contribute to filling this gap.
In our study, we pursue the following questions: How is research performance distributed among researchers in the educational sciences? How are the different output measures related to each other? Which driving factors explain the variation in research performance? How do the observed patterns vary between different indicators (quantitative vs. qualitative measures) and different databases?
The objectives of the paper are twofold: first, we address methodological demands by comparing different measures of research output and different data sources; second, we identify factors that explain differences in research performance. Our results thus contribute to a better understanding and a critical discussion of existing measures of research performance in the educational sciences.
Method
Expected Outcomes
References
Aaltojärvi, Inari, Ilkka Arminen, Otto Auranen and Hanna-Mari Pasanen (2008). Scientific Productivity, Web Visibility and Citation Patterns in Sixteen Nordic Sociology Departments. Acta Sociologica, 51 (1): 5-22.
Botte, Alexander (2007). Scientometric Approaches to Better Visibility of European Educational Research Publications: A State-of-the-Art Report. European Educational Research Journal, 6 (3): 303-311.
Carayol, Nicolas and Mireille Matt (2006). Individual and Collective Determinants of Academic Scientists’ Productivity. Information Economics and Policy, 18 (1): 55-72.
Dees, Werner (2008). Innovative Scientometric Methods for a Continuous Monitoring of Research Activities in Educational Science. In: Fourth International Conference on Webometrics, Informetrics and Scientometrics & Ninth COLLNET Meeting, Berlin, 28 July - 1 August 2008, pp. 1-10.
European Commission (2010). Assessing Europe’s University-Based Research. Expert Group on Assessment of University-Based Research. RTD.C4. EUR 24187 EN.
Hornbostel, Stefan and Edwin Keiner (2002). Evaluation der Erziehungswissenschaft. Zeitschrift für Erziehungswissenschaft, 5 (4): 634-653.
Levin, Sharon G. and Paula E. Stephan (1991). Research Productivity Over the Life Cycle: Evidence for Academic Scientists. American Economic Review, 81 (1): 114-132.
Moed, Henk F. (2005). Citation Analysis in Research Evaluation. Dordrecht: Springer.
Nederhof, Anton J. (2006). Bibliometric Monitoring of Research Performance in the Social Sciences and the Humanities: A Review. Scientometrics, 66 (1): 81-100.
Neuhaus, Christoph (2010). Vergleichende Analysen von Forschungsleistungen: Forschungsgruppen im Spiegel bibliometrischer Indikatoren. Baden-Baden: Nomos.
Puuska, Hanna-Mari (2010). Effects of Scholar’s Gender and Professional Position on Publishing Productivity in Different Publication Types. Analysis of a Finnish University. Scientometrics, 82 (2): 419-437.
Rauber, Michael and Heinrich W. Ursprung (2008). Life Cycle and Cohort Productivity in Economic Research: The Case of Germany. German Economic Review, 9 (4): 431-456.
Van Aalst, Jan (2010). Using Google Scholar to Estimate the Impact of Journal Articles in Education. Educational Researcher, 39 (5): 387-400.