Monthly Archives: November 2016

Paper published: Open post-publication-peer-review

Together with other selected contributions from the last workshop on higher education management, the German journal “Hochschulmanagement” (higher education management) published our paper “Open post-publication-peer-review: an alternative to double-blind reviews in academic journals?”.

The study contributes to the discussion about alternative forms of scientific communication by evaluating the actual dissemination as well as the potential use of open post-publication peer review (OPR). The study is based on survey data from a sample of 2,800 authors of academic papers. Results show that only one third of respondents believe that OPR is useful for enhancing the operative reliability of review processes. The advantages of OPR discussed in the literature are only relevant for authors’ general willingness to publish with OPR in principle. However, when it comes to actual publication decisions (open vs. blind peer review), these potential advantages are only of minor importance for the selection of an appropriate journal (with the exception of heterodox research, which indeed seems to benefit from OPR). Instead, the choice between the different channels of scientific communication is based on institutionalized aspects (legitimacy, quality, design of the systems) and behavioral considerations (expected negative group dynamics and the increased workload of OPR). Within the limitations of our dataset, we conclude that the current potential of OPR to solve the problems of traditional double-blind processes is limited.

Bögner, I. & Hattke, F. (2016): Open Post-Publication-Peer-Review: Eine Alternative zur doppelt-blinden Begutachtung in Fachzeitschriften? In: HM – Hochschulmanagement 11(3), 69-74.

Call for Papers „Leistungsbewertung in der Wissenschaft – Perspektiven aus Forschung, Praxis und Politik“

On January 25–26, 2017, the conference „Leistungsbewertung in der Wissenschaft – Perspektiven aus Forschung, Praxis und Politik“ (performance evaluation in science – perspectives from research, practice, and politics) will take place at the Vorhoelzer Forum of TU Munich.

Participants of the Federal Ministry of Education and Research funding line „Leistungsbewertung in der Wissenschaft“ will present results from their research projects. Scholars who are not part of the funding line can also submit presentations on topics such as performance indicators or digitalization and performance measurement. Keynotes will be held by Prof. Dr. Dr. h.c. mult. Bruno Frey, Prof. Dr. Dr. h.c. Margit Osterloh, Prof. Dr. Stefan Kühl, Dr. Ulrich Schreiterer, and Prof. Dr. Birgitta Wolff.

Abstracts (max. 1000 words) can be submitted until December 9th to patrick.oehler@tum.de. For further information, take a look at the call for papers.

CfP: WK Higher Education Management @ HSU

We’re co-hosting the upcoming conference of the Scientific Commission Higher Education Management (Wissenschaftliche Kommission Hochschulmanagement im VHB) on February 21–22, 2017, at Helmut Schmidt University (HSU) in Hamburg.

Abstracts (max. 1000 words excluding references) of articles dealing with the management of higher education institutions can be submitted until December 23rd, 2016. The main topic is “Third-party funding in higher education”, but papers in other areas of interest may also be submitted, among them:

  • New forms of governance of universities
  • Measurability of research performance
  • Open access, social media, and citizen science
  • Peer evaluation, performance indicators, and rankings
  • Autonomy of science

For further information see the call for papers (only in German).

Publication bias in the social sciences

I know, publication bias is not a new topic, but it is still highly relevant. I found some very interesting results in a study by Annie Franco, Neil Malhotra, and Gabor Simonovits published in Science (19 Sep 2014, Vol. 345, Issue 6203, pp. 1502-1505): Publication bias in the social sciences: Unlocking the file drawer. According to the authors, “only 10 out of 48 null results were published, whereas 56 out of 91 studies with strongly significant results made it into a journal.” The following figure summarizes the results:

[Figure: null-results – publication outcomes by strength of results]
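To put the quoted counts in perspective, here is a minimal sketch (plain Python, variable names my own) that turns the figures reported by Franco, Malhotra, and Simonovits into publication rates:

```python
# Illustrative arithmetic only: recomputes publication rates from the counts
# quoted above (10 of 48 null-result studies published vs. 56 of 91 studies
# with strongly significant results). Variable names are my own.

null_published, null_total = 10, 48        # studies with null results
strong_published, strong_total = 56, 91    # studies with strongly significant results

null_rate = null_published / null_total        # ~0.21
strong_rate = strong_published / strong_total  # ~0.62

print(f"Null results published:   {null_rate:.0%}")    # ~21%
print(f"Strong results published: {strong_rate:.0%}")  # ~62%
```

In other words, under these reported counts, studies with strongly significant results were roughly three times as likely to be published as studies with null results.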

The pattern is quite remarkable. The majority of evidence that does not support any hypothesized relationship is not even written up in the first place. So there is reason to doubt that dedicated platforms or journals that publish papers with contrary findings – as is regularly proposed for overcoming publication bias – will significantly increase the number of null results published.