Our study on scholars’ attitudes towards academic journals’ peer-review has been published by Managementforschung (MF). Here’s the abstract: Peer review in academic publishing relies on the voluntary engagement of scholars who are, at best, committed to that practice. Current debates on peer review suggest that this commitment is diminishing. Conceptualizing peer review as an instance of social exchange allows us to propose a conceptual model of commitment to peer review and test it by means of a structural equation analysis. Our empirical study is based on survey data from the social sciences (n = 359). Results show that authors are more committed to the practice of peer review if reviewers base their recommendations on rational arguments so that authors can trust them for their competence. By contrast, benevolent reviewers who try to collaborate with authors are not effective in fostering trust and commitment. Within the limitations of our data and with regard to reviewers’ behaviors and characteristics, we cannot support sweeping criticisms of the operational reliability of academic journals.
Access the article here: Hattke, F., Bögner, I., & Vogel, R. (in press): (Why) Do You Trust Your Reviewers? Influence Behaviors, Trustworthiness, and Commitment to Peer Review. Managementforschung (MF), 1-26.
The results of an international web-based survey of journal editors in four disciplines were published in Research Evaluation. The article is titled “How innovative are editors?: evidence across journals and disciplines”. Journal editors play a crucial role in the scientific publication system, as they make the final decision on acceptance or rejection of manuscripts. Some critics, however, suspect that the more innovative a manuscript is, the less likely it is to be accepted for publication. Top-tier journals in particular are accused of rejecting innovative research. As evidence is only anecdotal, this article empirically examines the demand side for innovative research manuscripts. I assess journal editors’ innovativeness, i.e. their general predisposition toward innovative research manuscripts. As antecedents of innovativeness, personal and contextual factors are taken into account. I differentiate the concept of innovativeness in research by distinguishing three dimensions: innovativeness in terms of research problems, theoretical approaches, and methodological approaches. Drawing on an international web-based survey, this study is based on the responses of 866 journal editors. The article sheds light on editors’ inclination toward accepting different forms of innovative research for publication. Overall, findings indicate that individual characteristics, such as editorial risk-taking or long-term orientation, are more decisive for innovativeness than journal-related characteristics. However, editors of older journals turn out to be less open toward new research problems, and there is a U-shaped relationship between a journal’s rating score and editors’ willingness to adopt new theoretical approaches. Most surprisingly, editors’ consensus orientation regarding reviewer recommendations is positively associated with methodological innovativeness.
Access the study here: Petersen, J. (2017). How innovative are editors?: evidence across journals and disciplines. Research Evaluation, 26(3), 256-268.
Peer review is the central mechanism for verifying the quality of scientific manuscripts. According to the internet platform SciRev, peer review processes are often lengthy, which delays the distribution of valuable, novel knowledge within the scientific community. To streamline this phase of scientific knowledge dissemination, SciRev aims to increase the transparency of review processes across journals. To this end, researchers are invited to evaluate their review experience with a journal based on various characteristics, such as the duration of review rounds, the time until rejection, and overall satisfaction with the review process. The information provided is aggregated into scores, which feed into a comprehensive database, making journals comparable.
Ultimately, researchers can search for journals with an efficient peer review procedure and benefit from timely publication while journal editors have the opportunity to compare their journal’s performance with that of others.
Check out the website to contribute to the database or benefit from your peers’ journal review experiences.
Together with other selected contributions from the last workshop on higher education management, the German journal “Hochschulmanagement” (higher education management) has published our paper “Open post-publication peer review: an alternative to double-blind reviews in academic journals?”.
The study contributes to the discussion about alternative forms of scientific communication by evaluating the actual dissemination as well as the potential use of open post-publication peer review (OPR). The study is based on survey data from a sample of 2,800 authors of academic papers. Results show that only one third of respondents believe that OPR is useful for enhancing the operational reliability of review processes. The advantages of OPR discussed in the literature are only relevant to authors’ general willingness to publish with OPR in principle. However, when it comes to actual publication decisions (open vs. blind peer review), these potential advantages are only of minor importance for the selection of an appropriate journal (with the exception of heterodox research, which indeed seems to benefit from OPR). Instead, the choice between the different channels of scientific communication is based on institutionalized aspects (legitimacy, quality, design of the systems) and behavioral considerations (expected negative group dynamics and the increased workload of OPR). Within the limitations of our dataset, we conclude that the current potential of OPR to solve the problems of traditional double-blind processes is limited.
Bögner, I. & Hattke, F. (2016): Open Post-Publication-Peer-Review: Eine Alternative zur doppelt-blinden Begutachtung in Fachzeitschriften? In: HM – Hochschulmanagement 11(3), 69-74.
A colleague just sent me a link to a hilarious weblog that collects the best comments from reviewers. You might want to remember these quotes when you read your next reviews…
Reviewers do an important but often underappreciated job in the current publication system. To recognize their invisible and usually unpaid efforts, journals often gratefully publish the names of their reviewers at the end of the year. From the perspective of reviewers, these “honorable mentions” are nice, but still somewhat scattered and disconnected.
The online network “publons” offers a more personalized approach to earning credit for reviewing. Once you have created an account, you can forward the “thank-you-for-reviewing” e-mails that are automatically generated by submission systems after completion of the process. The information is verified and added to your profile, thus recording your personal history of peer reviewing. This can be helpful for documenting your services to the community, e.g. in application processes.
On October 8–9, 2015, the autumn conference of the DGS section Wissenschafts- und Technikforschung (science and technology studies) took place in cooperation with the Wissenschaftszentrum Berlin für Sozialforschung (WZB). The conference theme was “Einheit trotz Vielfalt? Die Diversität der Wissenschaft als Herausforderung für die Forschung” (Unity despite diversity? The diversity of science as a challenge for research). The aim of the conference was to contrast different research strategies for methodologically demanding cross-disciplinary comparison and to reflect on the empirical findings with regard to sociological theory building and research-policy practice.
Dr. Fabian Hattke and I had the opportunity to present our current research topic, “Double Blind Peer Review vs. Open Post Publication Review – Publikationspraktiken im Fächervergleich” (publication practices across disciplines), and to discuss it with scholars from a variety of communities.
Alongside many fascinating talks, there was a panel discussion featuring, among others, representatives of the Wissenschaftsrat (German Council of Science and Humanities) and the Volkswagen Stiftung. During the discussion, it became apparent time and again that diversity does not only arise at a higher level between different disciplines, but already exists within one’s own disciplinary boundaries. Dealing with this complexity also poses a particular challenge for the evaluation of research results, one that should not be underestimated. We would like to build on this and other thoughts in our next research steps.
One of the most frequent criticisms of the peer review system concerns a bias towards positive (i.e., confirmatory) results, while negative or null findings are less likely to find approval by reviewers (although they may address no less relevant research questions and may result from equally rigorous methods).
The Journal of Business Psychology (JBP) has now announced that it will roll out an additional manuscript submission path, which the editors hope will address this publication bias. In this alternative path, called the hybrid registered reports submission path, authors submit the introduction, information on methods and measurement, as well as a plan of analysis, but no results or discussion. Research is then evaluated on the merits, rigor, and quality of the project rather than on what was actually found.
A particularly interesting way to detect invisible colleges in scientific fields is to study acknowledgements in research publications. Authors usually thank colleagues who spent time and effort providing a “friendly review” of earlier drafts of the manuscript, or who gave other forms of support. When these efforts are reciprocated, networks of professional ties among scholars rise to the surface.
Sometimes acknowledgements reveal more than just professional networks. A Canadian paleontologist has recently published a paper in which he gives credit to the support from his colleague Lorma, who holds a PhD from the same university. So far, so good, but the acknowledgements continue surprisingly: “Lorma, will you marry me?” See the full CBC news report.
The Washington Post reports on a new way of cheating in academic publishing: Dubious agencies offer the service of faking peer reviews during the submission process of academic journals. A scandal revealed in the biomedical sciences suggests that these agencies fabricate contact details for reviewers and then submit favorable reviews from these addresses. Some of these accounts have the names of seemingly real researchers but with fraudulent e-mail addresses, others are completely fictitious. The blog Retraction Watch has counted a total of 170 retractions in the past few years because of fake peer reviews.
Although “fakeries” of this kind may be a threat to the integrity and reputation of academic publishing, the peer review system still enjoys high levels of commitment among researchers across all disciplines. This is a preliminary result of our survey on peer review which we conducted recently – more information soon on this blog.
Peer reviewing is a widespread procedure for evaluating the quality of scholarly manuscripts before publication. As such, it is at the heart of academia. Yet, little is known about how scholars perceive the peer-review process.
Today, we initiated our survey into the peer-review system from the perspective of those who submit papers to academic journals. The survey asks questions about general attitudes towards your job, your personal experiences with peer reviews, and possible alternatives to the common pre-publication peer review process. We’ll keep you posted on the results!