I just came across this beautiful website on which Northeastern University’s Barabási Lab created an interactive visualization of Roberta Sinatra and colleagues’ paper “Quantifying the Evolution of Individual Scientific Impact”. The paper argues that scientists’ most impactful publications (as measured by citations) could occur at any point in their career. The authors base their argument on a large-scale bibliographic dataset containing publications of more than 10,000 scientists in different disciplines.
The visualization looks like a life-line for an individual scientist’s work (picture on the left) or an ocean with wave peaks and valleys for whole disciplines (picture on the right). It is striking how the overall disciplinary patterns look very similar – at least when they are corrected for absolute citation counts. The authors found that impact is randomly distributed within a scientist’s career. So, if you haven’t published an impactful paper yet, don’t give up – it might just be the next one.
Scientometrics published our article Editorial governance and journal impact: a study of management and business journals. It examines how characteristics of editors, in particular the diversity of editorial teams, are related to journal impact. Our sample comprises 2,244 editors who were affiliated with 645 volumes of 138 business and management journals. Results show that multiple editorships and editors’ affiliation to institutions of high reputation are positively related to journal impact, while the length of editors’ terms is negatively associated with impact scores. Surprisingly, we find that diversity of editorial teams in terms of gender and nationality is largely unrelated to journal impact. Our study extends the sparse literature on editorial teams and their relevance to journal impact by integrating different strands of literature and studying several demographic factors simultaneously.
As reported in my last post, we’re currently analyzing keywords of articles in organization and management journals. I used Wordle to provide a first visual summary for our blog. The “top 500” wordcloud featured hardly any methods. The new wordcloud below is exclusively based on keywords with methodological classifications and shows their “top 250” terms. It seems stochastic models and regression analyses are the most frequently indexed methods.
Our current bibliometric study includes over 85,000 articles from approx. 170 management journals published between 2000 and 2013. The wordcloud below shows the top 500 keywords used in these articles. In the end, it looks like studying organization and management is all about performance. Related concepts, e.g. firm performance or competitive advantage, are also among the top 500. Another important subject area includes keywords such as knowledge, information, technology, and innovation. Behavioral constructs, e.g. trust, satisfaction, or commitment, are also among the most visible attributes of articles, although with less prominence. Methods are hardly indexed in the top 500 wordcloud.
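For illustration, here is a minimal sketch of how such a “top N” keyword list can be tallied before handing it to a wordcloud tool. The keyword list below is made up for the example, not taken from our database:

```python
from collections import Counter

# Hypothetical flat list of article keywords, as exported from a database.
keywords = [
    "performance", "firm performance", "competitive advantage",
    "performance", "knowledge", "innovation", "trust",
    "performance", "knowledge", "satisfaction",
]

# Count keyword frequencies and keep the most common terms;
# a tool like Wordle then scales each word by its count.
top = Counter(keywords).most_common(3)
print(top)
```

The same counting step scales to the full database; only the top N terms are passed on for visualization.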
The body of available management literature has grown considerably during the past years. A search in Thomson Reuters’ ISI Web of Knowledge shows 19,143 articles published in the journals listed in the Journal Citation Reports for Business and Management in 2013. That means 52.4 new articles were published per day – or one every 27.5 minutes. These numbers have almost doubled since 2003, when “only” 30.9 articles were published per day. And they are more than five times as many as in 1993 (9.7 articles per day).
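The per-day and per-minute figures follow directly from the annual count; a quick sketch of the arithmetic:

```python
# Back-of-the-envelope check of the publication rates quoted above.
articles_2013 = 19143

per_day = articles_2013 / 365          # articles per day in 2013
minutes_between = 24 * 60 / per_day    # minutes between two new articles

print(round(per_day, 1), round(minutes_between, 1))  # 52.4 and 27.5
```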
But how may we interpret these numbers? Is this increase a sign of scientific advancement, as most rankings and performance-management systems assume? Or does it indicate a dysfunctional “inflation” of publications without further enlarging the scientific knowledge base?
Either way, the vast majority of publications are rarely recognized by the scientific community (i.e. cited) at all. Of course, it’s not possible for scholars to keep up with the development of management science by reading all published articles. But considering this trend, we should ask ourselves whether there are more effective ways to communicate our scholarly results and to contribute to the development of new knowledge.
Let’s take a closer look at the scholars behind the publications. For that purpose, I’ve inverted the network from the last post. Now, vertices represent authors and a tie indicates a journal in which both authors have published. The ties are stronger if both authors wrote several articles for the same journals. The size of a node represents the number of articles in our database from that particular author.
Again, we can identify our four clusters:

1) Upper left: sociological studies.

2) Upper right: organization and management studies.

3) Lower left: research on higher education management.

4) Lower right: studies on technology transfer and science communication.
Now, a high betweenness centrality of an author indicates boundary-spanning research by contributing to journals in different discourses. The authors with the highest betweenness are: 1 Cynthia Hardy (1350), 2 Loet Leydesdorff (723), 3 Åse Gornitzka (643), 4 Georg Krücken (627) and 5 Karl Weick (610). If an author has a high degree centrality (i.e. the node has many direct ties), he or she is well connected in terms of journal diversity. In contrast to the betweenness measure, diversity in this case most likely refers to journal diversity within a certain discourse. The top five authors according to degree centrality are: 1 Barbara Sporn (27), 2 Royston Greenwood (26), 3 Georg Krücken (24), 4 Dennis Gioia (22), 5 Christopher Hinings (22).
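For readers who want to compute these two measures themselves, here is a minimal sketch using the networkx library on a toy co-publication network – the data is purely illustrative, not the actual author network above:

```python
import networkx as nx

# Toy network: two dense author clusters bridged by a single author G.
G = nx.Graph()
G.add_edges_from([
    ("A", "B"), ("B", "C"), ("A", "C"),   # cluster 1
    ("D", "E"), ("E", "F"), ("D", "F"),   # cluster 2
    ("C", "G"), ("G", "D"),               # G bridges the two clusters
])

betweenness = nx.betweenness_centrality(G)   # boundary spanning
degree = dict(G.degree())                    # number of direct ties

# The bridge author tops the betweenness ranking despite having few ties.
top_between = max(betweenness, key=betweenness.get)
print(top_between, degree[top_between])  # G 2
```

This illustrates why the two rankings can diverge: an author with only two ties can still score highest on betweenness if those ties bridge otherwise disconnected discourses.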
Of course, the results shown are limited in terms of generalizability. The measures reflect only a part of the authors’ works and should not be interpreted as an indicator of performance. Besides, the network is only based on 68.6% (more like 40%, since I’ve removed quite some isolates) of all the publications that happened to find their way into our database, which, obviously, is pre-selected by subjective preferences for specific theories, contexts, and methods. However, the networks provide useful information about the foundations of our work at a glance. I wonder what the actual collaboration network looks like…
Over the past three years, we have gathered and reviewed literature related to higher education governance and the organization of knowledge work. Now, it is time to open the black box and take a look into a good part of the research that our work is based on. In the RePort project, we’ve used Mendeley as a collaborative tool, which helped us to consolidate publications from the Leuphana and Hamburg teams.
Almost 1,400 authors contributed to 930 studies (avg. 1.5 authors per publication). The most common publication form is the journal article (68.6%), followed by chapters in edited volumes (14.5%). Monographs (7.2%), working papers (4.8%), project reports (1.8%), dissertations (1.7%), and conference proceedings (1.4%) are of less importance.
The figure displays a network of related journals in our database. The journals are shown as nodes (the size depends on the number of articles in the respective journal). The authors are displayed as ties between the nodes. Two journals are connected if one author contributed an article to both journals. Tie strength increases with the number of authors who have published in both journals. The resulting network displays denser clusters of strongly interrelated journals and structural holes with no author connecting the journals. We identify four clusters (three large and one small) in the network. They can be interpreted as outlets for four distinct groups of scholars.
1) Upper right: studies on technology transfer and science communication.
2) Upper left: research on higher education management.
3) Lower right: organization and management studies.
4) Lower left: sociological studies.
The most important journals in terms of betweenness centrality (i.e. the number of shortest paths from all nodes to all others that pass through that node) are: Academy of Management Review (401), Higher Education (327), Research Policy (227). These journals attract scholars from different discourses.
Note that the network only covers a small part of the database, since it only contains journal publications. Besides, a threshold of 2 authors was applied: journals with only one author contributing an article to another journal in the database, as well as journals without any connecting author (“isolates”), were eliminated for better visualization.
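As a sketch of the underlying procedure, here is how such a journal network can be projected from author–journal records using only the Python standard library. The records and journal names are made up for illustration:

```python
from collections import Counter
from itertools import combinations

# Hypothetical author -> journals records (not our actual database).
records = {
    "author1": ["AMR", "Higher Education"],
    "author2": ["AMR", "Higher Education"],
    "author3": ["AMR", "Research Policy"],
    "author4": ["Minerva"],                # publishes in one journal only
}

# Tie strength = number of authors who published in both journals.
edges = Counter()
for journals in records.values():
    for j1, j2 in combinations(sorted(set(journals)), 2):
        edges[(j1, j2)] += 1

# Apply the threshold: keep only ties carried by at least 2 authors;
# journals left without any remaining tie ("isolates") drop out.
network = {pair: w for pair, w in edges.items() if w >= 2}
print(network)  # {('AMR', 'Higher Education'): 2}
```

The same projection, run over the full database and fed into a layout algorithm, produces the clustered figure described above.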