If you want to know more about different research designs, with references to seminal works, have a look at this YouTube page from Graham R. Gibbs (University of Huddersfield, UK). He provides pretty good lectures on a variety of important methodological topics, such as research quality (e.g., validity, credibility, etc.) and ethics, as well as detailed yet easy-to-follow introductions to specific research techniques (e.g., grounded theory and coding approaches, surveys and questionnaire design, case study research and interviewing techniques, ethnography, experiments and quasi-experiments).
Category archive: Research ethics
Trouble in the lab
An even more advanced and detailed account in The Economist of insufficient research rigor, the lack of reporting of negative findings, and scientists' need for training in statistical methods. Read it here.
The Economist on problems with scientific research
An excellent reflection on scientific methodological rigor (or rather a critique of it) was recently published in The Economist. Read it here.
University of Cambridge ranks third in QS World University Ranking 2013
According to the recently released QS World University Rankings 2013, the University of Cambridge ranks third behind MIT (#1) and Harvard (#2). Oxford is #6.
The first German university on the list is Heidelberg at #50, followed by TU Munich (#53) and LMU Munich (#65). The University of Hamburg ranks #186 and Kiel University #293.
Among Swedish universities, Chalmers is #5, ranking #202 overall.
What the academic peer review process can learn from the European patent system
The academic peer review process has recently been criticized. The current system works like this: Journal editors receive submissions and forward them for review, usually to between one and three reviewers. Then the paper goes through a number of rounds in which it is bounced back and forth between the editors, the reviewers and the authors. The authors are asked to alter their paper until the reviewers and editors are satisfied that it meets certain academic standards, which vary depending on the journal quality (e.g., whether it is an “A” or a “C” journal). However, the process appears rather closed and lacking in transparency. Here is a suggestion for how the review process could be altered to improve the quality of the finally published papers.
The European patent system has installed the so-called “opposition procedure”. Since there are many similarities between patents and academic papers, why not think about journals installing an opposition procedure in the academic peer review process? Papers, like patents, are usually expected to be new and hence to include some kind of inventive element. Thereby, each paper is unique or idiosyncratic and, among other similarities, both types are linked to certain kinds of intellectual property.
This is, in a nutshell, how the EPO patent opposition procedure works: After the grant of a patent has been announced in the European Patent Bulletin (which would be similar to announcing the acceptance of a paper), any third party can file an opposition, but only for a maximum of nine months after that announcement. How could such a system work in academia?
Today, most, if not all, journals have a website where one can access accepted articles. Some journals even announce articles in press on those sites or publish online versions of accepted papers prior to the upcoming publication date. Why not have a third article category on the websites where journals publish accepted articles that are available for opposition, or rather for third-party comments? Such a category might be labeled “accepted, available for opposition”. Only abstracts might be published there, with access to the full version available only by request to the responsible editor or after simple registration on the website. Then, any interested reader could be allowed to file oppositions against an article. The journals might want to include those papers in the news alerts that are mailed out, or simply allow readers to sign up for another news alert covering only articles in this category.
At which stage of the publication process journals should make articles available for opposition needs further discussion. This could be done fairly early, after a first R&R has been sent to the authors, assuming that most authors then stick with a journal. One might also argue that this should be done much later in the publication process, similar to the EPO opposition procedure, which allows oppositions only after a patent’s grant has been announced. I might favor the latter option.
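To make the proposed workflow concrete, the article statuses and the opposition window could be sketched as a small state machine. This is a hypothetical illustration only: the status names and the 90-day window are my assumptions, not part of any journal’s actual system (the EPO itself allows nine months).

```python
from dataclasses import dataclass, field
from datetime import date, timedelta
from enum import Enum
from typing import List, Optional

# Assumed length of the opposition window; the EPO allows nine months,
# a journal might choose a shorter period.
OPPOSITION_WINDOW = timedelta(days=90)

class Status(Enum):
    ACCEPTED = "accepted"
    OPEN_FOR_OPPOSITION = "accepted, available for opposition"
    IN_REVISION = "in revision after opposition"
    PUBLISHED = "published"

@dataclass
class Article:
    title: str
    status: Status = Status.ACCEPTED
    window_opens: Optional[date] = None
    oppositions: List[str] = field(default_factory=list)

    def announce(self, today: date) -> None:
        """Announcing acceptance opens the opposition window."""
        self.status = Status.OPEN_FOR_OPPOSITION
        self.window_opens = today

    def file_opposition(self, today: date, comment: str) -> bool:
        """An opposition is admissible only while the window is open."""
        if (self.status is Status.OPEN_FOR_OPPOSITION
                and self.window_opens is not None
                and today <= self.window_opens + OPPOSITION_WINDOW):
            self.oppositions.append(comment)
            self.status = Status.IN_REVISION
            return True
        return False

    def publish(self) -> None:
        """Once published, the article is 'carved in stone'."""
        self.status = Status.PUBLISHED
```

An opposition filed inside the window sends the article back into revision; one filed after the window closes is simply rejected, so the publication date is only at risk for the small fraction of articles that actually attract an admissible opposition.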
Such an additional procedure would open up the academic peer review system, allowing third parties to comment on papers in a pre-publication stage. This would help to avoid a situation that most of us have probably encountered. You browse through a news alert of a journal that is closely related to your area of expertise and come across a recently published article that catches your interest. You start reading it and recognize that the authors have missed certain important aspects or arguments, or lack relevant sources. By then, however, it is too late. Should you really bother to contact the journal editors and get into a difficult dialogue with them about an article that has already been published? Most of us would rather be careful not to accuse editors of a mistake in the review process, particularly if we hope to publish an article in that particular journal in the future. It might also simply be too late to make any changes to the article concerned, if it has been published and printed already. An opposition procedure would provide a relatively easy way to allow third parties to engage in a dialogue with an editor before a paper is ultimately published and “carved in stone”.
Now, let us briefly think about another important aspect. The peer review process has also been criticized for taking too long, and many journals are working on accelerating it. Would an additional opposition procedure counteract those efforts? Maybe. However, when journals announce articles in press, those papers usually await publication for several months anyway. In that period, an opposition procedure could be installed. Assuming that oppositions would be filed on only a fraction of all articles, the risk of prolonging the publication process seems small, or rather to exist only for a small portion of articles. The big benefit, however, is that for articles that are opposed, and where editors see the need to ask the authors for another revision based on a valid opposition, the article quality can be expected to increase. Hence, particularly those (A-ranked) journals that apply high academic standards might act as pioneers and install an opposition procedure. This process might actually not be labeled an opposition procedure, but rather named in a way that indicates more of a dialogue between the editors and potential readers, thereby opening up the peer review process.
Similar to the EPO procedure, the journals could even establish clear guidelines on which aspects of a paper can be opposed and propose ethical guidance on how an opposition should be phrased, in line with the aim of such a procedure, namely to improve academic publication quality. Modern ICT allows such a system to be installed at relatively low cost, although it might come with increased coordination effort for the editors, and for authors if they have to alter an article late in the publication process.
Retraction Watch is a blog to follow if you are interested in research ethics, and particularly in how the German Lichtenthaler case develops.
Kiel University (CAU) among top third of German universities in Shanghai Ranking
As usual, the top-ranking German universities come from the southern part of the country (e.g., TUM #1, Heidelberg #2, LMU #3). However, of a total of 38 German universities that made it into the Top 500, Kiel University reaches the upper third. It ranks between #9 and #14, depending on the field and subject. CAU ranks close to the University of Hamburg and above most of the German technical universities (e.g., Dresden, Berlin, Aachen, Darmstadt). Not so bad.
University of Cambridge ranks #5 world wide
The Academic Ranking of World Universities, also known as the “Shanghai Ranking”, released its 2013 ranking, listing the University of Cambridge at #5 overall. Cambridge is the first non-American university, after Harvard, Stanford, Berkeley and MIT. Among the top 10 universities there are only two non-US institutions (Cambridge at #5, Oxford at #10). The next non-US university is ETH Zurich at #20.
Across different fields, Cambridge makes it to #2 in Life and Agriculture Sciences. In engineering it is #14, in medicine #6, and in social sciences #16. Broken down further by subject, Cambridge ranks #17 in “Economics / Business”, just after one other non-US university (London School of Economics and Political Science, #13). In Mathematics it is #4, in Physics #8, in Chemistry #4 and in Computer Science #38.
Hence, Cambridge makes it to #1 in the UK, just ahead of Oxford and UCL.
Research Policy editorial on research ethics
Another recommendation from a senior faculty member at the AoM junior faculty consortium in Orlando turned out to be a good read. Research Policy published an editorial on academic misconduct written by Ben Martin in 2013, “Whither research integrity?”. It summarizes different types of misconduct and uses several explanatory cases. A version of that editorial is also available for download as a presentation here.
A must read for every junior faculty
During the TIM junior faculty session at this year’s AoM in Orlando I got a pretty good piece of advice: read this article. I just did and must agree. It helps to see things a bit differently. Highly recommended.
The Awesomest 7-Year Postdoc or: How I Learned to Stop Worrying and Love the Tenure-Track Faculty Life
Science editorial argues against impact factors
Bruce Alberts, Editor-in-Chief of SCIENCE, recently argued strongly against the use of impact factors and other quantitative measures for judging an individual scientist’s work. A short but convincing argument. Have a look here.
A must see video! Bias due to studies that do not get published
Not exactly innovation research, but probably relevant for any academic discipline: how our beliefs about empirical results are biased by studies that remain unpublished because they fail to find significant results:
Ben Goldacre: What doctors don’t know about the drugs they prescribe