This is a guest post by Demetris Koutsoyiannis
I fully endorse Donald W. Braben's statement, published in Times Higher Education and linked in Bishop Hill's post, that the current research councils' system for selecting proposals for funding is
fundamentally flawed and a pathway to mediocrity.
I wish to offer an example and a suggestion for an alternative procedure. The example offers additional evidence in support of the above statement. The suggestion is based on the approach Mother Nature follows in her own selection procedures, which always involve randomness, whose power has increasingly been recognized (cf. genetic, evolutionary and simulated annealing algorithms in optimization).
A. The example
I have put online my research proposal to the European Research Council (ERC), or rather its first part, which was the part reviewed, along with the anonymous comments of panels and individual reviewers. (I have not made the signed documents public, as those who signed them might not want them published.) My proposal was rejected twice by the ERC, in 2008 and in 2011.
The interested reader can access and assess the proposal and its reviews himself. In my reading, there is a marked improvement in the reviews of 2011 compared with those of 2008, whose comments were rather general-purpose "copy/paste" text. However, the result was the same in both cases: rejection of the proposal. Of course, the panel's recognition that:
The PI is extremely well known internationally, with an excellent publication and citation record and important international scientific responsibilities, and is recipient of prestigious awards (Review Panel).
is of little solace. Some of the reasons the panel invokes for rejection are, in my view, reasons that would justify funding the proposal. I quote a few examples below, each followed by my comments:
Many of the statements are more philosophical than technical, and no details are given on how the different activities will be connected (Review Panel).
The aversion of the Panel to philosophical issues and its clear preference for technicalities and details are really saddening.
While ambitious and potentially important, the PI bases none of this research on current understanding and wishes us to accept that brand new thinking is required in all aspects of the proposed work. I find it difficult to accept that decades of climate change and hydrologic research have proven nothing useful (Reviewer 2).
No, I do not wish this reviewer to accept the necessity of brand new thinking; I would not even try to convince the reviewer that research is all about improving, and often abandoning, current understanding. But I wish the ERC did not use rhetoric that is inconsistent with its practices. It is deceitful, on the one hand, to announce that it encourages groundbreaking research and, on the other hand, to use reviewers with such convictions about research.
[T]he proposed approach based on "a novel mathematical framework to quantify uncertainty in nature" including the development of "a new hydroclimatic theory" is not, as described, fully convincing (Reviewer 3).
Would a proposal whose summary (because just the summary was evaluated) was fully convincing for a typical reviewer really justify funding? Can groundbreaking research be fully convincing at the point of its announcement in the form of a summary?
The level of research funding seems patchy – despite the statement of “generous funding from Greek authorities” it seems that current funding is small (Reviewer 4).
In other words, only those who already have sufficient funding are eligible for funding. What an argument!
First, in order to promote one’s own research area it is not necessary to attack other research areas, especially when that attack seems unjustified (Reviewer 4).
It seems as if the reviewer dislikes attacks on established ideas. Can research be groundbreaking without attacking some established wisdom?
A second, and more serious, concern is the lack of specificity within the proposal, both concerning the tools that will be used and the data that will be exploited.
So now the reviewer has made crystal clear what innovative research is: it is a collection of tools and their application to data, provided that the collection and application have an appropriate level of specificity and, of course, the proposal does not make attacks. Furthermore, this research can be funded provided that the PI already has a large number of ongoing funded projects.
B. The suggestion
My suggestion for a better system is very simple and includes the following three steps:
1. Apply an initial screening of the submitted proposals, to exclude those that were submitted for fun, those prepared by random text generators, those overusing clichés and those copied from existing documents (today there are reliable tools available to detect the latter).
2. From the remaining proposals, select those to be funded by lot. Lottery is a reliable system; it is used by Nature in evolution, and it was extensively used and highly appreciated in the Athenian democracy. Of course the details of the lottery need to be carefully studied. The system could assign prior probabilities based on measurable and objective criteria, for example to penalize allocating all funds to the same persons or consortia and to reward past productive research efforts (a minimal sketch of such a weighted lottery is given after this list).
3. After the project has been commissioned, perform checks that the funds were spent properly and that the research produced accountable results (if not, ask for the money back).
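
To make step 2 more concrete, here is a minimal sketch of how such a weighted lottery could be implemented. The particular weighting criteria (a past-productivity score and a penalty for PIs who already hold substantial funding), and all names in the code, are my own illustrative assumptions; as noted above, the actual criteria would need to be carefully studied.

```python
import random
from dataclasses import dataclass


@dataclass
class Proposal:
    # Illustrative fields only; the real criteria would need careful study.
    pi_name: str
    past_productivity: float  # normalized 0..1 score of prior research output
    current_funding: float    # normalized 0..1 share of ongoing funded budget


def prior_weight(p: Proposal) -> float:
    """Assumed prior weight: reward past productive research and
    penalize PIs or consortia that already hold a lot of funding."""
    reward = 1.0 + p.past_productivity
    penalty = 1.0 + p.current_funding  # more existing funding -> lower weight
    return reward / penalty


def lottery(proposals, n_awards, seed=None):
    """Draw n_awards proposals by weighted lot, without replacement."""
    rng = random.Random(seed)  # a fixed seed makes the draw auditable
    pool = list(proposals)
    selected = []
    for _ in range(min(n_awards, len(pool))):
        weights = [prior_weight(p) for p in pool]
        # random.choices samples with replacement, so draw one winner
        # at a time and remove it from the pool.
        winner = rng.choices(pool, weights=weights, k=1)[0]
        selected.append(winner)
        pool.remove(winner)
    return selected


if __name__ == "__main__":
    screened = [  # proposals that survived the step-1 screening
        Proposal("PI A", past_productivity=0.9, current_funding=0.1),
        Proposal("PI B", past_productivity=0.4, current_funding=0.8),
        Proposal("PI C", past_productivity=0.7, current_funding=0.0),
    ]
    for p in lottery(screened, n_awards=2, seed=2011):
        print("Fund:", p.pi_name)
```

Drawing winners one at a time and removing each from the pool keeps the lottery without replacement, and publishing the seed would make the draw reproducible and auditable, which fits the checks envisaged in step 3.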
I contend that such a system is clearly superior to the current systems applied by research councils worldwide, because it will not block novelty and innovation, which naturally attack conventional wisdom and the establishment (cf. D. W. Miller, The Government Grant System: Inhibitor of Truth and Innovation?, J. Inf. Ethics, 16(1), 59-69, 2007).