
A response from Prof Hardaker

With commendable speed, Professor Hardaker, the CEO of the Royal Meteorological Society, has responded to my email asking for a statement of the Society's position on one of its journals standing in the way of an attempt to replicate a study published there.

Thanks for your note. I've had a couple of emails relating to this discussion and the position currently is as Prof McGregor mentioned. As I have mentioned to others who have emailed in, I'm very happy to consider the requirement for a clear policy statement and as such I have put this on the agenda for the next meeting of the Society's Scientific Publishing Committee, of which all the Editors of the Society's journals are members.

While I had hoped for at least some sort of recognition of the need for replication, it may be that Prof Hardaker feels he can't commit the Society to a new policy one way or the other without discussing the matter with his colleagues, and this is not an unreasonable position.

I would hope that the Society would start out from a position that every published study should be replicable and that both the data and code to do so should be in the public realm at the time of publication. The oft-cited policies of econometrics journals would be as good a place to start as any.

My concern would be that the committee ducks the issue by putting in place a bland policy stating that authors should make data available on request, or some such, which merely opens the question of what happens if authors refuse. Would the journal withdraw the paper? Would it sue the authors for breach of contract? Assuredly not. It is surely in the Society's interest to avoid having to deal with any of this kind of unpleasantness by ensuring that the data and code are handed over up front.

It will be interesting to see what they come up with.



Reader Comments (12)

It's pretty amazing that a scientific journal doesn't have a data policy. At least it's on the agenda. As you observe, econometrics journal policies show that such a policy can be put in place pretty easily. Journals often have webforms asking about financial interests. At a minimum, journals could have a webform asking authors to state that they have archived their data and to give a URL. They may or may not check it, but at least the author has warranted that the data are there, and this warranty can be used.
Jan 5, 2009 at 2:32 PM | Unregistered CommenterSteve McIntyre
Thank you for contacting this group. I feel that at this point any scientific or policy papers should be treated the same as stock offerings and not be allowed to be published until archiving requirements are met. That includes the theory, the data used to arrive at that theory, the procedures used to process that data, and the test procedures and results used to verify that theory. With this policy in place we would probably see a lot fewer papers published, and therefore need fewer editors and publishers.
Jan 5, 2009 at 2:46 PM | Unregistered CommenterMike Davis

I do wonder if this kind of thought process must be going through the heads of Prof Hardaker and his colleagues as well. If their data archiving policies are stringent, what will be the effect on the rate of submission of papers to their journals? I'm not sure what the impact rating of IJC is but it's presumably lower than Phil Trans B, so the fact that the Royal Society can enforce a tough policy doesn't necessarily mean that TMS will feel that they can do the same thing. This is not to say that they shouldn't - obviously they have a moral duty to do so - but it's possible that other considerations will carry more weight in the final assessment.
Jan 5, 2009 at 4:04 PM | Registered CommenterBishop Hill
There is no excuse these days for not archiving data and code. It is so easy to do and if done prior to publication, it ensures that the data and code are consistent with the paper and are there for ever. It costs the author very little in time or money.
Jan 5, 2009 at 5:36 PM | Unregistered CommenterPhillip Bratby
Well done Bishop!
Jan 5, 2009 at 6:29 PM | Unregistered CommenterTonyN
I have been involved in the preparation of many prospectuses for fund-raisings in the Australian equities market. A standard requirement these days is that the preparation of EVERY prospectus issued include preparation of a Verification File that provides the evidence for every material statement made in the prospectus, and demonstrates the reasons that the Directors of the company feel justified in making that statement.

The Verification File is kept as part of the supporting documentation for each Prospectus, and can be relied upon by Directors in the event of subsequent legal action as proof that they undertook "reasonable man" due diligence that the statements that they are making are correct.

It is strange, especially given the history and philosophy of science as taught in schools and universities, that comparable Verification Files are apparently not required in many areas of science. Does that really mean that commercial practitioners are adhering to higher professional standards than scientists?
Jan 5, 2009 at 7:21 PM | Unregistered Commenterstoreman norman
Storeman norman

I don't know about Oz, but the history and philosophy of science are not part of the UK science curriculum any longer.
Jan 5, 2009 at 7:26 PM | Registered CommenterBishop Hill
Storeman norman

In the nuclear industry that I was familiar with, a verification file was standard practice. It kept everyone honest and ensured all methods, data and code were archived.
Jan 5, 2009 at 7:38 PM | Unregistered CommenterPhillip Bratby
So, it's a little victory for one of the self-appointed, secret (you are anonymous) policemen of science....

I don't regard scientists as guilty until proven innocent by anonymous self-appointed secret policemen.
Jan 5, 2009 at 10:20 PM | Unregistered CommenterPH

What a strange comment. Who does assume scientists are 'guilty until proven innocent'? Not me. It is true to say, though, that data and code funded by the taxpayer should be public: because the public has a right to information it has paid for, because science advances faster when there is access, and because it keeps science honest. David Goodstein, of the California Institute of Technology, has commented that the possibility that someone will try to replicate a piece of work is a powerful disincentive to cheating. Fraud is not unknown in science, in case you hadn't noticed.

I should also point out that the correspondence between myself and the RMS was made under my own name.
Jan 5, 2009 at 10:42 PM | Registered CommenterBishop Hill

Bishop Hill, on his internet blog, under his own name, is an anonymous "secret policeman"?

What you been smokin, dude?
Jan 6, 2009 at 2:12 AM | Unregistered CommenterPhilH
I'm a programmer in the pharmaceutical industry, and we are legally obliged, when submitting a new drug application, to provide our data and algorithms (not necessarily code) to the FDA so that they can replicate our statistical analyses.

One wouldn't expect the FDA to make a decision about the safety and efficacy of a new drug based solely on the drug company's own claims - clearly there has to be at least the possibility that every result might be checked, to remove any incentive to bend the truth.

Similarly, one wouldn't expect a government to make decisions with even more serious impacts without similar replication and verification.
Jan 6, 2009 at 9:06 AM | Unregistered CommenterChris Long
