Saturday, Apr 2, 2011

Nature Climate Change launches

Nature's long-awaited climate change journal has finally launched. It looks as though it's all free in this first issue, so why not take a look?

This article about openness in climate change research was interesting, with several familiar faces interviewed. There is something about the tone of the piece that makes me uncomfortable though - perhaps a slightly promotional feel?


Reader Comments (35)

Just a list of special pleading: we're too tired! the computers broke! it's the wrong code/data/file system! Miss, they're copying me work!!
"One problem, Carlson notes, is that people who have worked hard to collect and organize their data are often reluctant to give it away. They worry that if they hand it out, someone may use it to publish a paper that the original researcher would eventually have written, given more time".
Sorry, am I wrong, but no one is asking for unpublished data from unwritten papers, are they? I thought this was all about the material that got into print in journals with data-openness policies?

Apr 2, 2011 at 9:22 AM | Unregistered Commentermat

But it is urgent: if we don't get those papers written right now, the earth will explode. So how can they be so selfish as to hold back the data and delay saving the world?

Apr 2, 2011 at 9:46 AM | Unregistered CommenterThe Englishman

I've got a couple of observations here:

(1) Nick Barnes et al seem quite keen to port the climate code from Fortran to Python. Is there any justification for this? Fortran is an old scientific language, but it does the job it was designed to do, namely 'FORmula TRANslation'. Python is a dynamic language that is fashionable amongst web developers (e.g. Google), but is it the right tool for scientific computation? Moreover, I believe Fortran is still widely used in, for example, geophysical seismic processing.

(2) Steve Easterbrook writes: "Particularly with complex pieces of software, it takes a lot of work to make it available in ways that people can understand it."
So, a note for Steve Easterbrook: modern commercial software development accepts that it takes a lot of effort to produce software that people can understand. In particular, the fashionable methodology of Test-Driven Development (TDD) results in a lot of code whose only commercial value is to test the software and make it understandable to others. Yet that apparently extra effort is judged worth it from a commercial perspective. Could this approach actually benefit your software development effort?
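For illustration, here is a minimal sketch of the kind of test TDD produces (the anomaly function is hypothetical, my own invention, not taken from any actual climate code):

    import unittest

    def annual_anomaly(monthly_temps, baseline_mean):
        # Mean of twelve monthly temperatures minus a baseline mean.
        if len(monthly_temps) != 12:
            raise ValueError("expected 12 monthly values")
        return sum(monthly_temps) / 12.0 - baseline_mean

    class TestAnnualAnomaly(unittest.TestCase):
        def test_flat_year_has_zero_anomaly(self):
            self.assertAlmostEqual(annual_anomaly([14.0] * 12, 14.0), 0.0)

        def test_rejects_incomplete_year(self):
            self.assertRaises(ValueError, annual_anomaly, [14.0] * 11, 14.0)

    if __name__ == '__main__':
        unittest.main()

Tests like these never ship to any "customer", but they document what the code is supposed to do and catch regressions when it changes.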

Apr 2, 2011 at 10:09 AM | Unregistered Commenterandyscrase

To some extent, I can appreciate the first "sticking point":
"One problem, Carlson notes, is that people who have worked hard to collect and organize their data are often reluctant to give it away. They worry that if they hand it out, someone may use it to publish a paper that the original researcher would eventually have written, given more time"
This has occurred in paleontology with the Burgess Shale fossils. Charles Walcott's expeditions excavated huge numbers of fossils between 1909 and 1924, many of which were not studied during his lifetime, simply because he had insufficient time. They were regarded as "his fossils", and fellow paleontologists were unwilling to poach on his territory, as it were; some fossils were described and published, others not.
However, how long should such primacy be "respected"?
Shades of the Keenan/Queen's University Belfast arguments.
Another point:
"Scientific standards
By traditional measures climate science is already open. Researchers publish their results in peer-reviewed journals; they share data with one another; they present at conferences and collaborate on projects"
Well, we've seen that some climate scientists are happy to share data (even data subject to NDAs) with those who share their views, but not with those who may not.
We've also seen what degree of peer review is applied to some papers (where "pal review" is a more appropriate term) compared to others, and that they'll collaborate with some, while collaboration with others may "harm their careers".
Schmidt's argument that it takes longer holds no water with me.

Apr 2, 2011 at 10:31 AM | Unregistered CommenterAdam Gallon

Is "Nature's long-awaited climate change journal" a cunning plan to keep all that rubbish out of Nature proper in future, and thus try to recover the reputation of that journal?

Apr 2, 2011 at 11:10 AM | Unregistered Commenterdearieme

@andyscrase

Python is already being used instead of Fortran as a language of choice in undergraduate science courses. (e.g. see this link for physics undergraduates at Durham University).

It is well supported with a rich set of libraries and development tools, all of which are freely available on the web, so it's a good choice for open source development.

There are also many programmers with Python skills on the professional market, so people like Gavin Schmidt, who aren't motivated to write code that is independently usable or testable, could use some of their government research money to hire professionals to do it for them.

Apr 2, 2011 at 11:28 AM | Unregistered CommenterR2

The article by Parmesan et al on "attribution" of biological changes to climate change is important. The authors recognise that establishing causality is difficult, but then they say it should be done through modeling. Their explanation of how to do this is bafflegab. Moreover, modeling cannot establish causation until the models' predictive capabilities have been successfully tested; in the study of climate, this entails time scales longer than one's lifetime. The task therefore becomes difficult and immediate conclusions become useless.
It is perhaps a positive step in that the IPCC approach to attribution (causation) is criticised.

Apr 2, 2011 at 11:37 AM | Unregistered CommenterMorley Sutter

Even though Schmidt says that greater data-sharing is good for science, he's not sure how much effort should go into making it easy for people to simply re-run his code. It's more important that other climate modellers can use their own models to replicate his results. “Of all of the things that I can do that are important, is allowing reproducibility of my code on somebody else's computer important? No, that's not important,” he says.

Schmidt sums up his attitude to science.

Apr 2, 2011 at 12:16 PM | Unregistered Commentergolf charley

@andyscrase

I did a lot of coding at university in FORTRAN, a long time ago. I now use Python for a few things, albeit mostly as a hobby, but doing useful things nonetheless. From what I've seen, Python and the resources it brings would be a great contribution to the analysis of climate change data.

Apr 2, 2011 at 12:49 PM | Unregistered CommenterRob Schneider

The article is meaningless. If it were published in a reputable science journal it would have some validity, but because it is published in a campaigning journal concerned with global warming and environmental alarmism it is worthless.

The article is further devalued by the ramblings of Gavin Schmidt, a member of the Fiddlestick Team.

Our Gav just don't get it:
If it looks like a Climate Scientist
Talks like a Climate Scientist
Then it's a Quack.

Apr 2, 2011 at 1:04 PM | Unregistered CommenterStacey

I was a bit amused that the graphic in the nature article.

http://www.nature.com/nclimate/journal/v1/n1/fig_tab/nclimate1057_F1.html

Appears to have been manually manipulated - the base line is labelled 0.6 when it should be 0.0? I don't think software would have produced this mistake, someone has been adjusting the axis' manually.

If you look at the referred to source image here http://climatecode.org/blog/2011/02/nature-figure/ the figure is correct!

Its a beautiful example of shoddy work getting published, clearly without any proper peer review.

The whole article is about transparency and then they manipulate their own "evidence"! The difference is pretty harmless, but what else have they fiddled with for the sake of "presentation"

" Advocates for greater transparency make three basic arguments for publishing code and data: it will make the science replicable, it will make scientists accountable and it will improve the overall quality of the science."

That's true- how about Nature explain how and why the graphic was manipulated for this article?

Apr 2, 2011 at 1:07 PM | Unregistered CommenterMark Cooper

Mark

I respectfully point out that your post contains some mistakes. For example, incorrect use of a question mark at the end of sentance that starts with "Appears to have been manipulated...", incorrect apostrophe on the word "axis", missing apostrophe in line starting "its a beautiful example...". I'd guess there's a good chance I've included some typos myself in typing this.

You highlight "manipulation" as a demonstration of lack of peer review. I only see a lack of proof reading. I guess it comes down to how much you believe in conspiracies.

Apr 2, 2011 at 2:02 PM | Unregistered CommenterPeter Brown

Mark,
Good catch there. The figure is from Nick Barnes et al.

You can see a detailed exposition about this figure here, and you can read about it a bit in this thread as well. Interestingly enough, on their own site Barnes et al seem to have a correct graph.

This entirely defeats the whole purpose of demonstrating dynamically generated graphs by running code - as opposed to using commercial software to create snapshot graphs. How can there be 'typos' of this kind in dynamically generated graphs? It is difficult to not conclude that someone has 'touched' the graph in some way, post-production.

Although I support Nick's efforts in relation to code and data access, it was surprising to see how Nick argued at length that 'all modern science is computer models' (which I take to mean 'all modern science involves computer modelling to various extents') to imply that, if we take most science at face value, we should a priori do so with climate models as well.

Secondly, it was mildly disappointing to see Nick argue vehemently, offering an (uncalled-for) broad-brush criticism, that sceptics have no right or business asking climate scientists for data, as all they do is indulge in conspiracy-mongering.

The Nature article does something related as well. Long-running sceptics (McIntyre the most prominent amongst them, but also the Bishop, Mosher, Jeff Id, Keenan... there are quite a few) have been very vocal, from day one, in demanding access to data and methods (for *published* papers, of all things). The whole saga of Climategate and the Jones urban heat island story can be traced back to *requests for data*. The climate science 'community' response was to rationalize not providing data or helping with methods *because* those asking were sceptics. Today, this article gives so much space and so many column inches to people inside that same community on data sharing, and there isn't a single word about sceptics and their contribution to things reaching this stage.

Hmm...I wonder why that would be.

Apr 2, 2011 at 2:07 PM | Unregistered CommenterShub

Peter Brown,

Point changes in graphs such as these, generated on the fly, are to be considered 'manipulated' unless it can be proven otherwise.

Whole-image manipulation (sharpening, brightness and contrast, etc.) comprises 'allowed' changes in scientific image generation. Deletion, smudging or regional retouching is a strict no-no. In fact, many journals employ specifically written software to detect manipulation of this kind. This is especially true in earth imaging and the biological sciences.

By this, I am not implying that Barnes et al/Nature Climate Change did something of the same order, but the end product falls in the same class.

Apr 2, 2011 at 2:13 PM | Unregistered CommenterShub

Let us be quite clear on who the editorial staff are. Paraphrased from the magazine website:

“Chief Editor:
Olive Heffernan
Her research has focused on biodiversity, sustainable fisheries management and climate impacts on cod.

Senior Editor:
Jo Thorpe
Jo spent two years as a scientific adviser in the UK civil service, first in the Department of Energy and Climate Change and then supporting the Government's Chief Scientific Adviser.

Associate Editor:
Alastair Brown
Alastair was based with the UK Climate Impacts Programme (UKCIP) at the University of Oxford. Alastair studied environmental science for his first degree, followed by a Masters in global environmental change, both at the University of Plymouth.”

So a single hymn sheet will suffice?

Apr 2, 2011 at 2:26 PM | Unregistered Commentersimpleseekeraftertruth

The question of the future value of an investment in collecting data is a vexing one. Assuming that error in a publication is sometimes detectable only with reference to the data themselves, there should be some value in being helped to avoid a career based on misinterpretation. I once worked with a gentleman, now departed, who sat on a dissertation committee at a US university in an experimental science much immersed in statistics. He was infamous for taking the candidates' data and demonstrating that not only did it not support their assertions, but often supported something entirely different.

Given that one might find his career handed to him in a basket, isn't it really better to be found out earlier rather than later, while there is time to recover?

And this is without even considering the careers of others who might base their work on the assumption of reliability of your cloistered efforts.

The greater good has to come with publishing the data.

Apr 2, 2011 at 2:27 PM | Unregistered Commenterj ferguson

This seems to be the Python code that generates the graph:

# Build the stationplot command line, substitute the three dataset
# labels, then split it into an argv list for stationplot.main().
cmd = ("stationplot -o nature.svg --axes=yyr --offset 0,-0.2,0 "
       "-c rscale=6000;yscale=300;ytick=0.2;legend=none;buginkscapepdf=1 "
       "-s 0.01 -y -d nature.v2 %s %s %s" % tuple(labels))
stationplot.main(cmd.split())
print "generating PDF..."

Apr 2, 2011 at 3:00 PM | Unregistered CommenterShub

I use Python a lot in my own job - it's a very high level language, easy to understand and code, certainly far easier to program in than Fortran that I also use on occasion. Python is particularly good for data manipulation / parsing files and text etc. I'm sure climatologists would love it!
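For instance, a few lines suffice to read a whitespace-delimited station file into a usable structure (the file name and format here are invented, purely for illustration):

    # Parse lines like "STATION_ID YEAR VALUE" into per-station records.
    records = {}
    with open('stations.txt') as f:
        for line in f:
            fields = line.split()
            if len(fields) != 3:
                continue  # skip blank or malformed lines
            station, year, value = fields[0], int(fields[1]), float(fields[2])
            records.setdefault(station, {})[year] = value

The equivalent Fortran would be several times the length, with format statements to match.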

Apr 2, 2011 at 3:19 PM | Unregistered CommenterAlan

It is far less than we all hoped for; indeed, it is far less than they could have hoped for.

Apr 2, 2011 at 3:59 PM | Unregistered CommenterPascvaks

Alan

Python is particularly good for data manipulation / parsing files and text etc. I'm sure climatologists would love it!

Python is just another interpreted language, like Basic. True, it is a very easy language to understand and write, but when you are doing serious numerical calculation, such as is done in complex regression analysis, you really want every bit of floating-point precision you can get. In the several implementations of Python I have looked at, "float" variables are 32 bit, not the minimum 64-bit or 128-bit floating point you need.

Now, don't get me wrong. Python is very nice for the uses you list, perhaps easier to use than PHP, but all things considered it is NOT a computational language. Fortran is the language of choice for that, as are some forms of C++ that support double and quad floating-point precision.

Apr 2, 2011 at 4:04 PM | Unregistered CommenterDon Pablo de la Sierra

I take the above back. I looked at the Python 2.7 specification and I found:

numbers.Real (float)

These represent machine-level double precision floating point numbers. You are at the mercy of the underlying machine architecture (and C or Java implementation) for the accepted range and handling of overflow. Python does not support single-precision floating point numbers; the savings in processor and memory usage that are usually the reason for using these is dwarfed by the overhead of using objects in Python, so there is no reason to complicate the language with two kinds of floating point numbers.

Mea culpa -- should have looked at the latest and greatest. Python is okay if it is a double-precision (64-bit) version.
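For anyone who wants to check their own build, a quick look at the interpreter settles it:

    import sys
    # 53 mantissa bits is IEEE 754 double precision (64-bit)
    print(sys.float_info.mant_dig)   # 53 on any standard build
    print(sys.float_info.epsilon)    # ~2.22e-16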

I tried posting this a few minutes after I posted my last posting, but the great and terrible TIME OUT bug bit.

Apr 2, 2011 at 6:28 PM | Unregistered CommenterDon Pablo de la Sierra

Don Pablo,

I understood that Python used 64-bit floats by default, and had library support for arbitrary precision (e.g. gmpy). Were you thinking of the precision of integers?

And I seem to remember reading somebody's Fortran climate code using integers to represent temperatures times 100, so you only ever got two decimal places.

Python lets you put wrappers around C/C++ libraries, so you can do anything they can do, with the additional advantages of clearer code, more flexible and powerful structured programming, and better tools. Python is by no means a panacea, and I don't think I'd have picked it myself, but Fortran is simply like something out of the Ark in comparison.
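To illustrate the integer-scaling point, a toy example (my own sketch, not taken from any real climate code):

    # Store temperatures as integer hundredths of a degree.
    def to_hundredths(temp_c):
        return int(round(temp_c * 100))

    def from_hundredths(t):
        return t / 100.0

    t = to_hundredths(14.6789)    # -> 1468; everything past two decimals is gone
    print(from_hundredths(t))     # 14.68

Fine for storing instrument readings; less fine if intermediate calculations are carried out at that precision.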

Apr 2, 2011 at 6:42 PM | Unregistered CommenterNullius in Verba

Oops. Should have refreshed before posting.

Apr 2, 2011 at 6:45 PM | Unregistered CommenterNullius in Verba

And I should have checked before posting :) I used to use an earlier version of Python that was 32-bit floating point. That was years ago. I haven't used it for many years (it is a very old language, actually) and use C++ most of the time.

Apr 2, 2011 at 7:12 PM | Unregistered CommenterDon Pablo de la Sierra

"By traditional measures climate science is already open."

I've got this low-cost, famous BRIDGE I'd like to sell you....

"It is dangerous to be sincere unless you are also stupid."
--George Bernard Shaw

Apr 2, 2011 at 9:26 PM | Unregistered CommenterPeter D. Tillman

Of course, by "traditional", NCN means the trad Phil Jones policy, "Why should I give you my data when all you want to do is find something wrong with it...."

Apr 2, 2011 at 9:28 PM | Unregistered CommenterPeter D. Tillman

Their "about" page is somewhat curious. Contains a list of 21 "topics" - including "Paleoclimate*". Beneath the list one finds:

*Nature Climate Change will publish cutting-edge research on the science of contemporary climate change, its impacts, and the wider implications for the economy, society and policy. Thus, while we certainly appreciate the importance of palaeoclimate research in its own right, we can only consider for publication palaeoclimate studies that shed significant new light on the nature, underlying causes or impacts of current climate change.

Is this the beginning of the end for treemometers? Or a signal that from their perspective, the "controversy" is over and they will brook no challenge?!

Apr 2, 2011 at 10:15 PM | Unregistered Commenterhro001

hro001

Re brooking no challenge, that is an interesting question. I suppose we shall see in due course.

Apr 2, 2011 at 10:58 PM | Unregistered CommenterBBD

Everybody knows the difference between a scientist and a "scientist":

A scientist explains to people what he did. A "scientist" hides what he did.

Apr 3, 2011 at 12:43 AM | Unregistered CommenterHans Erren

I would like to emphasise the importance of publishing BOTH data and CODE.
Steve McIntyre laboured for years trying to replicate the work of various warmists.
His speciality is auditing both code and methods.

Far too often the published results do not flow from the description of how the work was carried out.
To check that the results are valid, you need to be able to verify that the code does what the description says it does (checking for errors),
and also to be able to check that the experimental design is valid.

Raw data and results are nowhere near good enough unless accompanied by code and metadata.

Apr 3, 2011 at 4:42 AM | Unregistered CommenterAusieDan

I have just picked up the bit from the "about" section of the new "Nature Climate Change", where it says:
QUOTE
we can only consider for publication palaeoclimate studies that shed significant new light on the nature, underlying causes or impacts of current climate change.
UNQUOTE

Now, everybody understands that the climate always changes.
But what is all this about "current climate change"?
Have I missed something?
Has anybody identified anything happening in the climate that has not occurred many times before?

And what is all that about "palaeoclimate studies"?
Has there been a recent breakthrough that makes this academic exercise suddenly relevant to the study of climate?
No? I thought not.

So what is the reason for this new journal then, if it is not about climate studies?

Apr 3, 2011 at 4:55 AM | Unregistered CommenterAusieDan

Quite so, BBD ... we shall certainly see in due course (not to mention the fullness of time)!

But there are further curiosities in the articulation of their paleoclimate exclusionary caveat (for want of a better phrase). They appear to be interested only in "contemporary" and "current" climate change.

YMMV (as may that of others) but this suggests to me that Nature Climate Change is doing its level best (no pun intended) to move the goal posts so that all they will publish are papers that reinforce the mantra that it's worse than they thought and happening faster than they thought, so we must act now!

IOW, it's BAU for anyone who aspires to publish in big Nature.

Apr 3, 2011 at 10:55 AM | Unregistered Commenterhro001

From the Ecclesiastical Uncle, an old retired bureaucrat in a field only remotely related to climate, with minimal qualifications and only half a mind.

WUWT had a thread (Mar 29 2011) on Pidgeon & Fischhoff's article Communicating Climate Risks, and I made a contribution (Mar 30 2011, 9.51am). Afterwards I regretted not adding the following:

Two polemicists’ lament on the difficulties of their calling, presumably of comfort to Bob Ward in the UK and Joe Romm in the US.

So I have done it here.

Apr 3, 2011 at 12:36 PM | Unregistered CommenterEcclesiastical Uncle

If the Quacks have taken over the editorial board at the Science mag then God help science.
" There is only Physics, Stamp Collecting and Climate Change Science."
Apologies to Lord Rutherford.

Apr 3, 2011 at 1:22 PM | Unregistered CommenterStacey
