
Myles' fludd

Lots of people pointing to the Richard Black posting on floods. This includes papers by such familiar names as Myles Allen. No time to comment myself, but here's a thread for those that want to discuss it.


Reader Comments (90)

Quelle Surprise!

Even Gavin Schmidt thinks these papers are jumping to conclusions.

Feb 17, 2011 at 1:58 PM | Unregistered CommenterMac

And Tewkesbury in 2007 was at about the same level as 1760.
Ask most of the riparian owners up the Warwickshire Avon and they will tell you they see a direct correlation between flood levels from Stratford to Evesham and development in the flood plain and in the river's general catchment area.
I know of places in Central Scotland with similar problems. Where you build 2000 houses on a hillside you get flood problems at the bottom of the hill. You also get localised drought problems in dry summers.
This is not rocket science nor is it something that computer modellers and "environment correspondents" appear to understand.
Those of us who have been directly affected are, of course, just talking rubbish.

Feb 17, 2011 at 2:10 PM | Unregistered CommenterSam the Skeptic

One of the comments under Schmidt's article on RC spells out the game very clearly. Stuff science then - go for the amygdala.

Edward Greisch says:
17 Feb 2011 at 8:41 AM

A whole bunch of big storms, floods, droughts and fires are things that can invoke the fear necessary to get action on GW.

Feb 17, 2011 at 2:14 PM | Unregistered CommenterSayNoToFearmongers

As soon as I saw this on the BBC News (did it make the ITV version, I wonder?) I thought that we wouldn’t have heard a peep if Myles Allen’s models had revealed no difference between his hypothetical atmosphere and the one ‘as it actually was’.

He would have been diverted to other duties PDQ as well...

Feb 17, 2011 at 2:44 PM | Unregistered CommenterJames P

The way this science was pushed and how it is now being negatively received will make it a lot harder in the future to attribute extreme or single weather events to climate change. Initial scepticism will be turned into outright cynicism on such future pronouncements.

Alas, it would appear one step forward, two steps back for the alarmists on this one, and all within a day.

Feb 17, 2011 at 2:45 PM | Unregistered CommenterMac

"Mercury is nasty stuff you know (except when they put it in a flu vaccine of course)"

Or teeth...

Feb 17, 2011 at 2:48 PM | Unregistered CommenterJames P

Roger Pielke Jnr is not impressed either.

So Schmidt, Curry and Pielke are lining up against these papers' findings.

Crikey - who would have thought!

Feb 17, 2011 at 2:50 PM | Unregistered CommenterMac

I suppose the amount of BS that can be produced by Garbage In – Gospel Out computer models is endless, but this particular lump stinks to high heaven.

Rainfall in the UK for the year 2000 = 1232.4mm, ranked 243 out of 245! Lots of precipitation! However, it was wetter in 1768 and 1872.

Rainfall in the UK for the year 2010 = 812.3mm, ranked 46 out of 245! Not a lot of precipitation. WOW, a 34% reduction – how long before somebody attributes it to – guess?

Just for interest, not that I think it has got anything to do with it:

Atmospheric CO2 in 2000 = 369.40ppm
Atmospheric CO2 in 2010 = 389.78ppm

PS, take a look at the way UK CET temps are going, since the “watershed” of 2000
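The size of the drop between those two annual totals can be checked in a couple of lines (a quick sketch; the rainfall figures are simply those quoted above):

```python
# Percent reduction in UK annual rainfall, 2000 vs 2010 (totals as quoted).
rain_2000 = 1232.4  # mm, UK total for 2000
rain_2010 = 812.3   # mm, UK total for 2010

pct_change = (rain_2000 - rain_2010) / rain_2000 * 100
print(f"Reduction 2000 -> 2010: {pct_change:.1f}%")  # prints 34.1%
```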

Feb 17, 2011 at 2:52 PM | Unregistered CommenterGreen Sand


It rained and it kept on raining
And folks to the roof ‘ad to climb
They said ‘twas the rottenest summer
That Bury ‘ad ‘ad for some time...

(from Three Ha’pence a Foot, one of Stanley Holloway’s monologues, and played regularly by me as a child, on a 78rpm record)

Feb 17, 2011 at 2:57 PM | Unregistered CommenterJames P

"a phenomena that influences precipitations in a global scale - and the only thing we can think of is the changing composition of the atmosphere."

Does it seem bizarre to anyone else that these forecasts of doom depend on the breadth of imagination of the investigators?

And especially if the thing isn't actually happening - ie increase in severity of floods etc?

"We cannot think of any other cause?" Maybe it's who is asked to think of another cause?

Feb 17, 2011 at 3:01 PM | Unregistered Commenterj ferguson


I am not sure where you got this:

Strong La Niña conditions persist across the tropical Pacific. Computer models surveyed by the Bureau suggest the current La Niña event will persist into the southern hemisphere summer. For routine updates and comprehensive discussion on any developments regarding El Niño and La Niña, please see the ENSO Wrap-Up.

But it does point to one of the major drivers of world climate and yearly weather. El Niño and La Niña are really cyclic changes in the currents of the Pacific Ocean, which are in turn driven by solar heating. Both were known to cause devastating floods back in the days of the Incas in Peru.

CO2 has about as much effect on those currents as does the wake from the fish swimming in it.

If you could, would you post the URL? I would appreciate it.

Feb 17, 2011 at 3:09 PM | Unregistered CommenterDon Pablo de la Sierra

From Lubos Motl -The Reference Frame:

Japan wastes 80 billion dollars on global warming biomass boondoggles.

Nincompoops in DECC are also urging this crap to be adopted in the UK.

Feb 17, 2011 at 3:41 PM | Unregistered CommenterBrownedoff

Don Pablo

In case Jerry's not around, it's here:

Feb 17, 2011 at 4:02 PM | Unregistered CommenterBBD

The 9:26AM comment by Geronimo with the paper he cites by Czymzik et al. is interesting - it provides the sort of data needed to evaluate models. Contrary to what some people here write, there's nothing wrong with computer models. But they do need to be evaluated.

One sentence from the Myles Allen paper is also interesting - "So does further evaluation of our modelling set-up, although evaluation of extreme event statistics is hampered by limited historical records." If one were being uncharitable, that might be taken to read that they haven't done much in the way of evaluation... Also, their model leads to quoted percent probabilities - but those hold only within the range of model and parameter variables they have explicitly considered. Some evidence that the range of models and parameters covered is actually correct would be nice. In the absence of such checks, the reported alarmist results are all but meaningless.

Yet it is all over the media again - and no one in Chris Huhne's department will care about the 'yes buts' emerging from places like this.

Feb 17, 2011 at 4:12 PM | Unregistered Commenterj

Don Pablo

If you are interested, Bob Tisdale keeps a very close eye on the progress of ENSO events (and much else besides). Best place I know for a clear look at what is going on in ocean surface temps and OHC.


Agreed. Very interesting post. Thank you.

Feb 17, 2011 at 4:15 PM | Unregistered CommenterBBD

It would appear that Myles used the Met Office model that is used for seasonal probabilities/predictions/forecasts. That in itself is problematic, considering all the caveats the Met Office now has in force on using this particular model.

It also transpires that the methodology used is completely new and unique - now that is even more problematic. This is how Steig, Jones, Mann and others got into trouble.

Feb 17, 2011 at 4:24 PM | Unregistered CommenterMac

Anyone who has visited delightful Bodiam Castle in Kent, might have been astonished, as I was, by the information plaque which overlooks the river, beyond the tea shop.
All sorts of interesting stuff about the history of the place, and the fact that the river (flowing quietly some five feet below where you stand to read the plaque) was used some 400 years ago to transport materials for iron smelting.
Then - WHAM..!! At the end, whoever wrote the info implies that you'd better enjoy where you're standing now, because due to global warming, in a few years time the river will be overflowing..!
Well - I retreated quickly, I can tell you, in case I got my feet wet...

Feb 17, 2011 at 4:53 PM | Unregistered CommenterDavid

Coffee just met keyboard, but you're more likely to find a tree sloth at rest than in a hurry


Feb 17, 2011 at 5:49 PM | Unregistered CommenterRobbo

So Schmidt, Curry and Pielke are lining up against these papers' findings.

Maybe the object was simply to provide headlines. I wonder who the reviewers were ...

Feb 17, 2011 at 5:56 PM | Unregistered Commentermatthu

Flood levels in the Loire basin seem to be declining, even allowing for the impact of the Villerest barrage, completed in 1982.

"The highest floods at Orleans took place in June 1856 (maximum height 7.1 m), in 1866 (6.2 m) and in 1846 (6.78 m). The highest flood of the XXth century was that of 1907 (5.25 m). Previously, those of 1707 and 1790, of similar heights, were the record floods.",les-crues,86593.html

Feb 17, 2011 at 6:30 PM | Unregistered CommenterDreadnought

If anybody can stand it, the papers were discussed on Material World this afternoon. All nonsense (evidence from models) and political alarmism.

Feb 17, 2011 at 6:54 PM | Unregistered CommenterPhillip Bratby

One can read Kipling's 'The Land' for a picture of how the British landscape has been managed since Roman times.

(OK, neither scientific nor great poetry, but still...)

Feb 17, 2011 at 9:01 PM | Unregistered CommenterDR


Go raibh maith agat.(Thank you in Gaelic)

Feb 17, 2011 at 9:03 PM | Unregistered CommenterDon Pablo de la Sierra

From JoNova's site quoting The Australian:

LABOR has cut budget estimates to meet the cost of future natural disasters while simultaneously arguing that climate change is increasing the frequency of floods and cyclones.

Budget documents show Labor has allocated $80 million a year for the next three years — $23m less than in the last Howard budget and far less than the $524m spent last year.

Go figure!

Feb 17, 2011 at 9:13 PM | Unregistered Commentermatthu

Go figure. Numbers speak louder than words, I guess.

Feb 17, 2011 at 9:28 PM | Unregistered CommenterMartyn

Don Pablo

De nada ;-)

Feb 17, 2011 at 9:41 PM | Unregistered CommenterBBD

Interesting that the 'News and Views' comment in Nature by Richard Allen is accompanied by a photo of the floods in York in 2000.

These floods reached a record high. They beat the previous record by one inch. That record was apparently set at the height of AGW in 1625!

Feb 17, 2011 at 9:41 PM | Unregistered CommenterDave Andrews

I visited Bodiam castle in about 1986 (you're right--delightful).
Too bad I didn't record the height of the river; we could compare to your recent observation, extrapolate linearly--no, exponentially! (why scrimp?)--and precisely predict the date of its watery doom.

Feb 17, 2011 at 10:40 PM | Unregistered CommenterDave Bob


Don't wish to sound nit-picking, but I think you will find that the "Muckle Spate" to which you refer was in August 1829. It affected all rivers in NE Scotland, not just the Findhorn. In fact it was the first of a series of floods that affected the area throughout the 19th century. Hydrologists have long observed that the flood record for most UK rivers includes intermittent "flood rich" periods. These have been tentatively linked to the NAO. Stick that in a computer and model it.

Feb 18, 2011 at 3:36 AM | Unregistered Commenteremckeng

I happened to be in the UK at the start of the 2000 floods, and drove from Bibury in the Cotswolds through York on the way to Newcastle. There was certainly a lot of flooding, but there is no way that I can have any confidence in model reconstructions to compare these or the present with the past, when so much of the dynamics of floods depends upon (a) rainfall in specific catchments and (b) the nature of the soil, roads, houses, stormwater systems, abstractions, etc in each catchment. Even if they have (a), they can only guesstimate (b), and that's where all kinds of subjective factors can find their way into models.

Feb 18, 2011 at 7:24 AM | Unregistered CommenterAynsley Kellow

It transpires that the uncertainties in the science overwhelm the certainties of the headlines.

This was made-to-order AGW science by Nature. Not done to inform but to mislead.

How embarrassing for all concerned, especially Myles Allen and Gabriele Hegerl who are supposed experts on attribution and uncertainty.

Feb 18, 2011 at 9:25 AM | Unregistered CommenterMac

emckeng - thanks for the correction - the 'muckle spate' on the Findhorn being 1829, not 1869. I have just had a look again at my source, the Institute of Hydrology / Tay River Purification Board report (A.R. Black and J.L. Anderson), 'The Great Tay Flood of January 1993', and noticed that the 1771 Tyne flow was estimated by Archer to be an astonishing 3900 cumecs at Hexham, from a catchment area of 1970km2. Black and Anderson refer to: Archer, N (1993) Discharge estimate for Britain's greatest flood: River Tyne - 17th November 1771. Proc. Fourth Nat. Hydrol. Symp., Cardiff, September 1993, pages 4.1-4.6.

The August 1971 Findhorn flood peaked at 2410cumecs at the Forres gauge (catchment 782km2), "a mere 17% of the catchment area of the Tay at Ballathie", and the peak flow of the 1829 muckle spate was reckoned to have been higher than this. So there is plenty of hydrological evidence for historical extreme rainfall events of greater magnitude than anything we have seen in recent years.
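For scale, the peak flows quoted above can be compared per unit of catchment area (a quick sketch using only the figures given in this comment):

```python
# Specific discharge (peak flow per km2 of catchment) for the floods cited.
floods = {
    "Tyne at Hexham, 1771": (3900.0, 1970.0),     # (peak flow m3/s, catchment km2)
    "Findhorn at Forres, 1971": (2410.0, 782.0),
}
for name, (q_peak, area_km2) in floods.items():
    specific = q_peak / area_km2  # m3/s per km2
    print(f"{name}: {specific:.2f} cumecs/km2")
```

Per square kilometre of catchment, the Findhorn peak (about 3.08 cumecs/km2) was actually more intense than the Tyne peak (about 1.98 cumecs/km2) - consistent with the point that the much smaller catchment produced a remarkably large flood.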

Feb 18, 2011 at 11:26 AM | Unregistered Commenterlapogus


Good stuff. Thanks.

Feb 18, 2011 at 11:50 AM | Unregistered CommenterBBD

emckeng - I didn't pick up on the point you made but agree also. It has long been recognised that big floods on the Tay tend to come in a run, and there can then be many intervening decades without any significant events. For example the 1989, 1990 and 1993 events came as a big shock to the hydrologists in Perth who had largely based their flood return period calculations on the river gauge data - which only went back to the 1950s at Ballathie and the 1970s for most of the tributaries, not long at all. Older hydro-electric engineers who could remember the much wetter decades prior to the 1950s were not so surprised at all by the floods in the 1990s.

As you say the NAO link could well explain this pattern of dry and wet decades. I think there is also going to be a significant random element: a big rain system normally moves across the country from south-west to north-east, such that the rainfall is usually spread across 2 or 3 different catchments. I think one of the things that made the Cumbria event unusual was that it just sat over the county for a few days, while further north and east remained much drier. I suspect that the Nature paper authors have underplayed this effectively random element of the weather in their models.

Feb 18, 2011 at 12:13 PM | Unregistered Commenterlapogus

Here is a spatial analysis of trends in the UK climate since 1914 using gridded datasets, published by the Met Office in 2006.

Take a look at Map 8: Gridded trends for days of heavy rain ≥ 10 mm, showing the change (in days) from 1961 to 2004 for autumn (page 27).

There is no trend for periods of heavy rainfall during autumn across the UK.

It does seem very difficult to reconcile a complete lack of any trend in heavy autumn rainfall, as recorded and published by the Met Office in 2006 (real data), with the conclusion reached by the authors of the Nature paper that greenhouse gas emissions can now be blamed for increasing the odds of actual UK floods occurring during the autumn of 2000 (based on a model).

How does no recorded trend in a dataset translate into a higher probability of an event happening within the same dataset?
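For anyone wondering what "no trend" means formally in maps like the one cited: it is usually a slope estimate whose p-value fails a significance test. A minimal sketch (the yearly counts here are synthetic, invented purely for illustration):

```python
import numpy as np
from scipy.stats import linregress

# Synthetic (assumed) series of autumn heavy-rain days, 1961-2004,
# generated with no underlying trend -- mimicking the Map 8 time window.
rng = np.random.default_rng(0)
years = np.arange(1961, 2005)
heavy_days = rng.poisson(lam=12, size=years.size)

# Ordinary least-squares trend and its significance.
result = linregress(years, heavy_days)
print(f"slope = {result.slope:+.3f} days/yr, p-value = {result.pvalue:.2f}")
# A p-value above 0.05 is conventionally reported as "no trend".
```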

Feb 18, 2011 at 2:02 PM | Unregistered CommenterMac

I know we were talking about local UK and latterly Scottish precipitation and flooding, but this had me wondering about the global trend. There may well be more out there, but I found this paper very interesting:

Assessment of Global Precipitation (A Project of the Global Energy and Water Cycle Experiment (GEWEX) Radiation Panel GEWEX, World Climate Research Program, WMO) Gruber & Levizzani (2006)

From the Executive Summary:

Chapter 3 provides an excellent review of the global mean precipitation and its spatial and temporal distribution. The analysis is based on the 25 year period 1979-2004, which exhibited a global mean of 2.61 mm day-1. With regard to this value the authors of Chapter 3 provide an estimate of the uncertainty of ± .03 mm day-1. They point out that at this level of uncertainty there is no significant mean annual cycle in global precipitation. This is consistent with global energy arguments that to a first approximation global precipitation should be more or less constant over the 25 year period. The mean annual cycle over the oceans and land are examined separately with the land areas showing the largest annual variation. Analysis of the spatial and temporal distribution of precipitation demonstrated that this data set is very capable in capturing the ENSO, the major inter-annual variation in precipitation, most evident in the tropics but also influencing mid-latitude regions. However, no relationship was found between global precipitation anomalies and ENSO.

The situation is not as clear with regard to longer period variations, especially since as noted in Chapter 3 that this data set was not designed for trend analysis. Also, as noted in Chapter 3 the analysis indicated that there was no discernable trend in global averaged precipitation. However, this does not preclude the existence of regional trends. Analyses were presented that indicate small areas of linear trend over land and the Indian and central to eastern Pacific Oceans. Note that, however, these data seem to suggest that the rainfall shifts between the 1982/83 and 1997/98 ENSO. A similar result was obtained using an EOF analysis which isolated the ENSO regime (modes 1 and 2) from the lower frequency variations (mode 3). Also, a recent analysis suggested that there were positive trends in the frequency of upper and lower amounts of precipitation but compensated by a negative trend in the frequency of intermediate amounts. Nevertheless, these trend calculations are very sensitive to the length of record and it was felt that with an increase in the GPCP record length questions concerning longer period variability and trends can be answered with greater confidence.

See section 3.4 (take a look at figs 3.9 and 3.10)

This global precipitation data is presented with the proper caveats, but shows no global trend in precip that I can see.

But look in AR4 WG1 and you find this: Global Precipitation Changes

The increased atmospheric moisture content associated with warming might be expected to lead to increased global mean precipitation (Section Global annual land mean precipitation showed a small, but uncertain, upward trend over the 20th century of approximately 1.1 mm per decade (Section and Table 3.4). However, the record is characterised by large inter-decadal variability, and global annual land mean precipitation shows a non-significant decrease since 1950 (Figure 9.18; see also Table 3.4). Detection of external influence on precipitation

Mitchell et al. (1987) argue that global mean precipitation changes should be controlled primarily by the energy budget of the troposphere where the latent heat of condensation is balanced by radiative cooling. Warming the troposphere enhances the cooling rate, thereby increasing precipitation, but this may be partly offset by a decrease in the efficiency of radiative cooling due to an increase in atmospheric CO2 (Allen and Ingram, 2002; Yang et al., 2003; Lambert et al., 2004; Sugi and Yoshimura, 2004). This suggests that global mean precipitation should respond more to changes in shortwave forcing than CO2 forcing, since shortwave forcings, such as volcanic aerosol, alter the temperature of the troposphere without affecting the efficiency of radiative cooling. This is consistent with a simulated decrease in precipitation following large volcanic eruptions (Robock and Liu, 1994; Broccoli et al., 2003), and may explain why anthropogenic influence has not been detected in measurements of global land mean precipitation (Ziegler et al., 2003; Gillett et al., 2004b), although Lambert et al. (2004) urge caution in applying the energy budget argument to land-only data. Greenhouse-gas induced increases in global precipitation may have also been offset by decreases due to anthropogenic aerosols (Ramanathan et al., 2001).

Nothing very concrete to go on, is there?

Feb 18, 2011 at 5:53 PM | Unregistered CommenterBBD

Remember "Fate of the World" "praised by gaming experts and climate campaigners as a way of reaching new audiences in the fight against carbon emissions." The world is in danger due to our use of fossil fuels since the industrial revolution and only drastic cuts will save the planet. The final solution? Too many people, so introduce a deadly virus to bring the numbers down.

"Fate of the World is a dramatic global strategy game that puts all our futures in your hands. The game features a dramatic set of scenarios based on the latest science covering the next 200 years. You must manage a balancing act of protecting the Earth's resources and climate versus the needs of an ever-growing world population, who are demanding ever more food, power, and living space. Will you help the whole planet or will you be an agent of destruction?"

"The latest science" was from Dr Myles Allen of Oxford University.

Allen is a Lead Author on the next IPCC assessment AR5, WG1, Chapter 10: Detection and Attribution of Climate Change: from Global to Regional. His co-author for this Nature paper, Peter Stott of the Met Office, is the Co-ordinating Lead Author for the same IPCC chapter and was his co-author for the 2003 Heat wave claims. Stott is Head of “Climate Monitoring and Attribution” at the Met Office. Now if your job is to attribute any changes in weather to human induced global warming, isn’t that exactly what you are expected to come up with?

In 2007, Allen said in an interview that "The Green movement has hijacked the issue of climate change. It is ludicrous to suggest the only way to deal with the problem is to start micro managing everyone, which is what environmentalists seem to want to do."

However he was actually receiving funding at the time from WWF, for the "Seasonal Attribution Project”, a Climateprediction sub-project. This was in order to try to determine the extent to which extreme weather events are attributable to human-induced global warming.

He told the BBC World Service's Discovery programme in 2003, "The vast numbers affected by the effects of climate change, such as flooding, drought and forest fires, mean that potentially people, organisations and even countries could be seeking compensation for the damage caused. "It's not a question we could stand up and survive in a court of law at the moment, but it's the sort of question we should be working towards scientifically"

"Some of it might be down to things you'd have trouble suing - like the Sun - so you obviously need to work how particularly human influence has contributed to the overall change in risk," he said." "But once you've done that, then we as scientists can essentially hand the problem over to the lawyers, for them to assess whether the change in risk is enough for the courts to decide that a settlement could be made."

“This next decade is going to see quite a lot of climate change cases around the world”, said environment lawyer Peter Roderick, who runs the Climate Justice Program for Friends Of The Earth International.

The other main players in this paper are Risk Management Solutions and Piers Corbyn points out their previous history on flood insurance issues, relating to New Orleans.

“This is the same Risk Management Solutions which was caught Green-handed inserting a misleading graph into the UN’s Climate Committee (the IPCC)”

Risk Management Solutions President, Hemant Shah, has been listed as one of the re-insurance industry's Top 40 Most Influential (Global Reinsurance, 2008). He is a Director of the RAND Center for the Study of Terrorism and Political Violence, a Director on the Board of RAND's Institute for Civil Justice, and a Director of the Singapore-based Institute for Defense and Strategic Studies. Shah is a member of the Aspen Institute's Henry Crown Fellowship Program.

The RMS chief finance officer, Stephen Robertson, previously worked for Deutsche Bank and Lehmann Brothers.

Robert Muir-Wood is CRO of Science and Technology Research at RMS and was a lead author on Insurance, Finance, and Climate Change for the 2007 IPCC Fourth Assessment Report.

What we have here then, is a stealth marketing exercise for the re-insurance industry, dressed up in the guise of a scientific paper and written by IPCC scientists and fellow travellers. And they wonder why people won’t believe them.

Feb 18, 2011 at 6:06 PM | Unregistered CommenterDennisA


You've probably already seen this, but if not, read and weep:

Feb 18, 2011 at 6:27 PM | Unregistered CommenterBBD

Just to fling another statistic into the mix. Different River Tyne ...

The flooding of the River Tyne has always proven a problem in Haddington. Records show a devastating flood in the 14th century. A plaque on the corner of the High Street and Sidegate shows the height of the flood in 1775, but worse than that was 1948, when the river rose more than 3m (10 feet) above its banks and flooded 450 homes.
For anyone not familiar, Haddington is the county town of East Lothian.

Feb 19, 2011 at 2:00 PM | Unregistered CommenterSam the Skeptic

It would appear over at WUWT that Willis Eschenbach has driven a coach and horses through the Nature publication, "Human contribution to more-intense precipitation extremes" by Seung-Ki Min, Xuebin Zhang, Francis W. Zwiers & Gabriele C. Hegerl.

Willis summarises as follows;


1. They have neglected the uncertainties from:

• the bad individual records in the original data

• the homogenization of the original data

• the averaging into gridcells

• the incorrect assumption of increasing correlation with decreasing distance

• the use of a different 3-parameter fitted probability function for each gridcell

• the use of a PI average on top of a weighted raw data average

• the use of non-Gaussian data for an “optimal fingerprint” analysis

• the conversion of the model results to the HADEX grid

• the selection of the models

As a result, we do not know if their findings are significant or not … but given the number of sources of uncertainty and the fact that their results were marginal to begin with, I would say no way. In any case, until those questions are addressed, the paper should not have been published, and the results cannot be relied upon.

2. There are a number of major issues with the paper:

• Someone needs to do some serious quality control on the data.

• The use of the HADEX RX1day dataset should be suspended until the data is fixed.

• The HADEX RX1day dataset also should not be used until gridcell averages can be properly recalculated without distance-weighting.

• The use of a subset of models which are selected without any ex-ante criteria damages the credibility of the analysis

• If a probability-based index is going to be used, it should be used on the raw data rather than on averaged data. Using it on grid-cell averages of raw data introduces spurious uncertainties.

• If a probability-based index is going to be used, it needs to be applied uniformly across all gridcells rather than using different distributions on a gridcell-by-gridcell basis.

• No analysis is given to justify the use of “optimal fingerprinting” with non-Gaussian data.

3. Out of the 731 US stations with rainfall data, including Alaska, Hawaii and Puerto Rico, 91% showed no significant change in the extreme rainfall events, either up or down.

4. Of the 340 mainland US stations with 40 years or more of records, 92% showed no significant change in extreme rainfall in either direction.

As a result, I maintain that their results are contrary to the station records, that they have used inappropriate methods, and that they have greatly underestimated the total uncertainties of their results. Thus the conclusions of their paper are not supported by their arguments and methods, and are contradicted by the lack of any visible trend in the overwhelming majority of the station datasets. To date, they have not established their case.
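The "3-parameter fitted probability function for each gridcell" being criticised refers to fitting an extreme-value distribution per cell. Here is a hypothetical sketch of that step on synthetic annual-maximum rainfall (not the paper's actual code or data):

```python
import numpy as np
from scipy.stats import genextreme

# Synthetic (assumed) annual-maximum 1-day rainfall (RX1day) for one gridcell:
# 50 years drawn from a Generalised Extreme Value distribution.
rng = np.random.default_rng(42)
rx1day = genextreme.rvs(c=-0.1, loc=30.0, scale=8.0, size=50, random_state=rng)

# Fit the three GEV parameters (shape, location, scale) to the sample,
# then read off a return level -- one such fit is done per gridcell.
shape, loc, scale = genextreme.fit(rx1day)
level_10yr = genextreme.ppf(0.9, shape, loc=loc, scale=scale)  # 10-yr return level
print(f"shape={shape:.2f} loc={loc:.1f} scale={scale:.1f}, 10-yr level={level_10yr:.1f} mm")
```

Each gridcell gets its own three parameter estimates from only a few dozen annual maxima, so the fitted return levels carry substantial sampling uncertainty - which is exactly the kind of uncounted uncertainty the list above complains about.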

As I said about the Myles paper, "How does no recorded trend in a dataset translate into a higher probability of an event happening within the same dataset?"

It raises an even bigger question - how do such badly flawed papers of climate science find their way on a regular basis into Nature publications?

Confidence in science journals is ebbing away fast.

Feb 21, 2011 at 9:48 AM | Unregistered CommenterMac
