Wednesday
Jan 22, 2014
by Bishop Hill
Still a standstill
Jan 22, 2014 Climate: Surface GWPF
David Whitehouse, writing at the GWPF, notes the release of 2013 surface temperature data from NOAA and NASA. Depending on your predilections this can be headlined as "Fourth hottest ever!!!" or, as I have done, "Still a standstill".
When asked by reporters for an explanation of the ‘pause’, Dr Gavin Schmidt of NASA and Dr Thomas Karl of NOAA spoke of contributions from volcanoes, pollution, a quiet Sun and natural variability. In other words, they don’t know...
Given that the IPCC estimates that the average decadal increase in global surface temperature is 0.2 deg C, the world is now 0.3 deg C cooler than it should have been.
Reader Comments (93)
Micky H Corbett.
On the "Exploring the fascist borderline" thread I correct some erroneous statements by Radical Rodent using current NCDC data and Marcott et al.
The NCDC temperature for 2013 quoted confidence limits of +/- 0.09C. Marcott et al's data is +/- 0.2C.
Proxies do have lower resolution, but not as bad as you claim.
Note that Marcott et al had 61 citations last time I looked and is becoming the standard reference for Holocene temperatures. Those using the pre-20th century data find it a good match to reality, which is what counts.
The spin-sceptic propaganda sites naturally loathe it, mainly because even the lowest res data at the end of the sequence showed 20th century warming. Disappointing to find a hopefully intelligent man spouting their stale propaganda.
palantir
6.8 × 10^21 Joules/year is the slope of the graph I cited, taken from 1995 onwards. I got it off the graph, as you can.
The other figures are in the literature. Some you can also calculate yourself if you've got A Level Physics.
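A rough sketch of the sort of back-of-envelope calculation being invited here (my own working, not EM's; the slope is the figure quoted above, and the surface areas are standard round values): converting an ocean heat content slope in Joules per year into an implied flux in W/m².

```python
# Back-of-envelope check: convert an ocean heat content slope in J/year
# into an implied flux in W/m^2. Values are illustrative.

SECONDS_PER_YEAR = 3.156e7   # approx.
EARTH_AREA_M2 = 5.1e14       # total surface area of the Earth
OCEAN_AREA_M2 = 3.6e14       # ocean surface area

slope_joules_per_year = 6.8e21   # the slope quoted upthread (read off a graph)

flux_global = slope_joules_per_year / (EARTH_AREA_M2 * SECONDS_PER_YEAR)
flux_ocean = slope_joules_per_year / (OCEAN_AREA_M2 * SECONDS_PER_YEAR)

print(f"Implied flux over whole Earth surface: {flux_global:.2f} W/m^2")   # ~0.4
print(f"Implied flux over ocean surface only:  {flux_ocean:.2f} W/m^2")    # ~0.6
```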
@ Entropic Man 2.15 - you didn't answer my first question: where does the data come from? The Argo sensors only started working about 10 years ago, so where do the early numbers come from?
EM, in response to your answer to my point on “Exploring the fascist borderline”: have a look here at an interesting graph. Notice something about the MWP, RWP & Minoan WP? (Note the data source.) If you want, you can always go here, where you can get the data to make your own pretty picture. To avoid the cry of, “Cherry picking!” go to the other end of the world. (Go here for explanations.)
Now, answer palantir’s question as to where the data came from – how on Earth can anyone claim that deep sea temperatures are rising when they have NO IDEA what they were less than ten years ago, and only a little more idea, now!
Palantir
The first serious deep ocean temperature measurements were taken by HMS Challenger in 1872. Spot measurements from a variety of vessels and towed temperature recorders were the standard for most of the 20th century. Below 2000m that is still the standard technique.
After the 1982/83 El Nino did considerable damage NOAA set out to improve its data using the Tropical Atmosphere Ocean Array. This monitored temperatures to 500m and is still working, just.
http://celebrating200years.noaa.gov/datasets/tropical/welcome.html
21st century data comes from the ARGO floats.
The graphs were drawn privately using the NOAA datasets.
Radical Rodent
Fascinating the way all the sceptics quote the Lappi graph based on one Greenland ice core and the Vostok data based on one core, while rejecting Marcott et al which includes both in its 70 proxy dataset.
Can you supply a conversion table turning the low values on the y axis of the Lappi graph into global temperatures comparable with other sources? The scale shows temperatures around -31C, which seems a bit low. I hope you are not seriously suggesting that global temperatures were 45C cooler 2000 years ago :-)
Lappi kindly included a regression line, but no confidence limits. There is no way to tell whether the spikes you refer to are real, or just noisy data.
Could you suggest a link to a temperature graph for the Vostok core? I have no inclination to spend hours wading through readme files.
notice how EM refuses to say anything about the validity of those oceanic temperature measurements. The poor deluded soul cannot bear to admit that the error bounds are such as to make any assertions useless. Priceless. Bigotry in action.
Diogenes
For discussion of the validity of the ARGO data, try here.
ceres.larc.nasa.gov/documents/STM/2013-10/14_Global_averages.pdf
Please note that the manufacturer's specification describes the temperature sensors as accurate to +/- 0.002C.
Entropic Mann,
And if you had one in your front room with its claim of being able to measure to ±0.002°C, would you be confident that it told you the mean temperature of your room to within ±0.002°C? Thought not. Good luck with 1.347 billion cubic kilometres of active ocean currents then.
Hubris or just plain stupidity?
Oh God/Gaia, EM is off again. A few of us went through all this on pages 3, 4 and 5 of the recent /falsifiability-in-my-lifetime thread. EM actually pops up on page 2 but you probably won't want to go back that far.
Before this it was the last 2 pages of the /inside-mathematics thread, where EM amazingly stated that "Modern temperatures are approaching those of the Holocene optimum". EM thinks that the Lappi graph peaks for the Holocene Optima, Iron Age, Roman and MWP were all weather, which, in a 10,000-year Holocene context, is probably true (see Vostok, 425kya to present, NOAA; enjoy the Holocene while it lasts), but for some inexplicable reason EM believes that the 20th century warming peak is different, and that we are all going to fry and die before tea-time because of a little extra CO2. So I now have a mental image of EM - and think he is most probably the one third from the right?
Note - I don't know who annotated all the civilisations (and I would argue that the peak at 1500-1000BC should be labelled Bronze Age, not Iron Age, at least in NW Europe), but the scale on the Lappi graph I link to above has been adjusted from the actual -30C ice core temps to correlate with the global temperature anomaly. IIRC, when I appended the Hadcrut4 data (which I did by eye in an image editor from a WoodforTrees graph) the figures for 1920 were within 0.1C of each other.
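For anyone who wants to reproduce the rescaling described above, a toy sketch (my own, with made-up numbers): absolute ice core temperatures around -31C are turned into anomalies by subtracting the mean over a chosen reference window, after which a modern anomaly series such as Hadcrut4 can be overlaid on the same axis.

```python
# Toy sketch of rescaling absolute core temperatures to anomalies.
# The (year, temperature) pairs below are invented for illustration.

import statistics

core = [(-2000, -31.2), (-1000, -30.8), (0, -31.0), (1000, -31.4), (1920, -31.1)]

reference = [t for year, t in core if -2000 <= year <= 0]   # chosen reference window
baseline = statistics.mean(reference)

anomalies = [(year, round(t - baseline, 2)) for year, t in core]
print(anomalies)   # values now centred near zero, comparable to an anomaly series
```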
mikeh
Re: missing heat
First we had teleconnection. Now we have teleportation.
I've learned 2 things in the last week or so and I'm going to tip my hat to AlecM/mydogsgotnonose as his posts sparked my curiosity. Plus Willis' post on WUWT about CERES data.
1) The famous downwelling radiation of 333 W/m², which is equivalent to a BB surface radiating at a temperature of at least 3 degrees (if emissivity were 1). So we are being heated by a magical layer of air at around 2.5 km up? Hmm. What about the lapse rate? What about all the hotter layers below, which according to AGW are supposed to contribute? That's the first instance of teleportation and absurdity. I've seen papers on downwelling measurements using pyrgeometers and they all seem to be almost equivalent to ambient temperature. Maybe you need two: one as a control and the other as the main, and use the difference?
2) the outgoing LW radiation of the Earth at different latitudes as in here. The question asks why do the dips become bumps? What interests me more is that the emission spectrum from CO2 hardly changes moving from the equator to Antarctica. It looks as though CO2 emits at an equivalent BB temperature of -50 degrees.
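A quick Stefan-Boltzmann sanity check of the two figures above (my own sketch; emissivity assumed to be 1 throughout):

```python
# Quick Stefan-Boltzmann checks of the figures in points 1) and 2) above.
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def bb_temperature(flux_w_m2):
    """Blackbody temperature (K) that radiates the given flux."""
    return (flux_w_m2 / SIGMA) ** 0.25

def bb_flux(temp_k):
    """Flux (W/m^2) radiated by a blackbody at the given temperature."""
    return SIGMA * temp_k ** 4

# 1) 333 W/m^2 of downwelling longwave:
t = bb_temperature(333.0)
print(f"333 W/m^2 -> {t:.1f} K ({t - 273.15:.1f} C)")   # ~277 K, ~3.5 C

# 2) emission at an equivalent blackbody temperature of -50 C:
f = bb_flux(273.15 - 50.0)
print(f"-50 C     -> {f:.0f} W/m^2")                     # ~141 W/m^2
```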
EM
To answer your question: if Marcott and the rest can reproduce temperatures to ±10% of absolute temperature using tree data, and show that this matches an experimentally obtained relationship in controlled conditions, only then can you crow.
I've put satellites in space and written the control for space engines. Go look up the GOCE satellite, the one that measured the Earth's gravity gradient recently. Look at the ion thruster control algorithm. I was the main person who wrote it and tested it. We used a lot of emulation and simulation, which meant deriving accurate proxies and, essentially, parameterisations. If any accuracy was above 10% we threw it out and tried to improve it, i.e. more tests, minimising variables.
Tree ring proxies might as well be a wet finger in the air. Other proxies don't fare much better, even δ18O. There are still lots of uncertainties. As a scientist, this fact would be front and centre. Like I said, good for theory but not actual evidence at all.
lapogus, Micky H Corbett
As I have said on other threads, save your keystrokes. Whatever you type is unlikely to affect EM's views.
EM is a naive man who has swallowed the CAGW story whole. He'll believe anything that seems to back it up, no matter how discredited the source, and if he can't find it, he'll make it up. From that point on, his own BS becomes his reality.
EM - I am not disputing the accuracy of the ARGO buoys - another instance of you proving unable to read. I will point out, however, that there are so few ARGO buoys for the volume of water they are measuring that the error bars on the global averages must be enormous, which in turn means that all the historical data must have error bars that are stupendous.
Martin A
Yes I'm starting to realise that.
Micky H Corbett
Respect for the GOCE work!
We could argue endlessly about the mechanics of the greenhouse effect.
You are right about the low temperature. Most of the energy in the OLR radiates from the top of the troposphere and from the stratosphere. If memory serves, the black body curve for the Earth is equivalent to 255K, the top of the troposphere. The CO2 emission comes mostly from the stratosphere, and would be colder.
It is not usually practical to grow entire trees for their lifetime in greenhouses to derive "an experimentally obtained relationship in controlled conditions." In practice you have to take naturally growing trees and match ring growth to recorded observations from nearby weather stations. You learn to recognise the effect of weather or climate changes on ring growth. Once that is achieved, one can do it the other way, deducing weather and climate from tree ring data.
diogenes, Radical Rodent
Your sample does not have to be complete, just big enough.
I haven't used them in years, but vaguely remember two techniques to determine how big your samples need to be.
Take a number of small samples from your data. Calculate their individual means and standard deviations.
Using a sample made up of the other sample means, calculate its SD.
If the individual SDs are larger than the combined SD, the samples are too small. Repeat with larger samples until the individual SDs and the SD for the combined means are equal.
Alternatively, take a series of samples of increasing size and calculate their SDs. As sample size increases the SD should decrease. Eventually you reach a sample size at which the SD stops decreasing.
In both techniques you are increasing sample size until you lose the variability due to small sample size and are left with the variability of the property or population being sampled. Once you reach that point, any further increase in sample size will make no further difference to the quality of your data.
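A minimal sketch of the second technique as I read it (the data are made up; the "population" is just a toy stand-in): draw repeated samples of increasing size and watch the spread of the sample means shrink.

```python
# Toy illustration: how the spread of sample means shrinks as sample size grows.
import random
import statistics

random.seed(1)
population = [random.gauss(15.0, 2.0) for _ in range(100_000)]  # toy "ocean" temps

def spread_of_sample_means(sample_size, n_repeats=200):
    """SD of the means of repeated random samples of the given size."""
    means = [statistics.mean(random.sample(population, sample_size))
             for _ in range(n_repeats)]
    return statistics.stdev(means)

for n in (10, 30, 100, 300, 1000, 3000):
    print(f"sample size {n:5d}: SD of sample means ~ {spread_of_sample_means(n):.3f}")

# The spread falls roughly as 1/sqrt(n); in practice you stop enlarging the
# sample once that spread is small compared with the signal you care about.
```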
Like I said.
EM
Because it's hard to produce a growth relationship, you shouldn't use tree rings as a proxy. That's the way it is. I think a few people have got ahead of themselves with all these reconstructions. We really don't know enough to any decent level of accuracy - and in fact we aren't going to know enough until said experiments are done.
That's why I don't put much value beyond theory in Mann and Marcott's work. It's a bit like string theory. It all sounds good but you can't test it to any level of accuracy so in practical terms it's next to useless.
It's a pity that the people who write policy don't seem to realise this obvious fact.
Micky H Corbett
There's more to proxies than just tree rings.
In the 1970s I did temperature profiles using pollen counts in peat cores. Carbon dating gave the age and the pollen in the peat gave the local ecology, the species present. Knowing the upper and lower temperature limits for each species constrained the environmental temperature.
To save me writing at great length, go here for a summary of the half dozen main proxies.
http://www.sciencemuseum.org.uk/climatechanging/climatescienceinfozone/exploringwhatmighthappen/2point1/2point1point3.aspx
http://judithcurry.com/2014/01/21/ocean-heat-content-uncertainties/#more-14367
Judith Curry reckons your fixation with the heat disappearing down the black hole of death is, to be polite EM, a pile of bollocks.
Ok those aren't her exact words but you get the drift.
Btw, data prior to 2000 = sparse...to be polite.
Anyway, let's look on the bright side. At least EM isn't telling us fracking is uneconomical! :)
Mailman
EM - please can you cite a reference for your work, I'd be interested to read it.
//
In the 1970s I did temperature profiles using pollen counts in peat cores. Carbon dating gave the age and the pollen in the peat gave the local ecology, the species present. Knowing the upper and lower temperature limits for each species constrained the environmental temperature.
//
Jan 24, 2014 at 6:00 PM | Unregistered CommenterEntropic man
What on earth was a school teacher doing pollen counts on peat cores and carbon dating in the 1970s???
If you don't mind me saying: you are full of sh*t.
EM disappoints as ever - anybody with any practical experience of applied botany knows how many factors other than temperature impact significantly on plant development. Mann tacitly admits this by using his bizarre 4th eigenvector in his corrupted principal components analysis - one that doesn't actually show any signal - which, of course fits the bill perfectly when your only objective is finding meaningless noise to average out into a straight line.
So, can those alleged pollen counts show us a proxy signal of a degree or so? Nope. Not a chance.
EM,
I agree there is a lot more to proxies than tree rings, but each one has significant problems of its own that limit its accuracy and precision, and sometimes even whether it is applicable at all. I can write from experience here, having worked for 25 years with stable isotopes and published more than 60 papers on aspects of isotope palaeoclimatology, including carbonate palaeothermometry and ice thermometry. Three of the five proxies you referred to at the Science Museum web site are isotope proxies. Let's make a few brief notes.
1) Only high latitude ice (Antarctica, Greenland) has a robust link between isotope composition and temperature. However, even here there are issues. For example the modern spatial gradient in ice isotope composition is strongly correlated with temperature. However, it is not so certain what the temporal gradients look like. i.e. it is not very easy to take a core and robustly convert the isotope composition to temperature. We haven't even started to talk about changes in synoptic weather and precipitation patterns and how these might affect the ice isotope composition.
2) Corals and algae: The oxygen isotope composition of all carbonates, marine and freshwater, is a function of two variables: i) the oxygen isotope composition of the water in which the organism lived, and ii) temperature. It is not possible a priori to know the oxygen isotope composition of the water at the time any carbonate layer was deposited. Thus it is not possible to determine the temperature with any degree of precision. One might infer the limits within which the water isotope composition might have changed and then infer a temperature. However, your assessment of precision must take this into account.
3) Sediments and fossils: All the comments associated with 2) apply. One cannot easily decouple the effect of water isotope composition variability from temperature. This is especially severe when dealing with records over geologic time, or when working with freshwater archives. I'll give you an example: much of our knowledge of Cretaceous temperatures is determined through oxygen isotope studies. However, we don't know the isotope composition of the Cretaceous ocean. We can make assumptions that it might be like the modern ocean, or perhaps that there was no continental ice and the ocean was about 1 per mille depleted in 18-O compared to the modern ocean. Well, between those two assumptions lies a 5 degree temperature range!
4) 60 years after the pioneering work of Urey and co-workers we still don't know the functional relationship between temperature and the oxygen isotope fractionation between water and carbonate. Just check some of the recent literature. If I took the whole spread of experimental and theoretical determinations then we are looking at a range of temperature of getting on for 8 degrees across the envelope of all the available calibrations, possibly greater.
5) Your own work on pollens and mutual climatic range is interesting but just how accurate and precise an estimate of climatic variables does it produce? Look at the number of variables: temperature, seasonal range of temperature, total precipitation, seasonality of precipitation, evapo-transpiration, insolation etc. and the magnitude of the difficulties one is faced with start to emerge.
Don't get me wrong. I think there has been some very good work done with palaeoclimate and Earth surface temperature estimates, but one has to have one's eyes completely open to the many issues that abound and be very aware of the limitations associated with each proxy. The precisions being quoted by many studies are just not realistic.
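To put a rough number on the water-composition problem raised in points 2) and 3): the sketch below uses one commonly quoted quadratic calcite-water palaeotemperature equation (my choice, purely for illustration; calibrations differ, as point 4) stresses) to show how an unknown 1 per mille shift in the water's d18O moves the inferred temperature by roughly 4C.

```python
# Illustration: the effect of an unknown water d18O on carbonate palaeotemperature.
# The calibration form used here is one of several in the literature.

def carbonate_temperature_c(d18o_calcite, d18o_water):
    """Temperature (C) from a quadratic calcite-water calibration,
    where delta = d18o_calcite - d18o_water (per mille)."""
    delta = d18o_calcite - d18o_water
    return 16.9 - 4.38 * delta + 0.10 * delta ** 2

d18o_calcite = 0.0   # a hypothetical measured shell value

for d18o_water, label in [(0.0, "modern-like ocean"),
                          (-1.0, "ice-free ocean (1 per mille lighter)")]:
    t = carbonate_temperature_c(d18o_calcite, d18o_water)
    print(f"{label:40s}: inferred T = {t:.1f} C")
# The gap of roughly 4 C is entirely an artefact of the assumed water composition.
```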
Not banned yet
I was doing legwork for Peter Moore and Francis Rose at the Kings College Plant Sciences Department in Herne Hill as a student, 1971-74.
This is the sort of thing that was going on at the time, though this is a much later edition:
http://books.google.co.uk/books?id=GP5HeCwkV2IC&pg=PR4&lpg=PR3&ots=KcgsI5ZmEF&focus=viewport&dq=peter+moore+kings+college&output=html_text
Regarding accuracy, you could get resolution of about +/- 1C, not bad 40 years ago. You could certainly distinguish between glacial and interglacial conditions. I also remember great excitement about the Elm Decline as evidence for Neolithic settlement in the UK as things warmed up.
http://onlinelibrary.wiley.com/doi/10.1111/j.1469-8137.1981.tb04101.x/abstract
Judith Curry still thinks you're all teko EM.
Mailman
Ps. Kiwis will know what I mean :)
EM
'Regarding accuracy, you could get resolution about +/- 1C, not bad 40 years ago.'
Herein lies the problem. First, the language is inexact: do you mean accuracy, precision or resolution? Secondly, the +/- 1C means that you are trying to estimate late Holocene climate change with a proxy whose resolution/precision (or whatever) is of the same order as the magnitude of the variability you are trying to measure. Resolving glacial from interglacial conditions is one thing (Delta T = 5 degrees C); resolving Holocene temperature variations is very much more difficult.
I was going to add a reply to EM but Paul Dennis beat me to it. I'm just going to add that I don't believe you can even get 1 degree resolution/accuracy for a biological or chemical proxy. It's hard enough getting that accuracy for measured temperature in the field.
I'm often astounded by how little people in the scientific field know or have experienced with actual metrology. Getting accurate and precise measurements can take months if not years and even then...
Paul Dennis
Dr Dennis, I certainly cannot match your expertise!
Like yourself I've been following the science of climate change over a lifetime. I've watched the early discussions of glaciation and the Camp Century cores firm up into a more rigorous description of Holocene climate and glacial cycles. I remember Hansen's 1 D model and the fuss and hype from the Greens that followed.
I'm well aware that all the paleoclimate data is uncertain, as all such data must be. Uncertainty is not the problem. It is the scale of that uncertainty on which we disagree. On Cretaceous temperatures, for example, there are clues to ice cover in the configuration of the continents. With open ocean over the North Pole and taiga and tundra vegetation almost to the South Pole, there is not room for much ice. The high sea level suggests the same. You are perhaps overly pessimistic.
What most impressed me over the years has been the way a wide variety of past and present data from a wide range of sciences has built up into a pattern. Individual proxies or other lines of evidence may be uncertain. The more vocal sceptics even take the nihilistic view that unless evidence is exact, it is useless. There are still devils in the details and puzzles like the mismatch between paleo-based and modern estimates of climate sensitivity. This is normal science. If we knew all the answers, where would be the fun? ;-)
Nevertheless, plate tectonics, vulcanism, orbital mechanics, modern atmospheric physics, insolation, the temperature/CO2 interaction and the odd impact together build up a paradigm which works to describe the geological, biological and climatic behaviour of the planet across a billion years. It also works for Mars and Venus, and even Titan.
The way it fits together is what I find compelling. No alternative paradigm of planetary function comes close.
Mailman
After reading some of her evidence to Congress, I am much less impressed with Dr Curry. It was a considerable disappointment to see her compromise her scientific integrity by misrepresenting AR5 data.
Micky H Corbett
As an engineer you have the luxury of precise measurement under controlled conditions. Out in the real world you make the most of what you can get. Talk to the scientists interpreting your satellite data.
I mentioned the tendency of spin-sceptics towards extremes, where the data is black or white, perfect or useless.
In practice, all is shades of grey. If you wait for perfect data you will wait forever. To paraphrase Isaac Newton's fourth law of inquiry, "Make the best use of the data you have."
Richard Whybray, your pomposity and arrogance know no bounds.
Hello again, bureaucrat.
Scared to tell me your name?
EM
You chose the wrong example:
The geoid from the GOCE satellite is the most accurate and highest resolution picture of the Earth's gravity field that has been produced - actually better than GRACE although the time scale variations are different.
Also, no space thruster has matched the thrust noise resolution of the T5 thruster and its system (the Ion Propulsion Assembly, IPA). The actual mission performance was ten times better than the qualification measurements, which were done on the actual thruster system mounted in a vacuum chamber. The thrust unit test was calibrated for real thrust using a balance developed and verified by the National Physical Laboratory, at the time the most accurate balance of its kind for measuring a device weighing a few kilos but producing sub-millinewton thrust.
Other highlights: the thrust system and satellite were so accurate that GOCE became the only ever seismometer in space, actually measuring atmospheric drag fluctuations resulting from the Japanese earthquake.
The thrust control itself, operating at 100 Hz, is arguably the fastest controlled space engine today; it worked more or less continuously for over 3 years, constantly compensating for drag.
And at all stages we couldn't afford not to measure and characterise the device to the highest accuracy possible; as with all space missions, margins were tight and at times the requirements were absurd.
I haven't even mentioned the magnetic characterisation, erosion of grids, creating sputter yield curves, doing plasma measurements, beam profiles, getting mass flow rates accurate to almost the nanogram. Crazy detail and all to get a mission into space.
So yes, sometimes the real world isn't as accurate. For GOCE it was at least 10 times more accurate, so much so that all the hassle of developing a detailed algorithm for the drag compensation seems a bit like wasted effort.
Micky H Corbett
When the scientific analysis of the GOCE data is published it will include discussion of the uncertainties, and confidence limits. These will be smaller than for previous measurements, but still present. This is true of all improving measurement technologies, from weather stations to deep ocean thermometers.
Your natural pride in the technology of the satellite would seem to be blinding you to its limitations. You have been able to produce extremely precise technology to control GOCE, but its data is subject to the same uncertainties as any other science.
EM, can we return to the Cretaceous? It's a very interesting period in Earth history. Let's just take the Cenomanian. It's widely stated that it was ice free, yet there are many geologists and geochemists who would tell you they are not so certain. Carbon isotope excursions and sequence stratigraphy that correlate across and between different geographic regions suggest these are global events driven by eustatic sea-level changes, with a periodicity consistent with obliquity changes in Earth's orbit. It's hard to understand such changes in tectonic terms, and many would suggest global continental ice volume is changing through these periods. If this is the case then ocean isotope composition will be changing, and temperature estimates based on oxygen isotope thermometry will need to be revised.
I'm not saying the overall picture will change, such as a global cooling during the mid- to late Cretaceous, but the details and magnitude are not known with precision. For example, even if we knew the bulk isotope composition of the ocean, we don't know how ocean isotope composition varies spatially throughout the Cretaceous. The modern ocean surface and near-surface waters show a more than 1 per mille variation in d18O. This translates into a more than 4 degree range in estimated temperature if we were to assume it is constant. So how can we truly begin to estimate equator-pole temperature gradients with any certainty, or have any understanding of the water cycle? Interestingly, on Monday, with a colleague, I submitted a grant application to begin to address some of these questions.
Similarly with the Hirnantian glaciation at the end of the Ordovician: ice volumes greater than those of the modern glacial periods, with a CO2 level more than 15 times higher than the present day. It's hard to fit such observations into a paradigm which says that CO2 level is a very fine thermostat.
Anyway we've moved very far from the topic of this thread so the best I can do EM, and any others, is to say if you ever find yourself in Norwich, or about to visit Norwich then feel free to contact me and visit my lab. I'll be more than happy to show you round, talk about isotope palaeoclimatology etc.
EM,
I'm sure a real scientist, such as Curry, isn't going to be losing any sleep over whether you think she's a bit smelly. You are in good company with that other esteemed critic, the geezer who calls for climate scientists to rise above name calling BUT isn't averse to being petty himself. Yes, the one and only Micky Mann!
Wait a second! EM is Michael Mann! I claim my £5!!!
Mailman
EM
Of course there will be uncertainties, but that isn't the point of what I'm saying; in terms of measurement those errors will be much smaller than before. The gravity model will still have errors, but these will be lower because of better data.
And more importantly this improvement is larger than expected.
But the way you get better data with lower errors and uncertainties is that everything contributing to it has lower errors still, and that everything is characterised.
With the pollen work you quoted, my immediate question would be: what temperature measurements are being used in the relationship, and how were they defined? If you are just using weather measurements from thermometers in the field then you are going to have large errors.
And looking at pollen proxies, it seems they are better at telling what type of plants were growing than what the temperature was. Certainly not to sub degree accuracy.
I think you are blinded by ignorance of what it would take to actually achieve a proxy relationship to the accuracy quoted in these papers, which appears to be very similar to engineering quality. But of course they don't do that - they use statistics and assumptions. Imagine if we'd used that for everything in GOCE?
Paul Dennis
We're getting into territory way above my skill set (retired Science teacher), so it is probably time to stop.
I'm not sure you can call CO2 a very fine thermostat, if you mean fine in the sense of very accurate or sensitive. I certainly do not regard it as the ultimate cause of all climate change.
It does seem to act as an amplifier for other variations. You see this in glacial/interglacial transitions; also in the transition out of snowball earth conditions, such as the vulcanism/CO2/temperature causation chain which may have warmed the Cambrian. Regarding the curious case of the modern period, I see nothing which fits the data better than cAGW.
Thank you for the invitation. I visit my parents near Cambridge a couple of times each year and hope to be in touch.
Good luck with the Ordovician research. I suspect you will be spending a lot of time on orbital mechanics and the effect of continental configuration and altitude on ice sheet formation, with a side order of solar variation and interstellar dust density. CO2 concentration and temperature might well end up as effects at the end of a long chain of other forcings.
Micky H Corbett
The annual average temperature range limits for many plants are well defined. The plants themselves are surprisingly sensitive to temperature differences, which is why the treeline on a mountainside or above a fjord is often so sharply defined.
The logic for a temperature determination might go like this. The lower temperature tolerance of species A is 6C. The upper temperature tolerance of Species B is 8C. If pollen from both species is present in the peat from a bog, the area around it experiences an annual average survivable for both species. This must be warmer than 6C and no higher than 8C.
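A minimal sketch of that mutual-climatic-range logic (species names and tolerance figures invented for illustration): the temperature window consistent with a pollen assemblage is the overlap of each species' tolerance range.

```python
# Mutual-climatic-range toy example. Tolerance figures are invented.
# (min annual average C, max annual average C) tolerated by each species:
tolerances = {
    "species_A": (6.0, 14.0),
    "species_B": (2.0, 8.0),
    "species_C": (5.0, 12.0),
}

def mutual_climatic_range(species_present):
    """Overlap of the tolerance ranges of all species found in the sample."""
    lows = [tolerances[s][0] for s in species_present]
    highs = [tolerances[s][1] for s in species_present]
    low, high = max(lows), min(highs)
    if low > high:
        raise ValueError("no temperature satisfies all species at once")
    return low, high

low, high = mutual_climatic_range(["species_A", "species_B", "species_C"])
print(f"annual average must lie between {low} C and {high} C")   # 6.0 to 8.0
```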
Alas, scientific accuracy and engineering accuracy are very different. Field science is often done under conditions which make the finesse of your controlled workshop impossible. Imagine trying to build a thruster to the standards used for GOCE on the deck of a ship in a blizzard.
EM
Without going into detail:
How do you know the relationship for the plant stays constant with varying nutrient and water availability, and over the course of hundreds of years? If it doesn't, you may have to add another degree, maybe more, to each side of your temperature range. Paul Dennis's posts show this - there are a lot of assumptions in the time series, and these increase the uncertainty by their nature.
Of course, if the plants are extremely sensitive to temperature and you know this, then some characterisation has already taken place, which is precisely my point. Then you can use your 6 to 8 degree range. Getting to a 6 to 8 degree range is a really good thing; it's impressive. It's still not sub-degree accuracy, though, which is what the proxy studies claim.
With regard to a blizzard, I can see what you are saying, but again, hardware is specified for such conditions. I also work in helicopter flight control. Space is often a lot worse than that. Launch alone is pretty horrific when you see the vibration levels. Yet in the lab you often try to exceed nature: you try to break, burn and blow up your hardware so that you can then get it qualified for the real world, where the probability of extreme things is predicted to be lower. In the end it's all about probability, risk and mean time before failure.
But look, it seems that we are actually converging on the same point: that better knowledge of things means more accuracy. My quibble, and the quibble of other posters here, is that the stated accuracy of proxies, and their subsequent use to claim sub-degree accuracy and yearly resolution, is often not justified - and not justified on conceptual grounds, as there are built-in assumptions.
Apart from that I'm happy to agree to disagree. My experiences have taught me that characterisation of relationships in a controlled way is the only way to improve extrapolations into the real world where there are more variables and influences. In fact systems are often only qualified if you know this.