Thursday, Nov 26, 2009

Smoking gun?

On the code thread, James Smith has just posted this comment:

From the file pl_decline.pro: check what the code is doing! It's reducing the temperatures in the 1930s, and introducing a parabolic trend into the data to make the temperatures in the 1990s look more dramatic.

Could someone else do a double check on this file? Could be dynamite if correct.

 


Reader Comments (62)

There are 2 files of that name:
one in osborn-tree6\summer_modes
and
one in ....tree6\mann\oldprog

Haven't got a Diff prog handy, so it's not easy to see if they are the same.

Both contain interesting comments -

; Now fit a 2nd degree polynomial to the decline series, and then extend
; it at a constant level back to 1400. In fact we compute its mean over
; 1856-1930 and use this as the constant level from 1400 to 1930. The
; polynomial is fitted over 1930-1994, forced to have the constant value
; in 1930.
;


; Now apply a completely artificial adjustment for the decline
; (only where coefficient is positive!)


Can't take it any further right now. I'm very rusty on this code and am pushed for time.

Hope that helps

Nov 26, 2009 at 12:48 PM | Unregistered CommenterView from the Solent

They are the same program, but this is the newer one, with additions for colour and the ability to run in various modes (e.g. regression / variance), and it does a lot more filtering.

The newer program is here:

.\documents\osborn-tree6\summer_modes\pl_decline.pro

The older program is here:

.\documents\osborn-tree6\mann\oldprog\pl_decline.pro

Nov 26, 2009 at 1:17 PM | Unregistered CommenterRocky

It's over 30 years since I worked with plotfiles, but this is what seems to be happening here:

;
; Now fit a 2nd degree polynomial to the decline series, and then extend
; it at a constant level back to 1400. In fact we compute its mean over
; 1856-1930 and use this as the constant level from 1400 to 1930. The
; polynomial is fitted over 1930-1994, forced to have the constant value
; in 1930.
;
ycon=1930
declinets=fltarr(mxdnyr)*!values.f_nan
allx=timey(ktem)
ally=difflow
;
kconst=where(allx le ycon,nconst)
cval=total(ally(kconst))/float(nconst)
kkk=where(mxdyear le ycon)
declinets(kkk)=cval
;
kcurve=where((allx ge ycon) and (allx le 1994),ncurve)
allwt=replicate(1.,ncurve)
acoeff=[0.,1./7200.]
vval=curvefit(allx(kcurve),ally(kcurve),allwt,acoeff,$
function_name='funct_decline')
kkk=where(mxdyear ge ycon)
declinets(kkk)=vval(*)

They set a breakpoint at 1930 (ycon), compute the mean below that point, and assign it to the declinets array. Then from 1930 to 1994 (a constant in the code!) they do a curvefit to some curve, and assign the result to declinets. I don't know if said curve goes up or down, as I don't have the input data.
But yeah, there is something fishy going on there. The comment says it's a second-order polynomial, which indeed creates a sharply rising or descending curve. Why 1994 as an endpoint? For polynomial fitting over 'n' points you need at least n/2 points in the 'future', or you have to pad your data with (n/2)+1 datapoints pulled out of a hat. The date of the code is 2004, so it looks like they use a 19- or 21-point smoothing.
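
For anyone who wants the structure without the IDL, here is a rough Python sketch of what the snippet above appears to do - a reading of the code, not the original; the names, and the assumption that years and difflow are aligned 1-D numpy arrays, are mine:

import numpy as np

def build_decline_series(years, difflow, ycon=1930, yend=1994):
    # mean of the series up to 1930 (1856-1930 in the code comment) becomes the constant level
    decline = np.full(years.shape, np.nan)
    pre = years <= ycon
    post = (years >= ycon) & (years <= yend)
    cval = difflow[pre].mean()
    decline[pre] = cval
    # fit f(z) = cval + a0*z + a1*z^2 over 1930-1994, with z = year - 1930 and
    # the 1930 value forced to cval (cval is not a free parameter in the fit)
    z = years[post] - ycon
    A = np.column_stack([z, z * z])
    a0, a1 = np.linalg.lstsq(A, difflow[post] - cval, rcond=None)[0]
    decline[post] = cval + a0 * z + a1 * z * z   # 1930 gets assigned twice, as in the IDL
    return decline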

Pjotr

Nov 26, 2009 at 1:18 PM | Unregistered CommenterPjotrk

Oh, BTW, I forgot to add: feel free to shoot me down in flames if I'm way out of my league... It *has* been 30 years since I wrote this kind of code myself.

Pjotr

Nov 26, 2009 at 1:25 PM | Unregistered CommenterPjotrk

Just a few comments on my interpretation of this code.

First, this goes against what RC claimed when they said "hiding the decline" was about the Briffa dataset - aren't these data associated with Mann and Osborn? Or do they have papers relying on the Briffa dataset?

Secondly, what I think is being done is creating a constant level that is used to offset the 1400 to 1930 data, and thereafter the data are offset by a 2nd-degree polynomial to give a gradual, accelerating-style increase. The most interesting finding, however, is that the code has 3 modes, controlled by a variable called doadjust:

doadjust=1 ; 0=do not, 1=show some more results,
; 2=do go on and produce a new data set with adjustment for decline

Now, I am not certain if this is correct, but my take is that if you set 0, it just calculates and plots the data; if you set 1, it calculates, uses the 2nd-degree polynomial to offset, then plots; and if you set 2, it calculates, uses the 2nd-degree polynomial to offset, then plots and SAVES the data to a new file.

If this is correct, then this is a 1-2-3 tool for viewing some data, then adding some hockey-stickiness and finally saving the new data.

Nov 26, 2009 at 1:32 PM | Unregistered CommenterHPX

This is the bit of code. It appears to apply a decline of (cval = total(ally(kconst))/float(nconst)) to everything pre-1930, and then applies a "curve fit" to everything post-1930, whatever that means.

Bizarrely it applies stuff to 1930 twice (LE 1930 and GE 1930) :)

Not looked at what this does yet

ycon=1930
declinets=fltarr(mxdnyr)*!values.f_nan
allx=timey(ktem)
ally=difflow
;
kconst=where(allx le ycon,nconst)
cval=total(ally(kconst))/float(nconst)
kkk=where(mxdyear le ycon)
declinets(kkk)=cval
;
kcurve=where((allx ge ycon) and (allx le 1994),ncurve)
allwt=replicate(1.,ncurve)
acoeff=[0.,1./7200.]
vval=curvefit(allx(kcurve),ally(kcurve),allwt,acoeff,$
function_name='funct_decline')
kkk=where(mxdyear ge ycon)
declinets(kkk)=vval(*)

Nov 26, 2009 at 1:41 PM | Unregistered CommenterRocky

This just goes to show that the free exchange of information on the Internet doesn't help a totalitarian dictatorial government. Precisely this sort of stuff, with these CRU emails, is why Labour wants to stifle the Internet.

Nov 26, 2009 at 1:42 PM | Unregistered CommenterNeal Asher

I just love the repeated use of the year 1400. After all, we wouldn't want the Mediaeval Warm Period to influence the results, now would we?

Nov 26, 2009 at 1:47 PM | Unregistered Commenterdcw

This is exciting news.

Just a few notes of caution:

This leak is probably too big to be a hoax, but bear in mind that there may be items within it which are intended to mislead. I know that the CRU probably haven't the wit to create a "Hitler Diary", but remember to qualify your comments with that as a possibility.

Those in the media and politics whose reputations are at stake with this leak will not hesitate to use any indiscretion they can from the sceptic side. The routines for destroying critics are well established, and include repeated attacks / allegations about that person's private life (look at Berlusconi's media coverage), questions about their academic background / institution, and their associates / friends. Although politics should not influence science, many in positions of power in academia are seasoned Marxists and are well versed in the use of these tactics.

The enigma machine may have been captured, and you are well on your way to deciphering the codes for the CRU, but to mix my metaphors, there is still a herd of elephants in the room: the Met Office & NASA, to name two of the bigger buggers.

Good luck guys, but take care.

Keith

Nov 26, 2009 at 2:16 PM | Unregistered CommenterKeith

If I recall correctly from the files I was perusing, mxd (which I believe is max density (tree rings)) is one of the factors which they massage the raw temp data with.

If this is right, then this is how they smear modern temp data with the high-uncertainty tree ring data. The 2nd-order poly is a dead giveaway, since there will always be an increase since 1850 (the end of the Little Ice Age) and therefore there will always be a hockey stick in the output.

Yeah, this looks like another smoking gun. Would need to run data through it to see how it behaves.

Nov 26, 2009 at 3:09 PM | Unregistered CommenterAJStrta

We know that the IPCC used CRU data and calculations. What other models have relied on the data and output? In other words, assuming there's some corruption in the CRU product and processes, what else has been corrupted?

Great site!

Nov 26, 2009 at 3:20 PM | Unregistered CommenterSC Mike

@Rocky: re "Bizarrely it applies stuff to 1930 twice (LE 1930 and GE 1930)"

That's what it says in the comment: force the 1930 datapoint to the constant value and start from there with the polynomial. The first loop assigns the constant; the second uses this constant as a starting value and modifies the 1930 value again. That way you don't get sudden 'jumps' in the output.

Pjotr

Nov 26, 2009 at 3:29 PM | Unregistered CommenterPjotrk

There are two linked files:

briffa_sep98_decline1.pro
briffa_sep98_decline2.pro

The description of these files is given as:

''On a site-by-site basis, computes MXD timeseries from 1902-1976, and
; computes Apr-Sep temperature for same period, using surrounding boxes
; if necessary. Normalises them over as common a period as possible, then
; takes 5-yr means of each (fairly generous allowance for
; missing data), then takes the difference.
; Results are then saved for briffa_sep98_decline2.pro to perform rotated PCA
; on, to obtain the 'decline' signal!
;

Fairly simple trick, though I'm not competent to follow all of the code.

Nov 26, 2009 at 3:55 PM | Unregistered Commenterastateofdenmark

MAJOR Caveat: I know jack all about IDL !!!!
Caveat: I have no idea what this program is supposed to do and whether it is doing anything underhand.
Caveat: This is my best interpretation.

Looking at the curvefit call in the code, it calls a function named in its parameters, funct_decline_matchvar, for all data from 1930 to 1994.

The IDL help for curvefit says "FUNCTION_NAME: Use this keyword to specify the name of the function to fit."

The function that they are fitting the data to reads

pro funct_decline_matchvar,x,a,f,pder
;
cval=0.252306 ; need to update to ensure a smooth join
z=x-1930
f=cval+a(0)*z+a(1)*z*z
;
if n_params() ge 4 then begin
pder=[[z],[z*z]]
endif
;
end

Notice they hard code the offset to make it fit the previous element correctly (sigh).

cval=0.252306 ; need to update to ensure a smooth join

When called, a = acoeff = [0., 1./7200.] and x is the year.

Therefore the curve being fitted is a parabola running from 0.252 in 1930 to 0.821 in 1994.

Plot this data in Excel to see it (it includes the working-out for when I realise I have cocked up).

year cval z a0 a1 (1/7200) f
1930 0.252306 0 0 0.000138889 0.252306
1931 0.252306 1 0 0.000138889 0.252444889
1932 0.252306 2 0 0.000138889 0.252861556
1933 0.252306 3 0 0.000138889 0.253556
1934 0.252306 4 0 0.000138889 0.254528222
1935 0.252306 5 0 0.000138889 0.255778222
1936 0.252306 6 0 0.000138889 0.257306
1937 0.252306 7 0 0.000138889 0.259111556
1938 0.252306 8 0 0.000138889 0.261194889
1939 0.252306 9 0 0.000138889 0.263556
1940 0.252306 10 0 0.000138889 0.266194889
1941 0.252306 11 0 0.000138889 0.269111556
1942 0.252306 12 0 0.000138889 0.272306
1943 0.252306 13 0 0.000138889 0.275778222
1944 0.252306 14 0 0.000138889 0.279528222
1945 0.252306 15 0 0.000138889 0.283556
1946 0.252306 16 0 0.000138889 0.287861556
1947 0.252306 17 0 0.000138889 0.292444889
1948 0.252306 18 0 0.000138889 0.297306
1949 0.252306 19 0 0.000138889 0.302444889
1950 0.252306 20 0 0.000138889 0.307861556
1951 0.252306 21 0 0.000138889 0.313556
1952 0.252306 22 0 0.000138889 0.319528222
1953 0.252306 23 0 0.000138889 0.325778222
1954 0.252306 24 0 0.000138889 0.332306
1955 0.252306 25 0 0.000138889 0.339111556
1956 0.252306 26 0 0.000138889 0.346194889
1957 0.252306 27 0 0.000138889 0.353556
1958 0.252306 28 0 0.000138889 0.361194889
1959 0.252306 29 0 0.000138889 0.369111556
1960 0.252306 30 0 0.000138889 0.377306
1961 0.252306 31 0 0.000138889 0.385778222
1962 0.252306 32 0 0.000138889 0.394528222
1963 0.252306 33 0 0.000138889 0.403556
1964 0.252306 34 0 0.000138889 0.412861556
1965 0.252306 35 0 0.000138889 0.422444889
1966 0.252306 36 0 0.000138889 0.432306
1967 0.252306 37 0 0.000138889 0.442444889
1968 0.252306 38 0 0.000138889 0.452861556
1969 0.252306 39 0 0.000138889 0.463556
1970 0.252306 40 0 0.000138889 0.474528222
1971 0.252306 41 0 0.000138889 0.485778222
1972 0.252306 42 0 0.000138889 0.497306
1973 0.252306 43 0 0.000138889 0.509111556
1974 0.252306 44 0 0.000138889 0.521194889
1975 0.252306 45 0 0.000138889 0.533556
1976 0.252306 46 0 0.000138889 0.546194889
1977 0.252306 47 0 0.000138889 0.559111556
1978 0.252306 48 0 0.000138889 0.572306
1979 0.252306 49 0 0.000138889 0.585778222
1980 0.252306 50 0 0.000138889 0.599528222
1981 0.252306 51 0 0.000138889 0.613556
1982 0.252306 52 0 0.000138889 0.627861556
1983 0.252306 53 0 0.000138889 0.642444889
1984 0.252306 54 0 0.000138889 0.657306
1985 0.252306 55 0 0.000138889 0.672444889
1986 0.252306 56 0 0.000138889 0.687861556
1987 0.252306 57 0 0.000138889 0.703556
1988 0.252306 58 0 0.000138889 0.719528222
1989 0.252306 59 0 0.000138889 0.735778222
1990 0.252306 60 0 0.000138889 0.752306
1991 0.252306 61 0 0.000138889 0.769111556
1992 0.252306 62 0 0.000138889 0.786194889
1993 0.252306 63 0 0.000138889 0.803556
1994 0.252306 64 0 0.000138889 0.821194889
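
If you would rather not retype the table, here is a short Python check of the same numbers; it assumes, as the table does, that the fitted coefficients stay at their starting values acoeff = [0., 1./7200.]:

cval = 0.252306
a0, a1 = 0.0, 1.0 / 7200.0
for year in range(1930, 1995):
    z = year - 1930
    f = cval + a0 * z + a1 * z * z      # column f above
    print(year, round(f, 6))            # 1930 -> 0.252306 ... 1994 -> 0.821195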

Nov 26, 2009 at 4:00 PM | Unregistered CommenterRocky

Also, in the Yamal folder there is a word document titled:

yamaltree.doc

This is a short paper titled 'A Continuous Multi-Millennial Ring-Width Chronology in Yamal, Northwestern Siberia, Yamal tree-ring chronology', by Rashit M. Hantemirov and Stepan G. Shiyatov, Institute of Plant and Animal Ecology, Ekaterinburg 620144, Russia.

These two have supplied tree ring data to Keith Briffa (see sf2note.txt) covering, they say, 4000 years. They were also asking Briffa what the status of their paper was, well this extract might give a clue:

''From the second half of the first century BC to the end of that millennium, generally warm conditions prevailed. The most favourable conditions during the last two millennia apparently occurred between about AD 500 and 1400, though punctuated by cooler summers in 600-700 and at about 1000.''

The two Russians were obviously not told that the MWP did not happen, regardless of what their data are telling them.

Keep up the good work.

Nov 26, 2009 at 4:03 PM | Unregistered Commenterastateofdenmark

I forgot to say: column f is the curve ... plot it with the years as labels ...

If you cut and paste the data into Excel and use DATA / TEXT TO COLUMNS ... delimited by space, this will almost work (although it screws up one header - it pushes the f heading one column right).

Nov 26, 2009 at 4:12 PM | Unregistered CommenterRocky

The differences between variance-mode and regression-mode runs are:

* a different variable file is loaded and saved: 'calibmxd2.idlsave' or 'calibmxd2_regress.idlsave'
* regression starts at 1900, variance starts at 1930

Other than that (and a different offset hardcoded into 'funct_decline_regress.pro' and 'funct_decline_matchvar.pro' to make the graphs match) they are identical.

Nov 26, 2009 at 5:09 PM | Unregistered CommenterRocky

Maybe this helps?

http://strata-sphere.com/blog/index.php/archives/11518

Nov 26, 2009 at 5:09 PM | Unregistered CommenterDave

Also check out the link he gives in his update:

http://esr.ibiblio.org/?p=1447

Nov 26, 2009 at 5:14 PM | Unregistered CommenterDave

Hi,

Would it be possible for someone to write a synopsis of the posts in this thread for the lay person?
My maths isn't bad, but I am on the edge of understanding, and certainly would not be able to translate what little I can grasp into a coherent description that I could convey to a person with very little understanding of mathematics.
It would be very handy. Any takers?

Thanks

Nov 26, 2009 at 5:19 PM | Unregistered CommenterEric

I can try ... but I must admit I don't understand it all, or even where the program sits in the grand scheme of things, or whether there is a valid reason for doing what they did :)

The program we are looking at does something with the data it's reading:

; Use the calibrate MXD after calibration coefficients estimated for 6
; northern Siberian boxes that had insufficient temperature data.

but as a final step of processing it takes all the data up to 1930 and aligns it around the average temperature up to 1930.

Then it CURVE FITS the data from 1930 onwards to an upward curve of ever-increasing steepness.

So assuming every value was the same you would expect a plot that looks like this

____________________________________
                    |
                  1930

but from this program it looks like this

                            /
                           /
                          /
                         /
                        /
                       /
                      /
---------------------/
                     |
                   1930

Nov 26, 2009 at 5:41 PM | Unregistered CommenterRocky

C*ck, it strips spaces :)

Basically, if you plot a flat line through this program it will have an aggressive uptick at the end, starting in 1930.

Nov 26, 2009 at 5:42 PM | Unregistered CommenterRocky

I wish I could delete my own posts :)

In layman's terms, this program forces a hockey stick on any data fed in, where the handle / head split is at 1930.

The handle of the hockey stick is created by any data up to 1930.

The head of the hockey stick starts at 1930 and has the shape of the formula (years since 1930)^2 * (1 / 7200).

Roughly, it gradually increments the incoming data by about 0.57 of a "unit" over the 60-odd years from 1930 to 1994 (from about 0.25 in 1930 to about 0.82 in 1994, per the table above), leaving the previous years "unchanged".
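
A minimal sketch of that effect in Python - my own toy, not the CRU code, and it assumes the 1930-1994 curve is simply added to the input, measured relative to its 1930 value:

a1 = 1.0 / 7200.0
years = list(range(1900, 1995))
flat = [0.0 for _ in years]                        # a perfectly flat input series
adjusted = [t + (a1 * (y - 1930) ** 2 if y >= 1930 else 0.0)
            for y, t in zip(years, flat)]
print(adjusted[years.index(1930)], adjusted[-1])   # 0.0 in 1930, about 0.569 by 1994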

Nov 26, 2009 at 5:49 PM | Unregistered CommenterRocky

Wasn't there a CRU e-mail about wanting to minimize the 1940s 'blip'?

Nov 26, 2009 at 6:08 PM | Unregistered CommenterDave

http://www.eastangliaemails.com/emails.php?eid=1017&filename=1254147614.txt

Here's the e-mail, but it's dated Sept. 2009.

Nov 26, 2009 at 6:15 PM | Unregistered CommenterDave

If only you could post an image here ...I could post my excel plot.

Nov 26, 2009 at 6:16 PM | Unregistered CommenterRocky

I think the key issues in this are what these programs are supposed to do and whether the charts were ever published. It could be junk code.

The charts have labels such as

'>50N masked temperature (xxxxx wrt 1961-90)'
'>50N temperature (masked)'
'>50N MXD reconstruction (masked)'
'>50N MXD reconstruction (masked then scaled)'
'Temperature difference (!Uo!NC)'
'Mean of differences'
'Difference of means'
'Difference pattern'

Nov 26, 2009 at 6:22 PM | Unregistered CommenterRocky

Rocky

If you email it to me at bishophill(roundthing)tiscali.co.uk I'll post it up.

Nov 26, 2009 at 7:12 PM | Registered CommenterBishop Hill

OK graph's in the head post. Have we pinned down exactly where this adjustment is used?

Nov 26, 2009 at 7:37 PM | Registered CommenterBishop Hill

I posted this earlier on Lucia's site.

This is an email from the alleged CRU documents.
From: Tom Wigley
To: Phil Jones
Subject: 1940s
Date: Sun, 27 Sep 2009 23:25:38 -0600
Cc: Ben Santer
x-flowed
Phil,
Here are some speculations on correcting SSTs to partly
explain the 1940s warming blip.
If you look at the attached plot you will see that the
land also shows the 1940s blip (as I'm sure you know).
So, if we could reduce the ocean blip by, say, 0.15 degC,
then this would be significant for the global mean -- but
we'd still have to explain the land blip.
I've chosen 0.15 here deliberately. This still leaves an
ocean blip, and i think one needs to have some form of
ocean blip to explain the land blip (via either some common
forcing, or ocean forcing land, or vice versa, or all of
these). When you look at other blips, the land blips are
1.5 to 2 times (roughly) the ocean blips -- higher sensitivity
plus thermal inertia effects. My 0.15 adjustment leaves things
consistent with this, so you can see where I am coming from.

Note the repeated reference to reducing ocean temps by 0.15 deg.

Now, this code is from the purported CRU file, osborn-tree6/briffa_sep98_d.pro:
;
; Apply a VERY ARTIFICAL correction for decline!!
;
yrloc=[1400,findgen(19)*5.+1904]
valadj=[0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3,0.8,1.2,1.7,2.5,2.6,2.6,$
2.6,2.6,2.6]*0.75 ; fudge factor
if n_elements(yrloc) ne n_elements(valadj) then message,'Oooops!'
;
yearlyadj=interpol(valadj,yrloc,timey)

If I am reading this right, the values for 1924 to 1944 are:

-0.1, -0.25, -0.3, 0., -0.1

Which, when averaged, gives:

-0.15 !!!!

The value of 0.75 * "valadj" may be compensating for the land/SST relationship, as suggested in the Wigley email to Jones.

It also gives a nice hockey stick shape, as shown here.

http://esr.ibiblio.org/?p=1447
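
For anyone who wants to see the shape without IDL, here is a quick Python re-creation of that adjustment curve, using numpy's linear interpolation in place of IDL's interpol and assuming a 1400-1994 time axis for timey:

import numpy as np

yrloc = np.concatenate(([1400], 1904 + 5 * np.arange(19)))
valadj = np.array([0., 0., 0., 0., 0., -0.1, -0.25, -0.3, 0., -0.1,
                   0.3, 0.8, 1.2, 1.7, 2.5, 2.6, 2.6, 2.6, 2.6, 2.6]) * 0.75
timey = np.arange(1400, 1995)
yearlyadj = np.interp(timey, yrloc, valadj)
# flat at zero until the early 1900s, dips to about -0.22 in the mid-1930s,
# then climbs steeply to about +1.95 from about 1970 onwards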

Nov 26, 2009 at 7:39 PM | Unregistered CommenterLes Johnson

Rocky, was the data you fed into the program a horizontal line?

Nov 26, 2009 at 7:49 PM | Unregistered CommenterDave

What it seems to be up to is this: starting with the observation that tree ring density and temperature diverge in certain places in the high north, it tries to measure this divergence, map where it occurs, and then make an attempt to bend the data at just those points to match the alignment between the 'good' trees and the temperature.

It's not an unreasonable thing to do to see what it looks like. But it shouldn't be part of an actual temperature reconstruction, because it uses circular reasoning: it uses the data to estimate the errors to correct the data with. Or to put it another way, it uses tree rings when they're convenient, and thermometers when they're not. If one were to take the result and observe how well it fitted the thermometers, as a way of gaining confidence in the reconstruction, that would be invalid. So would assuming that the divergence behaves as suggested and that this is correcting anything, at least without evidence independent of the data here.

But we don't know yet that they've done that.

I don't think you can read anything into the hockey-stick shape of the bend applied. That's just the shape of the divergence. But selectively mixing in thermometer data and calling it tree-ring, that might well be another matter.

Nov 26, 2009 at 7:51 PM | Unregistered CommenterStevo

"Still, it is strange that one would want to put an adjustment like this through a temperature series."

Not if you are a Global Warming Science Believer and you need to prove your theory is correct even if your data do not.

Nov 26, 2009 at 7:57 PM | Unregistered CommenterFred

That's just reverse-engineered from the code. It's the curve they generate to fit the data to.

There are a lot of bits of code in that directory; in the summer_modes directory there is code referencing a poster.

So it could be code to generate images for a poster.

But it's a lot of code.

And why would they fudge the data? It's bad enough already, isn't it?

Nov 26, 2009 at 8:13 PM | Unregistered CommenterRocky

@Stevo:

I am not sure what this code was used for, but the directory seems to have been active (assuming the files are from the same place) in at least late 2004, as the PDF metadata says the files were created then.

The odd thing is that the curve being added has no link to anything other than a constant (1/7200) and the square of the years since 1930 - nothing to do with any other factors; it's just a nice curve.

It's a fudge factor at best. Maybe there is a mad-science reason, or perhaps it was a great fit to another curve they wanted to match which had valid credentials; either way it's a hardcoded fudge factor.

Nov 26, 2009 at 8:39 PM | Unregistered CommenterRocky

You know what, I've just been looking at the volume of code, and it's a hell of a lot just to produce some poster charts.

Excel, anyone?

Nov 26, 2009 at 9:38 PM | Unregistered CommenterRocky


http://www.americanthinker.com/2009/11/crus_source_code_climategate_r.html

For instance, in the subfolder "osborn-tree6\mann\oldprog," there’s a program (Calibrate_mxd.pro) that calibrates the MXD data against available local instrumental summer (growing season) temperatures between 1911-1990, then merges that data into a new file. That file is then digested and further modified by another program (Pl_calibmxd1.pro), which creates calibration statistics for the MXD against the stored temperature and "estimates" (infills) figures where such temperature readings were not available. The file created by that program is modified once again by Pl_Decline.pro, which "corrects it" – as described by the author -- by "identifying" and "artificially" removing "the decline."

But oddly enough, the series doesn’t begin its "decline adjustment" in 1960 -- the supposed year of the enigmatic "divergence." In fact, all data between 1930 and 1994 are subject to "correction."

And such games are by no means unique to the folder attributed to Michael Mann.

Plotting programs such as data4alps.pro print this reminder to the user prior to rendering the chart: IMPORTANT NOTE: The data after 1960 should not be used. The tree-ring density records tend to show a decline after 1960 relative to the summer temperature in many high-latitude locations. In this data set this "decline" has been artificially removed in an ad-hoc way, and this means that data after 1960 no longer represent tree-ring density variations, but have been modified to look more like the observed temperatures. Others, such as mxdgrid2ascii.pro, issue this warning: NOTE: recent decline in tree-ring density has been ARTIFICIALLY REMOVED to facilitate calibration. THEREFORE, post-1960 values will be much closer to observed temperatures then (sic) they should be which will incorrectly imply the reconstruction is more skilful than it actually is. See Osborn et al. (2004).

So maybe we should see Osborn et al. (2004). Is it this? http://stephenschneider.stanford.edu/Publications/PDF_Papers/OsbornBriffa.pdf

Nov 26, 2009 at 10:24 PM | Unregistered CommenterDave


Good stuff. This increasingly looks like the original "Briffa bodge" - in the Tornetrask paper discussed in some early CA posts.

Nov 27, 2009 at 1:34 AM | Unregistered CommenterSteve McIntyre

The MO seems to have been quite similar for the New Zealand data on the other thread.

http://nzclimatescience.net/images/PDFs/global_warming_nz2.pdf

Damp the data in the earlier part of the century then juice it up through the end.

Nov 27, 2009 at 3:17 AM | Unregistered CommenterRJ

Explains a lot.

Just before the story broke I posted this on another forum.

"My major problem is with the "reconstructions" particularly before 1958.

Pull out a rural record anywhere in the world and it will show that the 30s were as hot as or hotter than today.

I.e. Iceland, Greenland, Alaska
Illinois, Arctic Russia
Finland
New Delhi
Florida
Marble Bar
Adelaide
Canberra & Wagga
Norfolk Island
Iowa
Argentina
South Africa
North west Africa
Etc Etc, these are consistent around the world.

But when the world gets divided into boxes, and the boxes without any records are filled by the nearest record, which can be 1200 km away and which itself could have been filled with a record from 1200 km away, we end up with a shape that is completely different.

That's like trying to guess Albany's temperature from Karratha or Port Hedland."

It is good to have my observations confirmed by the code from CRU.

Nov 27, 2009 at 3:48 AM | Unregistered CommenterRipper

A few things occur to me:

1 - we need a thorough, formal, public and forensic analysis of the whole story - I doubt anyone in the MSM or political world will make this happen though - I still think we are too late - although the leak at this juncture, prior to Copenhagen, is more than a coincidence, I feel.

2 - I am still confused by this "hide the decline" thing. As I understand it, the tree ring data would suggest a drop in temperature compared with the measured data, so CRU attempted to replace the later years with actual measurements. But does this not tacitly accept that actual temperature is, in fact, rising?

Or is this just an artifact due to suppression of earlier data sets?

3 - but then we read that the last 10 years have seen a decline in actual temperatures. But surely 2 and 3 cannot both be correct (based on actual measurements)? What am I missing?

Finally - isn't the concept of an average temperature for the entire globe a bloody stupid concept anyway?

Nov 27, 2009 at 4:36 AM | Unregistered CommenterHysteria

I wanted to see if you can get raw data from NASA showing surface temperatures.
NASA has a site dedicated to climate.
What astonishes me is that they still use the CRU hockey stick on that site.
Could they not use a graph that NASA themselves created?

Also, a controversial graph that is known to have been manipulated has no place on NASA's website.
It would be a dignified response by NASA to remove tainted graphs.

See: climate.nasa.gov.

Nov 27, 2009 at 5:22 AM | Unregistered CommenterBram Stolk

"Also, a controversial graph, that is known to have been manipulated, has no place on nasa's website.
It would be a dignified response by nasa to remove tainted graphs."

That's not all - Prof Will Steffen in backward Australia continues to use the original hockey stick in his presentations!

Nov 27, 2009 at 5:26 AM | Unregistered CommenterVin

@Hysteria -

1 - I'm reading and hearing that Copenhagen will fail. Not because of this, but based on the demands being put forth by the third world, as well as organized "Anti-Globalization" resistance. http://www.naomiklein.org/articles/2009/11/copenhagen-seattle-grows I don't care what makes it fail, it's tyrannical. People who aren't a part of the IMF or World Bank *need* it to fail. Show up at your local demonstration and bring your particular message.

2 - As I understand it, (and I may not), the tree ring data after 1960 diverges radically from the instrument temperature data. Again, as I understand it, this means that temperature and whatever particular tree ring method they are using don't have a causal relationship and the whole data set ought to be thrown out, not just the parts that don't work for their intended use. It's not as if physics and biology took a sudden drastic turn in 1960. It's an instance of "cherry-picking", and it's a major one. I wonder if the instrument temperature data was "cherry picked" as well. Really, if the tree rings don't match with instrument observations, it seems obvious that you can't trust tree ring data for historic temperature at all.

3 - I can't find any sort of consensus on what happened in the last 10 years.

I don't know how widely this is circulating, but apparently the IPCC are at odds with the Hadley Centre, which says that, adjusting for natural phenomena, there's been 0.0 degrees C of warming. http://www.spiegel.de/international/world/0,1518,662092,00.html

PANIC!

I think an accurate average temperature for the whole planet would be a cool metric to have, but I don't think I believe anyone's got one. Maybe someday.

Nov 27, 2009 at 7:13 AM | Unregistered Commenterrainfade

For what it's worth, David Deming's paper, which gives some background on the obliteration of the medieval warm period in layman's terms, is available here: http://www.scientificexploration.org/journal/jse_19_2_deming.pdf

Nov 27, 2009 at 9:39 AM | Unregistered Commenteraidey

Just to make what I posted clearer...

Think of the graph posted as a sort of cookie cutter. Look at the world temperature graph and you will see that the 1930 origin is always going to be at a temperature lower than the 1930s. The effect of applying the cutter is to reduce any temperatures above the line by some factor (I haven't looked back to see how they calculate that), so everything above the line is squashed down a bit. Because a parabola is being used, the effect is much stronger at 1930 than in 1970, and by 1990 the parabola is going to be well above the top temperatures.

But in doing this, they are also introducing a parabolic factor into the data. If you fit a curve through the data and compress it everywhere by 50%, the curve will be flattened. If you compressed it relative to how far you are after 1930, the curve would be tilted upwards. Using a parabola causes the curve to become parabolic, meaning that at the end it accelerates upwards.

Nov 27, 2009 at 9:42 AM | Unregistered CommenterJames Smith

I notice a few people doubting the legitimacy of tree ring data, but assuming that modern 'observed' temperatures must be correct. Have a look at surfacestations.org to see why this is not necessarily so!

Nov 27, 2009 at 9:42 AM | Unregistered CommenterDavid
