
Discussion > Where is Rhoda's Evidence? (plagiarised by Dung)

The first religion whose Revelation is via the Holy Software tended by its Priesthood.

Nov 21, 2015 at 3:39 PM | Unregistered CommenterBig Oil

Martin A

Most of the verification tests you describe can be done on climate models. Why do you assume that they have not been done?

Nov 21, 2015 at 4:58 PM | Unregistered CommenterEntropic man

EM:

Most of the verification tests you describe can be done on climate models. Why do you assume that they have not been done?
Are you really stupid, or what? How, on God’s good Earth, can a verification be based upon a model? A verification, surely, has to be based upon something other than the model itself; surely, it has to be based upon an observation, be it observation of an event or of an experiment? Basing it upon the model itself has to be somewhat circular, even in your own limited imaginings, EM? Or is that concept a bit beyond your comprehension?

Nov 21, 2015 at 5:55 PM | Registered CommenterRadical Rodent

To be fair to EM, there are verifications you can do without real-life data; in software engineering we call them unit tests and integration tests. These are not about proving that the entire system does what it is supposed to do (model the climate), but about making sure the modules and units of the code do what you intended them to do.

If you have a module (function, unit, whatever) which models some phenomenon, e.g. change in CO2 sink per change in hectares of greening, then you are basically coding a function: a mathematical theory which relates two or more variables. You know how this simple mathematical relationship is theorised to work, but there is a better than even chance that on your first stab at coding it you will introduce errors. These are not errors in your theory, which might be independently wrong; these are plain old implementation errors, introduced by the normal mistakes made when coding: you forgot to multiply by the right coefficient, you meant to cube instead of square, you introduced a rounding or truncation error through your choice of variable type, you forgot to clear a running total before each outer loop, etc. These are errors in the coding, not the theory.
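As a sketch of the kind of unit being discussed (the function name and the coefficient are invented for illustration, not real climate parameters), such a module might look like:

```python
def co2_sink_delta(greening_hectares: float, uptake_per_hectare: float = 3.5) -> float:
    """Hypothetical model: change in annual CO2 uptake (tonnes)
    for a given change in vegetated area (hectares).
    The coefficient is illustrative only.
    """
    if greening_hectares < 0:
        raise ValueError("greening_hectares must be non-negative")
    # The theory says uptake scales linearly with greened area.
    # A typical implementation slip would be squaring instead of
    # multiplying, or using a stale coefficient -- exactly the class
    # of error unit tests are meant to catch.
    return greening_hectares * uptake_per_hectare
```

The theory here is a one-line linear relationship; any divergence between this code and that relationship is an implementation bug, not a flaw in the theory.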

What a unit test does is separate that function off from the rest of the system, isolate it from all the other calculations going on, and then run the function on its own against a known set of inputs and expected outputs (the stand-ins for the rest of the system are called stubs or mocks in software development). This ensures that the simple module behaves as it is supposed to. That is not the same thing as saying it accurately models the physics, just that it adequately implements your intended hypothesis about the relationship. Once the unit test 'passes' you can plug it back into the system.
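A minimal unit test in that spirit might look like the following (the function under test and its expected values are invented for illustration; a runner like pytest would discover the `test_` functions automatically):

```python
def co2_sink_delta(greening_hectares, uptake_per_hectare=3.5):
    """Hypothetical unit under test: linear uptake model."""
    return greening_hectares * uptake_per_hectare

# --- unit tests: the function in isolation, with known in/out pairs ---
def test_known_pair():
    # Input/output pair worked out by hand from the theory: 10 * 3.5 = 35.
    assert co2_sink_delta(10.0) == 35.0

def test_zero_greening():
    # Boundary case: no change in area means no change in sink.
    assert co2_sink_delta(0.0) == 0.0

def test_linearity():
    # Doubling the area should double the uptake -- this catches an
    # accidental squaring or an uncleared running total.
    assert co2_sink_delta(20.0) == 2 * co2_sink_delta(10.0)

test_known_pair()
test_zero_greening()
test_linearity()
print("all unit tests passed")
```

Note what these tests do and do not claim: they verify the code implements the intended linear hypothesis, not that the hypothesis itself is physically correct.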

The slightly more complex form of testing is the integration test, where you take two or more of these tested units, join them together, isolate them from the larger system, and see if they behave as expected. Again, you are not checking whether the combined system accurately models real-world physics, just that the calculations adhere to what you intend for the hypothesis, this time on a more macro scale.
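Continuing the invented example, an integration test joins two already-tested units and checks the pair in isolation (again, every name and number here is hypothetical):

```python
def co2_sink_delta(greening_hectares, uptake_per_hectare=3.5):
    """Hypothetical unit 1: extra annual uptake (tonnes) from greening."""
    return greening_hectares * uptake_per_hectare

def net_atmospheric_change(emissions, extra_uptake):
    """Hypothetical unit 2: net change is what goes in minus what is taken out."""
    return emissions - extra_uptake

def annual_co2_budget(emissions, greening_hectares):
    """The integrated pair: greening feeds the sink, the sink offsets emissions."""
    return net_atmospheric_change(emissions, co2_sink_delta(greening_hectares))

# --- integration tests: the joined units, cut off from the wider model ---
def test_budget_combines_units():
    # 100 t emitted, 10 ha greened -> 10 * 3.5 = 35 t taken up -> 65 t net.
    assert annual_co2_budget(100.0, 10.0) == 65.0

def test_more_greening_means_smaller_net_change():
    assert annual_co2_budget(100.0, 20.0) < annual_co2_budget(100.0, 10.0)

test_budget_combines_units()
test_more_greening_means_smaller_net_change()
print("integration tests passed")
```

The second test checks a qualitative expectation of the joined units (more greening, smaller net change) rather than a single hand-computed value, which is typical at this level.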

Verification in the form of comparison with real life can and should be done at each of these levels (unit, integration and whole system), in addition to the functionality testing described above.

Part of where I have problems with climate models is that, in order to match the top-level macro signal, they have to run some of the modules at unrealistic levels which don't match what we observe. This should set off alarm signals, but doesn't seem to.

Nov 24, 2015 at 10:55 AM | Unregistered CommenterTheBigYinJames

TBYJ, you are so much more polite in your put-downs than I.

Nov 24, 2015 at 12:58 PM | Registered CommenterRadical Rodent

For EM:

"Climate is a complex, chaotic system which is not deterministic at our level of knowledge. Any projections come with confidence limits. You seem uncomfortable with that. How do software engineers handle uncertainty in other fields?
Modelling turbulent airflow over an airframe comes to mind as a possible example."

Given your interest in modelling as an analysis tool, you might like to join this body

++++
http://www.nafems.org/join/

NAFEMS is the international association for the engineering analysis, modelling and simulation community. It has been our mission over the last 30 years to facilitate and promote the safe and efficient use of engineering simulation and analysis. By joining our association, members have the unique opportunity to be part of our growing community.

Within our neutral community, companies come together to share experiences as well as take advantage of vital ‘best practice’ information, acclaimed publications and industry-recognised training programs designed with the needs of the community in mind.
++++

An example starter problem is here - worthy of consideration even though it is less complex than the climate models you appear to know and understand:

http://www.nafems.org/blog/posts/nbc04/

Apologies if you are already a member or if you have already solved many similar exercises as you've developed your skills.

Nov 24, 2015 at 2:00 PM | Unregistered Commenternot banned yet

Dung

I've just realised the answer to your question. And it came about by reading Ferdinand Engelbeen's post about attributing CO2 emissions.

Now his case actually makes sense. I haven't read Salby's work, so there may or may not be more to it. However, the rate-of-change equation for CO2 makes sense. Basically, if you don't emit large amounts of CO2 that can be regulated, i.e. reach constant levels over time due to the residence time of CO2, then CO2 levels are largely driven by changes in temperature, with some change due to vegetation. Hence pre-industrial temperatures can be estimated from CO2 levels.

They do this with ice cores and dC13. It turns out that an estimated change of 1 degree corresponds to a change of about 10 ppmv.

Now, if you emit large quantities, what happens is that partial-pressure balancing dominates, hence roughly half of what you emit stays in the atmosphere. Again, dC13 levels show this.
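A back-of-envelope version of the "roughly half stays" claim (the cumulative-emissions figure here is an illustrative assumption, not a sourced number; only the observed ~120 ppmv rise comes from the discussion):

```python
# Illustrative numbers only: suppose cumulative human emissions since
# pre-industrial times amount to roughly 240 ppmv-equivalent of CO2,
# while the observed atmospheric rise is ~120 ppmv (280 -> ~400 ppmv).
emitted_ppmv = 240.0        # assumed cumulative emissions, ppmv-equivalent
observed_rise_ppmv = 120.0  # ~400 ppmv today minus ~280 ppmv pre-industrial

airborne_fraction = observed_rise_ppmv / emitted_ppmv
print(f"airborne fraction ~ {airborne_fraction:.2f}")  # ~0.50: about half stays
```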

dC13 levels over time

Without focussing on the problems with using proxies and other things like that, some of which I don't know in detail, the takeaway from the dC13 data especially is that before 1850 there appear to be variations in CO2 of around 10 ppmv (peak to peak) on an average of 280 ppmv. Hence, over centuries, temperatures varied by up to a degree due to natural variation.

Since the industrial revolution, and especially after 1950, we see humans contributing 120 ppmv more. Yet temperatures have again varied by up to a degree.

So by putting in 120 ppmv more of CO2 we haven't managed to observe a temperature increase any different from natural variation, or from that associated with 10 ppmv. In effect, the temperature variation when CO2 levels change by 10 ppmv appears the same as when they change by 120 ppmv.
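The arithmetic of that comparison can be sketched as follows (the numbers are taken from the discussion above; it is the comparison, not the underlying physics, that this illustrates):

```python
# Numbers from the comment above.
natural_swing_ppmv = 10.0    # pre-1850 peak-to-peak CO2 variation
natural_swing_degc = 1.0     # temperature variation over those centuries
human_addition_ppmv = 120.0  # post-industrial rise in CO2
observed_swing_degc = 1.0    # roughly the variation actually observed

ratio_co2 = human_addition_ppmv / natural_swing_ppmv   # 12x the CO2 change
ratio_temp = observed_swing_degc / natural_swing_degc  # ~1x the temperature change
print(f"{ratio_co2:.0f}x the CO2 change, "
      f"{ratio_temp:.0f}x the temperature variation")
```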

And I haven't even talked about temperature adjustments.

The post is here - Growth of CO2 in the atmosphere

Nov 26, 2015 at 2:18 PM | Registered CommenterMicky H Corbett