In geology we use computer models to simulate complex processes. A good example would be 4D simulation of fluid flow in oil and gas reservoirs. These reservoir models are likely every bit as complex as computer simulations of Earth's atmosphere. An important part of the modelling process is to compare model realisations with what actually comes to pass after oil or gas production has begun. At the outset the models are always wrong, but as more data is gathered they are updated and refined to the point that they have skill in hindcasting what just happened and in forecasting what the future holds. This informs the commercial decision-making process.

The IPCC (Intergovernmental Panel on Climate Change) has now published 5 major reports, starting with the First Assessment Report (FAR) in 1990. This provides an opportunity to compare what was forecast with what has come to pass. Examining past reports is quite enlightening since it reveals what the IPCC has learned in the last 24 years. I conclude that nothing has been learned other than how to obfuscate, mislead and deceive.

Figure 1 Temperature forecasts from the FAR (1990). Is this the best forecast the IPCC has ever made?

It is clearly stated in the caption that each model uses the same emissions scenario. Hence the differences between the Low, Best and High estimates are down to different physical assumptions, such as climate sensitivity to CO2. Holding the key variable constant (the CO2 emissions trajectory) allows the reader to see how different scientific judgements play out. All models are initiated in 1850 and by the year 2000 already display significant divergence. So how does this compare with what came to pass, and with subsequent IPCC practice?

I am aware that many others will have carried out this exercise before, and in a much more sophisticated way than I do here. The best example I am aware of was done by Roy Spencer, who produced this splendid chart that also drew some criticism.

Figure 2 Comparison of multiple IPCC models with reality, compiled by Roy Spencer.

The point that reality tracks along the low boundary of the models has been made many times by IPCC sceptics. The only scientists this reality appears to have escaped are those attached to the IPCC. My approach is much simpler and cruder: I have cut and pasted IPCC graphics into XL charts, where I compare the IPCC forecasts with the HadCRUT4 temperature reconstructions. HadCRUT4 data was downloaded from WoodForTrees and annual averages calculated. As we shall see, the IPCC has an extraordinarily lax approach to temperature datums, and in each example a different adjustment has to be made to HadCRUT4 to make it comparable with the IPCC framework.

Figure 3 Comparison of the FAR (1990) temperature forecasts with HadCRUT4.

Figure 3 shows how the temperature forecasts from the FAR (1990) compare with reality. It should be abundantly clear that the Low model is the one that lies closest to the reality of HadCRUT4; the High model is already running about 1.2˚C too warm in 2013. I cannot easily find the parameters used to define the Low, Best and High models, but the report states that a range of climate sensitivities from 1.5 to 4.5˚C is used.

I could not find a summary of the Second Assessment Report (SAR) and so jump to the TAR (Third Assessment Report) from 2001. This was the year (I believe) that the hockey stick was born (Figure 4).

Figure 4 The TAR (2001) introduced the hockey stick. The observed temperature record is spliced onto the proxy record, the model record is spliced onto the observed record, and no opportunity to examine the veracity of the models is offered.

In the imaginary world of the IPCC, Northern Hemisphere temperatures were constant from 1000 to 1900 AD, with not the faintest trace of the Medieval Warm Period or the Little Ice Age, where real people either prospered or died by the million. The actual temperature record is spliced onto the proxy record and the model world is spliced onto that to create a picture of future temperature catastrophe. But 13 years have since passed and we can see how reality compares with the models in that very short time period. So how does this compare with reality?

Figure 5 From 1850 to 2001 the IPCC background image is plotting observations (not model output) that agree with the HadCRUT4 observations.

Figure 5 shows how HadCRUT4 compares with the model world. To have any value or meaning, all of the models should have been initiated in 1850; we would then see that the majority are running far too hot by 2001. Well done IPCC! The detail of what has happened since 2001 is shown in Figure 6.
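The annual-averaging step applied to the monthly HadCRUT4 series can be sketched as follows. This is a minimal illustration, assuming the monthly data have already been parsed into (year, month, anomaly) tuples; real HadCRUT4 files from WoodForTrees need a little parsing first, and the function name is mine, not from any library.

```python
from collections import defaultdict

def annual_means(monthly):
    """Average monthly anomalies into one value per year.

    monthly: iterable of (year, month, anomaly) tuples.
    Returns a dict mapping year -> mean anomaly for that year.
    """
    by_year = defaultdict(list)
    for year, _month, anomaly in monthly:
        by_year[year].append(anomaly)
    return {year: sum(vals) / len(vals) for year, vals in sorted(by_year.items())}

# Example with made-up numbers (12 months of one year):
sample = [(2001, m, 0.4 + 0.01 * m) for m in range(1, 13)]
annual = annual_means(sample)
```

Incomplete years at the end of the record are averaged over however many months are present, which is worth keeping in mind when the latest year looks anomalous.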
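The "temperature datum" adjustment discussed above amounts to shifting an anomaly series onto a different baseline period so two sources become comparable. A minimal sketch, with a hypothetical helper (the actual baseline years differ from report to report, so the 1961–1990 example below is only illustrative):

```python
def rebaseline(series, base_start, base_end):
    """Shift an anomaly series so its mean over the chosen
    reference years is zero.

    series: dict mapping year -> anomaly.
    """
    base = [v for y, v in series.items() if base_start <= y <= base_end]
    offset = sum(base) / len(base)
    return {y: v - offset for y, v in series.items()}

# Example: re-express a toy series relative to a 1961-1990 mean of zero
demo = {1961: 0.1, 1990: 0.3, 2013: 0.9}
shifted = rebaseline(demo, 1961, 1990)
```

The same shift applies to every year, so trends are unchanged; only the zero line moves, which is exactly what matters when overlaying HadCRUT4 on a chart drawn against a different datum.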
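To see what the quoted 1.5–4.5˚C sensitivity range implies, the standard zero-dimensional logarithmic forcing relation is a useful back-of-envelope check. This is only an illustration of how sensitivity maps to warming, not a description of how the FAR models themselves were built; the function and the 280 ppm pre-industrial default are my assumptions for the sketch.

```python
import math

def equilibrium_warming(sensitivity, c, c0=280.0):
    """Equilibrium warming (deg C) for CO2 rising from c0 to c ppm,
    assuming warming scales with the log of concentration and
    'sensitivity' is the warming per doubling of CO2."""
    return sensitivity * math.log(c / c0) / math.log(2.0)

# At a doubling (560 ppm), the warming equals the sensitivity itself:
low, best, high = (equilibrium_warming(s, 560.0) for s in (1.5, 3.0, 4.5))
```

A factor of three between the low and high ends of the sensitivity range translates directly into a factor of three in projected warming for any given emissions path, which is why the Low, Best and High curves diverge so strongly.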