Key Global Warming Study May Have Bad Mathematics 77
An anonymous reader writes "Berkeley physics professor Richard A. Muller writes that a key study showing a sudden 'hockey stick shape' increase in global temperature may be flawed due to bad mathematics. Stephen McIntyre and Ross McKitrick say that Michael Mann's computer program handled data normalization incorrectly and exaggerated data with a hockey stick shape." Update: 10/18 18:26 GMT by J : Alas for the environment, it looks like McKitrick and McIntyre have been refuted. "In previous rounds of the debate, Lambert has shown that McKitrick messed up an analysis of the number of weather stations, showed he knew almost nothing about climate, flunked basic thermodynamics, couldn't handle missing values correctly and invented his own temperature scale. But Tim's latest discovery really takes the cake."
Study rejected by the science magazine Nature (Score:1, Informative)
Re:Study rejected by the science magazine Nature (Score:2, Informative)
Yeah, and you can read all about it here [uoguelph.ca], including the actual reviewer comments.
Re:The 'Little Ice Age' (Score:3, Informative)
The Little Ice Age in Europe from 1400-1850 is now thought to have been caused by an abnormal lack of SUNSPOTS. Sunspots cause the sun to give off a lot more heat/energy than a sun with a uniform surface does.
Sunspots don't themselves cause more energy to come from the sun - in fact, sunspots are cooler than the rest of the sun's surface. Sunspots are, however, symptoms of an active sun, just as low sunspot counts occur when the sun is less active.
Too bad we only have about 1,000 years of data on sunspots.
Sunspots have been directly observed and recorded in reasonable detail only since Galileo's time. But in another sense, we actually have data going back much farther than that. During periods of high solar activity, the sun bombards the earth with a larger number of subatomic particles. This type of radiation results in constant isotope formation - in particular, this is why things left exposed on the earth's surface keep a constant concentration of Carbon 14. Isotope ratio measurements have in fact been used to infer changes in solar activity for periods during which nobody was recording sunspot counts.

Also, while higher solar activity heats the earth, the main part of the effect is actually very indirect. Most of the extra heating associated with a hotter sun cannot be explained by radiation alone. What happens is that the higher particle flux strengthens the earth's magnetosphere, and somewhat ironically, this means the atmosphere is better protected from being eroded away by solar wind precisely when solar wind is most dense! The slightly thicker "blanket" of atmosphere allows the earth to retain a bit more heat.
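The decay behind that "constant concentration of Carbon 14" point can be sketched in a few lines. This is a minimal illustration, not anyone's actual reconstruction method; the 5730-year half-life is the standard textbook value, and the function name is my own:

```python
import math

HALF_LIFE_C14 = 5730.0  # years - standard Carbon 14 half-life

def remaining_fraction(years):
    """Fraction of an initial Carbon 14 sample left after `years` of decay."""
    return math.exp(-math.log(2) * years / HALF_LIFE_C14)

# While something sits exposed, cosmic-ray-driven production replaces
# what decays, holding the concentration steady; once buried, only
# decay operates, so the measured fraction dates the sample.
print(remaining_fraction(5730))   # one half-life leaves half the sample
print(remaining_fraction(11460))  # two half-lives leave a quarter
```

Comparing measured isotope ratios against this decay curve is, loosely, how solar-activity changes are inferred for centuries with no sunspot records.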
This fact doesn't completely debunk the manmade global warming hypothesis. Changes in solar activity probably "only" account for 75% of the climate change since the end of the Little Ice Age. The other 25% could all easily be the work of mankind. But there are two reasons I don't think we should panic yet. First, there's mounting evidence that the Medieval Warm Period (which preceded the Little Ice Age) peaked at levels even warmer than what we now experience, so modern temperatures aren't really unprecedented. Second, all four of the previous interglacials peaked at much higher temperatures than ours has. So, a very long view of climate shows that at fairly regular intervals, the earth experiences temperatures similar to those we now have, even without any help from mankind.
Re:The 'Little Ice Age' (Score:4, Informative)
It appears there is an increase of 2 watts/m^2 at the earth's distance from the sun.
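As a rough way to see what 2 watts/m^2 means, here is a zero-dimensional blackbody estimate. This is a back-of-envelope sketch under assumed round numbers (a 1361 W/m^2 solar constant and 0.30 planetary albedo), and it deliberately ignores all feedbacks:

```python
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W/m^2/K^4
ALBEDO = 0.30      # assumed planetary albedo
S0 = 1361.0        # assumed baseline solar constant, W/m^2

def equilibrium_temp(solar_constant):
    """Effective blackbody temperature of a planet absorbing
    (1 - albedo) * S / 4 and radiating as sigma * T^4."""
    absorbed = (1 - ALBEDO) * solar_constant / 4
    return (absorbed / SIGMA) ** 0.25

delta = equilibrium_temp(S0 + 2) - equilibrium_temp(S0)
print(f"direct blackbody response: {delta:.3f} K")
```

The direct response works out to only about a tenth of a degree, which is why the comment above about indirect amplification mechanisms matters to the argument.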
Re:Junk science strikes again (Score:3, Informative)
True; temperatures have been rising for 300 years or so.
Largely true; we know that CO2 and water vapor in the atmosphere hold in heat, and that water vapor holds in much more. We also know that this effect is critical for keeping our planet from freezing solid.
We also know that CO2 itself has a minor impact as a greenhouse gas, but that increasing it may cause a domino effect: the small addition of heat from CO2 might cause more water vapor to stay in the atmosphere, which could have a large heating effect. But it's also possible that a counter-balancing mechanism would kick in to prevent this. The latter seems likely, since we know there have been periods in history with much higher CO2 levels than today, but without runaway global warming.
True, and the rate seems to be increasing as well. From 1900 to 1940 CO2 levels rose around 18%, and from 1940 to 2000 they rose around 80%.
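Taking the parent's percentages at face value, the implied average annual growth rates are easy to compute. A quick sketch (the 18% and 80% figures are the poster's claims, not verified measurements):

```python
def annual_rate(total_increase_pct, years):
    """Average compound annual growth rate (%) implied by a
    total percentage increase over a span of years."""
    return ((1 + total_increase_pct / 100) ** (1 / years) - 1) * 100

# Poster's claimed cumulative increases:
early = annual_rate(18, 40)  # 1900-1940
late = annual_rate(80, 60)   # 1940-2000
print(f"1900-1940: {early:.2f}%/yr, 1940-2000: {late:.2f}%/yr")
```

If those cumulative figures were right, the later period's compound rate would be more than double the earlier one, which is what "the rate seems to be increasing" amounts to.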
However, most of the warming that occurred between 1900 and 2000 occurred before 1950. It seems that rising CO2 levels follow warming, rather than precede it. This suggests that the higher temperatures are causing the higher CO2 levels, rather than the other way around.
Not really true. Sure, we're taking carbon out of the ground and releasing it into the atmosphere, but only 2/3 of it seems to stay there; reabsorption occurs, and the sinks may be able to absorb more than we know. Furthermore, our contribution may be insignificant compared to what the earth is releasing. As I stated above, CO2 increases seem to be caused by higher temperatures (perhaps released from warmer oceans), and that contribution might be much more than we release.
We're not really destroying the carbon sinks either. In some areas we might be, but it's a fact that the forests in the United States, and probably much of the rest of the world, are growing in size, largely due to the higher amount of CO2 in the atmosphere. Pine trees grow up to three times faster at today's CO2 levels than at the levels in 1900, and all other plants grow faster too (anywhere from 10% to 300%, depending on species and conditions). This could very well be the 'counter-balance' mechanism that prevents runaway global warming: higher CO2 levels cause the vegetative carbon sinks to grow more plentiful.
For more on all of this, watch this seminar from the Oregon Institute of Science and Medicine [oism.org].