In 1897 Lord Kelvin, a former President of the Royal Society in London, declared that the atmosphere would exhaust its oxygen within some 300 years as a result of burning coal. He was wrong about the atmosphere. So what about the reliability of present forecasts of atmospheric evolution?
The processes at work in the atmosphere are complicated and are coupled to external and surface influences, in particular the ocean. Their interactions may be beyond computer simulation over the decades-long time scales used in climate modelling.
Any approach to determining economic policy for climate change should allow for the possibility that the current understanding of the atmosphere cannot be translated into forecasts precise enough to support the design of an economic response.
Further, any economic forecasts used to construct models of future carbon use and carbon dioxide emissions will be unable to deal with technical innovations, whose success cannot be predicted. This affects policy in two ways: first, the obvious uncertainty in estimating economic development; second, and more immediately, the desire of governments to stimulate technical solutions. The need to be seen to be taking action frequently descends into picking winners and creating classes of rent seekers.
The problem faced by the Garnaut Enquiry may be reduced to the question: “What is the magnitude of the risk of human-induced climate change?” The first question, however, should be: “How do we assure ourselves that the risk assessments are reliable?” After all, the assessments are based on climate modelling. A quote from a well-known IPCC scientist, Dr Kevin Trenberth, should act as a caution:
None of the models used by the IPCC are initialized to the observed state, and none of the climate states in the models correspond even remotely to the current observed climate.
In the 20th century, policy makers based decisions on scientific knowledge and understanding and, where possible, experimental confirmation. This was the case in the development of atomic weapons. In these programs there were issues of risk. Would a controlled chain reaction in an atomic pile at the University of Chicago run out of control and destroy the south side of the city? Some hoped that it might. Would a hydrogen bomb set off a chain reaction in the ocean that would consume the planet? Could the yield of the weapons be accurately estimated? There were answers to these questions, and there were experimental demonstrations to check calculations. In some cases there were substantial errors in calculation, which pointed to gaps in understanding.
For our atmosphere we have a much more complicated set of interactions than those in nuclear physics. We need to have some ways of checking climate predictions. It is well known that weather forecasts are useful for only a matter of days. Temperatures are the best forecast variable, with rain and wind speed being substantially worse. There do not appear to be any statistical analyses that cover the accuracy of yearly or five-yearly forecasts. Until there is some measure of performance it seems unwise to take 50 or 100 year forecasts as providing any guidance.
The second problem, the inability to forecast innovation, is not surprising. The Garnaut Enquiry is required to look forward 100 years, and models of energy use are needed for both stationary and mobile energy users.
Consider the start of the 20th century and ask whether it would have been possible to predict, let alone model, the industrial development that occurred in that century.
We have examples within our own lifetimes of inventions that have changed the way we live. In the second half of the 20th century, the invention of the transistor, followed by the integrated circuit, led to an extraordinary flowering of micro-electronics. Most appliances, from pacemakers to telephones to motorcars, have integrated circuits embedded in them or central to their function. The change in energy use in moving from thermionic valves to transistors can be illustrated by comparing a pentode vacuum valve, which uses some 10 watts of electrical power, mostly to heat a filament to produce electrons, with a Pentium processor of 10 to 15 million transistors that uses about 20 watts in total.
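The scale of that change is easy to miss in prose. A back-of-envelope calculation, using only the figures quoted above (10 watts per valve; 20 watts shared across a conservative 10 million transistors), makes the per-element improvement explicit:

```python
# Illustrative arithmetic only, using the figures quoted in the text.
VALVE_POWER_W = 10.0         # one pentode vacuum valve, mostly filament heating
PENTIUM_POWER_W = 20.0       # a Pentium-class processor, total
PENTIUM_TRANSISTORS = 10e6   # lower bound: 10 million transistors

watts_per_valve = VALVE_POWER_W
watts_per_transistor = PENTIUM_POWER_W / PENTIUM_TRANSISTORS  # 2e-6 W

improvement = watts_per_valve / watts_per_transistor
print(f"Power per transistor: {watts_per_transistor:.1e} W")
print(f"Improvement factor:   {improvement:.0e}")
```

On these numbers a single transistor consumes about two microwatts, a roughly five-million-fold reduction in power per switching element compared with a valve.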
The growth of the Internet, like the creation of a nervous system, continues with consequences for living and working which will take some time to appear. The consequences of the reduction of energy demand in appliances, the use of wireless communications and other technologies will doubtless play a large part in the less energy intensive economic improvements of the developing world.
Tom Quirk is a director of Sementis Limited, a privately owned biotechnology company. He has been Chairman of the Victorian Rail Track Corporation, Deputy Chairman of Victorian Energy Networks and Peptech Limited, as well as a director of Biota Holdings Limited. He worked in CRA Ltd setting up new businesses and also for James D. Wolfensohn in a New York-based venture capital fund. He spent 15 years as an experimental research physicist, university lecturer and Oxford don.