Posted by: Dan | February 21, 2009

A History of Greenhouse Gases

I was recently motivated to re-read some items on the background of greenhouse gases. I was reminded that Chapter 1 (“A Historical Overview of Climate Change Science”) of WG1’s report on the physical basis of climate change in the fourth IPCC assessment report is a very good resource — remember, it’s always good to know the history of science when talking about science.

Below the fold is an excerpt from section 1.4: Examples of Progress in Understanding Climate Processes. Check it out, and familiarize yourself with the science underlying conclusions on climate change.

The realization that Earth’s climate might be sensitive to the atmospheric concentrations of gases that create a greenhouse effect is more than a century old. Fleming (1998) and Weart (2003) provided an overview of the emerging science. In terms of the energy balance of the climate system, Edme Mariotte noted in 1681 that although the Sun’s light and heat easily pass through glass and other transparent materials, heat from other sources (chaleur de feu) does not. The ability to generate an artificial warming of the Earth’s surface was demonstrated in simple greenhouse experiments such as Horace Benedict de Saussure’s experiments in the 1760s using a ‘heliothermometer’ (panes of glass covering a thermometer in a darkened box) to provide an early analogy to the greenhouse effect. It was a conceptual leap to recognize that the air itself could also trap thermal radiation. In 1824, Joseph Fourier, citing Saussure, argued ‘the temperature [of the Earth] can be augmented by the interposition of the atmosphere, because heat in the state of light finds less resistance in penetrating the air, than in repassing into the air when converted into non-luminous heat’. In 1836, Pouillet followed up on Fourier’s ideas and argued ‘the atmospheric stratum…exercises a greater absorption upon the terrestrial than on the solar rays’. There was still no understanding of exactly what substance in the atmosphere was responsible for this absorption.

In 1859, John Tyndall (1861) identified through laboratory experiments the absorption of thermal radiation by complex molecules (as opposed to the primary bimolecular atmospheric constituents O2 and N2). He noted that changes in the amount of any of the radiatively active constituents of the atmosphere such as water (H2O) or CO2 could have produced ‘all the mutations of climate which the researches of geologists reveal’. In 1895, Svante Arrhenius (1896) followed with a climate prediction based on greenhouse gases, suggesting that a 40% increase or decrease in the atmospheric abundance of the trace gas CO2 might trigger the glacial advances and retreats. One hundred years later, it would be found that CO2 did indeed vary by this amount between glacial and interglacial periods. However, it now appears that the initial climatic change preceded the change in CO2 but was enhanced by it (Section 6.4).

G. S. Callendar (1938) solved a set of equations linking greenhouse gases and climate change. He found that a doubling of atmospheric CO2 concentration resulted in an increase in the mean global temperature of 2°C, with considerably more warming at the poles, and linked increasing fossil fuel combustion with a rise in CO2 and its greenhouse effects: ‘As man is now changing the composition of the atmosphere at a rate which must be very exceptional on the geological time scale, it is natural to seek for the probable effects of such a change. From the best laboratory observations it appears that the principal result of increasing atmospheric carbon dioxide… would be a gradual increase in the mean temperature of the colder regions of the Earth.’ In 1947, Ahlmann reported a 1.3°C warming in the North Atlantic sector of the Arctic since the 19th century and mistakenly believed this climate variation could be explained entirely by greenhouse gas warming. Similar model predictions were echoed by Plass in 1956 (see Fleming, 1998): ‘If at the end of this century, measurements show that the carbon dioxide content of the atmosphere has risen appreciably and at the same time the temperature has continued to rise throughout the world, it will be firmly established that carbon dioxide is an important factor in causing climatic change’ (see Chapter 9).

In trying to understand the carbon cycle, and specifically how fossil fuel emissions would change atmospheric CO2, the interdisciplinary field of carbon cycle science began. One of the first problems addressed was the atmosphere-ocean exchange of CO2. Revelle and Suess (1957) explained why part of the emitted CO2 was observed to accumulate in the atmosphere rather than being completely absorbed by the oceans. While CO2 can be mixed rapidly into the upper layers of the ocean, the time to mix with the deep ocean is many centuries. By the time of the TAR [the Third Assessment Report], the interaction of climate change with the oceanic circulation and biogeochemistry was projected to reduce the fraction of anthropogenic CO2 emissions taken up by the oceans in the future, leaving a greater fraction in the atmosphere (Sections 7.1, 7.3 and 10.4).

In the 1950s, the greenhouse gases of concern remained CO2 and H2O, the same two identified by Tyndall a century earlier. It was not until the 1970s that other greenhouse gases – CH4, N2O and CFCs – were widely recognized as important anthropogenic greenhouse gases (Ramanathan, 1975; Wang et al., 1976; Section 2.3). By the 1970s, the importance of aerosol-cloud effects in reflecting sunlight was known (Twomey, 1977), and atmospheric aerosols (suspended small particles) were being proposed as climate-forcing constituents. Charlson and others (summarised in Charlson et al., 1990) built a consensus that sulphate aerosols were, by themselves, cooling the Earth’s surface by directly reflecting sunlight. Moreover, the increases in sulphate aerosols were anthropogenic and linked with the main source of CO2, burning of fossil fuels (Section 2.4). Thus, the current picture of the atmospheric constituents driving climate change contains a much more diverse mix of greenhouse agents.


For more on the best estimates of radiative forcings from long-lived greenhouse gases and every other forcing studied to date, check out Section 2.9 of the fourth IPCC assessment report: Synthesis of all available scientific knowledge on changes in atmospheric constituents and in radiative forcing. The section is best summed up in Table 2.12.
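To put rough numbers on the chain from concentration to forcing to temperature, here is a minimal Python sketch. It uses the simplified logarithmic forcing expression for CO2, RF = 5.35 ln(C/C0) W/m2 (the Myhre et al. 1998 formula used in the IPCC reports); the climate sensitivity parameter of 0.8 °C per W/m2 is my own illustrative assumption (roughly 3 °C per doubling), not a value taken from Table 2.12.

    import math

    def co2_forcing(c_ppm, c0_ppm=278.0):
        # Simplified radiative forcing for CO2 (Myhre et al. 1998), in W/m2
        return 5.35 * math.log(c_ppm / c0_ppm)

    # Illustrative climate sensitivity parameter (K per W/m2); assumed here to
    # correspond to roughly 3 degC of equilibrium warming per CO2 doubling.
    LAMBDA = 0.8

    for c in (278, 380, 556):   # pre-industrial, ~2005, doubled pre-industrial
        rf = co2_forcing(c)
        print(f"CO2 = {c} ppm -> RF = {rf:.2f} W/m2, equilibrium dT ~ {LAMBDA * rf:.1f} K")

Run with these assumptions, a doubling gives a forcing of about 3.7 W/m2 and roughly 3 °C of warming, the same order of magnitude as Callendar’s 2 °C estimate quoted above; the point is only to show how a forcing metric and a sensitivity parameter connect concentration changes to temperature changes.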

Feel free to discuss, but please, refrain from comments insinuating a global conspiracy amongst climate scientists. I’m not too welcoming of conspiracy theorists.

Also, if asked, I’ll be happy to cite significant passages in the other chapters of the WG1 report as well.

UPDATE:


Responses

  1. In your sign off from our other exchange, you referred me here.

    I agree it’s important to know the history of this science. Weart is a good source, very thorough, but of course, his narrative is triumphalist, as is obvious from his title. I’ve not read anything like it since I was a boy reading Microbe Hunters by Paul de Kruif.

    So, I went and read through the section in the IPCC report on Radiative Forcings in section 2. I’m not sure why you think this clinches the argument. The chapter is very dense, and not written in a way that makes it easy to follow the logic, so perhaps you can address my concerns.

    RF is defined by the IPCC in a particular way to provide it with a metric for noting the change in energy flux in and out of the Earth’s atmosphere. It is defined as pertaining to perturbations in the climate system beyond the natural forcings. Baseline for a natural system is taken as the pre-industrial year 1750.

    Okay, but please note that this assumes that the anthropogenic forcings are there, sort of like assuming the thing you want to prove.

    But RF isn’t measured directly, and it’s not intended to be. It’s simply a metric. It makes more sense than global temperature, of course, because it’s basically a supposedly measured value of energy flux at the outer atmospheric layer. Not all that unreasonable to assume that way up there it’s a constant function around the sphere. Surface temperatures are no way a constant function over the earth.

    I read through quite a lot of the section. The bulk of it deals with the individual forcings of various human emissions and activities, and so is logically posterior to the main argument and rationale. If that is sound, then it is all sound. If the assumptions are good, then off we go.

    But in reading through the sections on what RF is, on how it is developed, and on Climate Sensitivity, a parameter that translates RF into actual changes in global temperature, I developed the following concern:

    RF and Climate Sensitivity (CS) are developed from models and comparisons to the temperature record. Doesn’t GIGO (garbage in-garbage out) apply?

    What if the temperature record is not quite accurate?

    Why is a global “homogenization” of temperature data (that’s the IPCC/Hansen term – makes me think of sausage making and legislative horse trading – you don’t want to see how it’s done) a reliable base for models that are predicting very small – in absolute terms – changes?

    Doesn’t the entire approach assume that the models correctly reproduce the climate system? Isn’t that the reverse of science; don’t we usually try to disprove a hypothesis, rather than assume it’s correct and go from there?

    Has anyone tried to disprove the current assessment of the temperature record other than McKitrick and his criticisms of the Hockey Stick? So much work is required to deal with that data, why would anyone bother with it unless they were a very dogged critic or a committed modeler?

    The only justifications I saw for various assessments of RF and CS were the GCMs and the comparison of GCM output to the temperature record…is there another?

    Engineers don’t care about fundamental truths of nature; they just want things to work. So, in studying materials, say, they are quite willing to massage data into a homogeneous series if that can be used for a model that will tell them how to build a stronger column, for example. They know the result they want, and it’s easy to test if they have gotten it. Column stands, column buckles. They are willing to “improve” the data as much as necessary to get the result. How is this different from GCM modelers “improving” their statistical refinement of the temperature record and then improving models?

    I have done a lot of watershed modeling. It works as long as you only want to know about whether there will be more, a lot more, or less, a lot less water than before. The GCMs remind me a lot of the classic hydrological modeling problem – The Ungauged Watershed. It goes like this:

    Watershed A has characteristics a, b, c, d, e.g. area, average slope, land cover, soil type, etc. A long-standing flow gauge is available at the outlet, and complete rainfall records are available.

    Watershed B is similar, but not identical to A, and also has rainfall records, but no flow gauge. What will the annual flow pattern be out of Watershed B?

    The fact is, after years of model development, we just don’t know. At least, we don’t know to within 1%, or 5%, or probably not 10% and maybe not even to 25%. It’s just too complex to get right. And it’s much simpler than the climate system. There’s not even any dispute about the base data!
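    To make the toy version of that transfer concrete (invented numbers, and nothing like a real regionalization study): calibrate a single runoff coefficient on gauged Watershed A and carry it over to ungauged Watershed B. Even in this best case, the answer for B is only as good as the assumed similarity between the watersheds.

        # Toy sketch (invented numbers): calibrate a runoff coefficient on the
        # gauged watershed, then transfer it to the ungauged one.
        def runoff_coefficient(annual_flow_m3, annual_rain_m, area_m2):
            # fraction of the rainfall volume that shows up as streamflow
            return annual_flow_m3 / (annual_rain_m * area_m2)

        # Watershed A: gauged (illustrative values)
        area_a, rain_a, flow_a = 50e6, 1.0, 20e6            # m2, m/yr, m3/yr
        c_a = runoff_coefficient(flow_a, rain_a, area_a)    # 0.40

        # Watershed B: similar but ungauged (illustrative values)
        area_b, rain_b = 80e6, 0.9
        flow_b_estimate = c_a * rain_b * area_b             # ~28.8e6 m3/yr

        # If B's true coefficient differs from A's by 25% (soils, land cover,
        # slope), the estimate is off by 25%, and with no gauge nothing in the
        # calculation will tell us so.
        print(f"C_A = {c_a:.2f}, estimated flow out of B = {flow_b_estimate / 1e6:.1f} Mm3/yr")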

  2. Welcome Lichanos. Despite the disagreements that surely will pop up, I’m glad to have you – perhaps through discussion with you we can both refine each other’s misconceptions of the IPCC report (I’m sure you’ll find some that I have). To your concerns though:

    RF is defined by the IPCC in a particular way to provide it with a metric for noting the change in energy flux in and out of the Earth’s atmosphere. It is defined as pertaining to perturbations in the climate system beyond the natural forcings. Baseline for a natural system is taken as the pre-industrial year 1750.

    Okay, but please note that this assumes that the anthropogenic forcings are there, sort of like assuming the thing you want to prove.

    They spend a great deal of time on the anthropogenic forcings, true. But they don’t ignore the natural forcings. I take 1750 to be a starting point to improving measurements, not necessarily a baseline. And another chapter in the WG1 deals with earlier reconstructions of paleoclimate.

    I think that the primary assumptions that go into chapter 2 are actually addressed in chapter 1 (why I posted on chapter 1 first, despite talking about 2), especially in FAQ1.1 on page 96: “What Factors Determine Earth’s Climate?” The figure in that FAQ explains the conceptual model upon which contemporary climatology is based. Figure 1.2 also notes the evolution of the conceptual model over the course of the four IPCC assessment reports.

    But RF isn’t measured directly, and it’s not intended to be. It’s simply a metric. It makes more sense than global temperature, of course, because it’s basically a supposedly measured value of energy flux at the outer atmospheric layer. Not all that unreasonable to assume that way up there it’s a constant function around the sphere. Surface temperatures are no way a constant function over the earth.

    Agreed. The notion of RFs is very theory-laden. Thus it must be confirmed or refuted with direct measurements, which are discussed in later chapters.

    On to your concerns, which I’ve numbered for easy discussion and to avoid breaking it up into many blockquoted sections.

    But in reading through the sections on what RF is, on how it is developed, and on Climate Sensitivity, a parameter that translates RF into actual changes in global temperature, I developed the following concern:

    [1] RF and Climate Sensitivity (CS) are developed from models and comparisons to the temperature record. Doesn’t GIGO (garbage in-garbage out) apply?

    [2] What if the temperature record is not quite accurate?

    [3] Why is a global “homogenization” of temperature data (that’s the IPCC/Hansen term – makes me think of sausage making and legislative horse trading – you don’t want to see how it’s done) a reliable base for models that are predicting very small – in absolute terms – changes?

    [4] Doesn’t the entire approach assume that the models correctly reproduce the climate system? Isn’t that the reverse of science; don’t we usually try to disprove a hypothesis, rather than assume it’s correct and go from there?

    [5] Has anyone tried to disprove the current assessment of the temperature record other than McKitrick and his criticisms of the Hockey Stick? So much work is required to deal with that data, why would anyone bother with it unless they were a very dogged critic or a committed modeler?

    [6] The only justifications I saw for various assessments of RF and CS were the GCMs and the comparison of GCM output to the temperature record…is there another?

    [7] Engineers don’t care about fundamental truths of nature; they just want things to work. So, in studying materials, say, they are quite willing to massage data into a homogeneous series if that can be used for a model that will tell them how to build a stronger column, for example. They know the result they want, and it’s easy to test if they have gotten it. Column stands, column buckles. They are willing to “improve” the data as much as necessary to get the result. How is this different from GCM modelers “improving” their statistical refinement of the temperature record and then improving models?

    [8] I have done a lot of watershed modeling. It works as long as you only want to know about whether there will be more, a lot more, or less, a lot less water than before. The GCMs remind me a lot of the classic hydrological modeling problem – The Ungauged Watershed. It goes like this:

    Watershed A has characteristics a, b, c, d, e.g. area, average slope, land cover, soil type, etc. A long-standing flow gauge is available at the outlet, and complete rainfall records are available.

    Watershed B is similar, but not identical to A, and also has rainfall records, but no flow gauge. What will the annual flow pattern be out of Watershed B?

    The fact is, after years of model development, we just don’t know. At least, we don’t know to within 1%, or 5%, or probably not 10% and maybe not even to 25%. It’s just too complex to get right. And it’s much simpler than the climate system. There’s not even any dispute about the base data!

    [1] “Doesn’t GIGO apply?” Please elaborate.

    [2] Of course if the temp. record does not correspond to the conceptual/theoretical framework, the theory must be revised/corrected.

    [3] As noted above, at the moment we’re discussing the theoretical/conceptual framework of the system as a whole. It is reductionist by nature, in order to help make sense out of the various inputs.

    [4] Yes, scientific studies are based upon hypotheses. The IPCC however is essentially a review article of the current state of knowledge on the topic, and is by nature normative (see Kuhn).

    [5] Sure, or at least new research studies continually frame their hypotheses as critical tests of the conceptual/theoretical framework. Again, this is science: as much as I dislike Kuhn, it fits nicely with Kuhnian paradigms. (Or maybe Imre Lakatos’ “research programmes” fit better.)

    [6] Justification for RF assessments, as I understand it, has been derived from calculations that I admit I have not examined. They go back to G.S. Callendar (1938) (see above).

    [7] As alluded to in response to #5, this is a question addressed by Kuhnian paradigms and such: theories are usually just refined until the core of the theory is confronted with insurmountable obstacles. At that point, the field goes through a revolution of sorts. (Nobody said science was perfect; it has to “bootstrap” its way to greater precision.) Moreover, this is why I chose to discuss RFs re: Chapter 2, as it seems to be the core of the theoretical basis for contemporary climate science. Judging from the chapter order, I’d say the editors of the IPCC reports agree.

    [8] Again, this is the nature of the beast. Theories aren’t perfect, and there will always be problems of precision.

    Bottom line: I think these issues are hardly a basis for discarding the current framework. We (okay, climatologists and geophysicists) must address the issues relating to the core of climate theory in the interests of science. And they do, to the extent that any field addresses decades-old knowledge – most is accepted implicitly, because new data “fits” (within a degree of precision accepted by the scientists) with the fundamental principles described in places like the quoted excerpt of this post, FAQ1.1, and other summaries shown here and there throughout the IPCC text.

    *See also Bayesian probability theory for a less iconic but (IMO) better model of science than Kuhn’s.
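    For what that footnote amounts to in practice, here is a toy sketch of Bayesian updating (the likelihoods are made up): each observation that is more likely under a hypothesis than under its negation shifts the degree of belief, without ever reaching certainty.

        # Toy Bayesian update with invented likelihoods.
        def bayes_update(prior, p_e_given_h, p_e_given_not_h):
            # posterior P(H|E) from Bayes' theorem
            p_e = p_e_given_h * prior + p_e_given_not_h * (1.0 - prior)
            return p_e_given_h * prior / p_e

        belief = 0.5                                  # start agnostic about H
        for _ in range(3):                            # three observations, each of
            belief = bayes_update(belief, 0.8, 0.3)   # which is likelier if H holds
        print(f"posterior after three observations: {belief:.2f}")   # ~0.95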

  3. More on the core of current theory in the field of climatology and geophysics…

    The key here seems to be finding a causal link between the known spectra of radiative absorption of long-lived greenhouse gases and observed warming. After all, we agree that both phenomena are occurring and can be correlated, but can we demonstrate the causal link?

    Unfortunately, the IPCC AR4 really only cites a very small number of studies on this, and discusses it directly only on page 153, section 2.3.8, “Observations of Long-Lived Greenhouse Gas Radiative Effects”. The experimental data supporting causation is described as:

    Observations of the clear-sky radiation emerging at the top of the atmosphere and at the surface have been conducted. Such observations, by their nature, do not measure RF as defined here. Instead, they yield a perspective on the influence of various species on the transfer of radiation in the atmosphere. Most importantly, the conditions involved with these observations involve varying thermal and moisture profiles in the atmosphere such that they do not conform to the conditions underlying the RF definition (see Section 2.2). There is a more comprehensive discussion of observations of the Earth’s radiative balance in Section 3.4.

    Harries et al. (2001) analysed spectra of the outgoing longwave radiation as measured by two satellites in 1970 and 1997 over the tropical Pacific Ocean. The reduced brightness temperature observed in the spectral regions of many of the greenhouse gases is experimental evidence for an increase in the Earth’s greenhouse effect. In particular, the spectral signatures were large for CO2 and CH4. The halocarbons, with their large change between 1970 and 1997, also had an impact on the brightness temperature. Philipona et al. (2004) found an increase in the measured longwave downward radiation at the surface over the period from 1995 to 2002 at eight stations over the central Alps. A significant increase in the clear-sky longwave downward flux was found to be due to an enhanced greenhouse effect after combining the measurements with model calculations to estimate the contribution from increases in temperature and humidity. While both types of observations attest to the radiative influences of the gases, they should not be interpreted as having a direct linkage to the value of RFs in Section 2.3.

    What I don’t understand is that last sentence. Why wouldn’t changes in radiative influences of the gases be directly linked to radiative forcings of those same gases?
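    On what a ‘reduced brightness temperature’ means: below is a short sketch of the standard Planck-function arithmetic (nothing in it is taken from Harries et al.; the wavenumber and temperatures are illustrative). In a strongly absorbing band the satellite sees emission from higher, colder levels, so the equivalent blackbody temperature inferred from the measured radiance is lower than in a window region that sees the warm surface.

        import math

        H = 6.626e-34    # Planck constant, J s
        C = 2.998e8      # speed of light, m/s
        K = 1.381e-23    # Boltzmann constant, J/K

        def planck_radiance(wavenumber_cm, temp_k):
            # blackbody spectral radiance per unit wavenumber (SI units)
            nu = wavenumber_cm * 100.0               # cm-1 -> m-1
            return 2 * H * C**2 * nu**3 / math.expm1(H * C * nu / (K * temp_k))

        def brightness_temperature(wavenumber_cm, radiance):
            # temperature of a blackbody that would emit the given radiance
            nu = wavenumber_cm * 100.0
            return H * C * nu / (K * math.log1p(2 * H * C**2 * nu**3 / radiance))

        # ~700 cm-1 (CO2 band): emission to space comes from high, cold levels;
        # in an atmospheric 'window' it comes from near the warm surface.
        for label, t_emit in (("window, near surface", 288.0),
                              ("CO2 band, upper troposphere", 225.0)):
            rad = planck_radiance(700.0, t_emit)
            print(f"{label}: brightness temperature = {brightness_temperature(700.0, rad):.0f} K")

    As for the last sentence of the quoted passage: the measured radiance change folds in whatever the temperature and humidity profiles did over the period, whereas RF as defined in Section 2.2 holds those fixed, so the two quantities are related but not interchangeable.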

  4. I believe we are still talking past one another to a great extent. The main reason, I believe, is your notion of consensus, or paradigm, or whatever term you prefer. I don’t want to get into a complex discussion of the sociology of knowledge, but I agree it is an issue here; however, I think it is relevant in the opposite way you do. That is, I think we have a temporary fad-culture here that has displaced scientific culture, not a fringe challenge to a well-grounded scientific theoretical edifice. Science has fads too. Usually, they get weeded out quickly, but climate is such a huge and intensely inter-disciplinary subject that this one has lingered.

    I’ll just stick to more focused comments.

    They spend a great deal of time on the anthropogenic forcings, true. But they don’t ignore the natural forcings. I take 1750 to be a starting point to improving measurements, not necessarily a baseline.

    Starting point vs. baseline…hard to see the difference. And it still assumes that things in the system have changed significantly after 1750, i.e. from human agency, which is the point to be proven. Of course, natural forcings always are present – that’s the nature of Nature.

    I’m going to have to go back and review Chapter 1 again.

    [1] GIGO: If RF and CS are developed from models and comparisons to the temperature record, and if the temperature record we have is NOT ACCURATE, then we are feeding garbage into our calculations and garbage will come out. More below…

    [2] Of course if the temp. record does not correspond to the conceptual/theoretical framework, the theory must be revised/corrected.

    Not my point! What if the temperature record does not correspond to the reality of the history of the temperature? As an absurd example, what if it were discovered that every thermometer used in the USA over the last 100 years had been improperly manufactured so that they gave inflated readings? Of course, I don’t believe any such thing, but since we’re talking about trends of a couple of degrees over many decades, a small bias, error, improper homogenization procedure, etc. could be VERY important.

    My point is that the temperature record is taken by IPCC as “settled.” It is no such thing. First of all, there is NO global temperature, there are only local temperatures; second, there is much dispute about how that global temperature plot was produced, about the use of specific proxies for actual temperature measurements, about the proper way to “adjust” for the urban heat island effect (which Hansen claims is long settled), and whether or not the network of stations all over the world comes up to the imperfect standards of the ones in NOAA. (A toy sketch of what a homogenization adjustment looks like follows at the end of this comment.)

    [3] As noted above, at the moment we’re discussing the theoretical/conceptual framework of the system as a whole.

    I really don’t get this at all. I am questioning the creation of a data time-series that is required for the running of models and their evaluation. That is a preliminary step to building a theoretical framework, or it should be. It should be relatively neutral regarding the hypotheses in question. If we can’t agree on the data, the givens, where are we? And we DON’T. That’s my point! You can’t just say, “Let’s not discuss that – we have to be reductionist to a certain degree…”

    [4] The IPCC …is by nature normative (see Kuhn)…

    I don’t expect the IPCC report to question the fundamental basis of its own approach. I expect it to present its approach as correct, based on the scientific work behind it, and this is what it does. I, however, DO question the fundamental basis of the approach. I question the use of computer models as a basis for drawing the conclusions they draw. (I have too much first-hand experience with modelers to feel comfy with that…) For a nice, general treatment of why mathematical models are of limited utility in the realm of environmental analysis, check out Pilkey’s “Useless Arithmetic.” The models he discusses aren’t as complex as GCMs, but I’m not sure that this doesn’t make his arguments stronger rather than weaker in this discussion.

    [5] Sure, or at least new research studies continually frame their hypotheses as critical tests of the conceptual/theoretical framework.

    Talking about disproof or falsification of the temperature record is stretching the usage a bit, because as I said in [4] the base data shouldn’t really imply a hypothesis. I was talking about constructing an alternative temperature record from the same station values. If it is at all plausible, this would cast doubt on the one being used – how to choose which one? This is what McKitrick did. There should be more of that! Weather station data, tree ring data…this isn’t stuff collected in laboratories by trained scientists following all the standard procedures (which sometimes doesn’t happen anyway, even in physics labs, right…?)

    [6] nothing here…

    [7] Again, I don’t want to get into parsing Kuhn and his critics. More directly relevant, Weart, the triumphalist, basically takes your view. Using GCMs is the “norm.” I read a book nearly twenty years ago on climate change – it was called “Ice Time.” (Ice ages were feared then.) It’s a popular book, not a science treatise, but I looked at it recently and was surprised at what I found. Its author devotes a chapter to discussing climate modelers, in about 1986. What impressed me in my second reading was the interviews with scientists that he recounts. Many stated a feeling of deep unease with computer modeling and the usefulness or truthfulness of the results. GCMs were not normative then. Has it changed so quickly? Or has the fad simply triumphed for the nonce? Or is the publicity machine at fault? Or are the funding mechanisms? Computer modeling makes such a perfect grant proposal, so neat and tidy, simple deliverables.

    The firm I used to work at was founded by a man who is regarded as the father of mathematical modeling in the field of water quality assessments, at least in the USA. He was frequently heard to say that he felt the advent of computing power, modeling scores of state variables at time-steps of minutes and hours, had not materially advanced our assessment of the problems beyond what a back of the envelope calculation could achieve. Why should we accept GCMs as the normative paradigm?

    [8] …and there will always be problems of precision.

    When you’re claiming to predict with 90% certainty that most of the claimed warming is due to human activity and that this will result in a “global” temperature increase of a few degrees over decades, you sure better be VERY PRECISE in your calculations. Otherwise, it reminds me of the judge who said, “I’ve hanged innocent men, to be sure, and I’ve let guilty ones go free, no doubt. It all evens out.”
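    Since ‘homogenization’ does a lot of work in [2] and [3] above, here is a toy sketch of the basic operation being disputed (invented data; real procedures involve pairwise comparisons, changepoint tests and station metadata): a candidate station with a spurious step is compared against a nearby reference, the size of the step is estimated from the difference series, and the record is adjusted so it is consistent across the break.

        # Toy homogenization sketch with invented data: the candidate station
        # picks up a spurious +0.6 degC step at index 10 (say, a relocation).
        reference = [14.0 + 0.02 * i for i in range(20)]        # regional trend
        candidate = [r + (0.6 if i >= 10 else 0.0)
                     for i, r in enumerate(reference)]

        diff = [c - r for c, r in zip(candidate, reference)]
        break_idx = 10      # in practice located by a changepoint test on diff

        step = (sum(diff[break_idx:]) / len(diff[break_idx:])
                - sum(diff[:break_idx]) / len(diff[:break_idx]))   # ~ +0.6

        # Remove the step from the post-break segment so the candidate again
        # tracks the reference (one could equally shift the earlier segment).
        adjusted = [c - (step if i >= break_idx else 0.0)
                    for i, c in enumerate(candidate)]
        print(f"estimated step: {step:.2f} degC")

    Whether the real-world adjustments are done well is exactly what is in dispute here; the sketch only shows what kind of operation is being argued about.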

  5. Well, first off, if you think that current trends in climatology and geophysics are a “fad”, then we’re going to get frustrated with each other pretty quickly. The idea that modern climatology is 180 degrees backward is incomprehensible to me – think of all of the data by countless researchers that would have to be not just imprecise, but falsified.

    And all of your complaints appear to be about lack of precision in the measurements. Besides the fact that imprecision could just as easily mean that the models are too modest as that they are too exaggerated, what does that tell us? Nothing of any use. In a complex system, and in every field of science, one must take the findings one has, imperfect precision and all, and go on to see if corroborating data is out there.

    So, do you have ANY contravening data to discuss? Anything at all that I can learn from, if the theoretical framework which I accept is not just imprecise, but flat-out wrong???

  6. Fad’s a strong word. I didn’t mean that current trends in climatology and geophysics are all junk. I meant that the conclusions that are being drawn by workers in those areas on certain questions may be, to a large extent, the product of a fad. That is, a heightened concern about some issue mixed with media hype, funding, and some plausible but not rock-solid science. I am not out to substitute, or equate, Astrology for Astronomy, okay? (Like the ID advocate, Behe.)

    I never said it was 180 degrees backwards. I said I’m not convinced by the arguments about the effect of CO2 on current alleged climate trends. I think that’s a pretty conservative and reasonable way to express skepticism. I remain recusant – I withhold assent.

    My complaint about precision is that the absolute scale of change predicted over a long period is quite small, especially considering the quality of the data input into the calculations. I worry about compounded errors. I know they try to assess this themselves, but I am concerned about confirmation bias, i.e., “we KNOW the temperature record is settled, therefore…” in favor of their approach.

    If the uncertainty and imprecision in the inputs and outputs is great enough, the outputs are just noise. The AGW point is that there is a “clear signal”, an upward trend. The imprecision is always characterized as greater or lesser degrees of increase in global temperature. I’ve not seen a report that says GCM output could imply a downward trend. So, you are correct, the models could be too modest or too strong – that is clear from the reports on their results. I am concerned that they are just WRONG.

    So, do you have ANY contravening data to discuss?

    I would raise these items as starting points for further enquiry:

    Melting of permafrost is often cited as proof of global warming. A recent study by an Alaskan geologist indicates that warming is not a plausible cause for the observed melting of permafrost.

    Criticisms of Hockey Stick. I don’t think this issue is resolved. If the Hockey Stick graph is not statistically valid, what argument is there?

    Ice cores show – everyone agrees on this – that CO2 lags temperature rise. This is prima facie evidence against the AGW view. They have a plausible “positive feedback” explanation of this, but in the absence of independent proof, the lag is contravening evidence.

    Paleo climate analysis shows that CO2 concentrations have been an order of magnitude higher than they are now – temperature has not maintained proportionality. There is correlation at times, but…

    The temperature record, the record, the record…! Consider the possibility that the heat island effect has NOT been properly adjusted. Consider the new possibility, only now being thoroughly investigated, that the network of monitoring stations is biased upwards a degree, not because of urban heat island, but because of bad practice and changes in land use. Consider that these could be magnified several times in some areas of the world. That would comprise imprecision of a magnitude to give one pause, would it not?

    Your remark,

    [do I have anything]…I can learn from, if the theoretical framework which I accept is not just imprecise, but flat-out wrong???

    seems to indicate this attitude:

    The data I’m working with may be bad. It’s all we have. The models we have may not be right, they may be missing something, the coefficients and “parameterizations” may be more like fudge factors, but they seem to match the data. It’s all we have. Not precise, but we have to do something after all…

    Maybe a good enough basis for policy makers who have to decide…NOW. Maybe. But it’s not science.

    Another approach would be this:

    Things are changing. Things are always changing. It is warmer now in a lot of places. Temperatures go up and down, but this warming trend could continue. That would have consequences with which we must deal. GHG as a cause seems plausible, but on the basis of the data we have to input to our models, the changes are relatively small, and the degree of uncertainty is difficult to compute, producing widely varying estimates. Lots of modelers have been using different GCMs, and they all predict a warming trend due to human forcings, but these models are imperfectly calibrated to the data record, which has its own imperfections. There is no fundamental experimental control for them as they are continually tweaked to provide better hind-casts. Finally, as financial planners and Paul Ehrlich know, the future need not be like the past.

  7. Ok – sorry for reading into your mentioning of fad. I have to admit, I thought you were going all conspiracy theorist on me. I screwed up there.

    Thanks for the items for further study. I’d especially like to have a look at the permafrost study.

    I think I have enough on your other points to go back and check into them, and get back to you.

    And you also say:

    seems to indicate this attitude:

    The data I’m working with may be bad. It’s all we have. The models we have may not be right, they may be missing something, the coefficients and “parameterizations” may be more like fudge factors, but they seem to match the data. It’s all we have. Not precise, but we have to do something after all…

    No measurement in science has 100% certainty. I’m rather disheartened by your expectation of certainty – how can I have a discussion with you about science if that is your attitude?

    I’ll get back to you on the other items.

  8. On Permafrost:

    http://adsabs.harvard.edu/abs/2007EOSTr..88..522O

    Osterkamp (2007) is the scientist – that’s all he writes about…

    On Certainty:

    Of course I don’t expect 100% certainty. I suspect and fear that the sensitivity of the calculations and models to the actual level of uncertainty in the base data is greater than most of the predicted changes.

    You know that if you gave people thermometers that only measured to 0.10 degree and sent them out to monitor sites for years, and that they came back and told you that the average temperature was likely to be 76.03 degrees on any given day, that you would just laugh at them and say, “Oh, you mean 76 degrees, don’t you?” That’s analogous to what I’m talking about.

    I am NOT into conspiracy theories, period:

    http://iamyouasheisme.wordpress.com/2006/09/11/spinoza-on-the-essence-of-conspiracies/
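    To put numbers on the thermometer analogy (an illustrative simulation, not any actual station data): averaging many independent readings beats down random reading error roughly as 1/sqrt(N), but a constant bias (bad siting, a miscalibrated instrument) passes straight through the average. The two kinds of error behave very differently, which is why the precision of a single reading and the precision of a long-term mean are not the same question.

        import random, statistics

        random.seed(0)
        TRUE_TEMP = 76.0
        N = 10_000                          # many independent readings

        # (a) random error only: instrument noise, reported to the nearest 0.1 deg
        readings = [round(TRUE_TEMP + random.gauss(0.0, 0.3), 1) for _ in range(N)]
        print(f"random error only : mean = {statistics.mean(readings):.3f}")

        # (b) the same readings with a constant +0.2 deg bias (bad siting, say)
        biased = [r + 0.2 for r in readings]
        print(f"with constant bias: mean = {statistics.mean(biased):.3f}")

        # Averaging recovers the true value far more precisely than any single
        # 0.1-degree reading, but the bias survives the averaging untouched.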

  9. Okay, but wait – two issues I have with that.

    First, most of the predicted changes are far in excess of the actual level of uncertainty you provide in your example (plus/minus 0.1 degree). That’s the level of uncertainty that you fear???

    Second, if you notice in the fourth assessment report, including the chapter on models, they go to great lengths to state up-front the degrees of confidence in both measurements and model outcomes, wherever possible.

    I just don’t get this “precision” problem you have. It sounds as if you’re saying “Look, their predictions of 0.5 degree change in half a century are likely off by as much as 0.1 degree! They must be garbage!”

    Still working on the other items you gave me, and thanks for the permafrost paper. I plan to take my time though and do the effort you put in selecting them justice.

  10. Lichanos,
    my response in PDF format to your issues for further inquiry.

  11. Sorry, I just realized that the hyperlinks didn’t come through in the PDF. The two links are:

    http://www.co2science.org/articles/V11/N6/C2.php

    http://books.nap.edu/openbook.php?record_id=11676&page=R1

  12. On why Lichanos is winning the public….

    I was visiting this site only to get the name of the scientist who did the classic experiments on GHG and find this thread fascinating.

    First, on free convection… I have posted the URL where I document how installing a solar water heater increased my carbon footprint through unforeseen thermal siphon effects. I did finally figure out the problem through modeling. I agree with the statement that forward models are typically nonsense (and the IPCC recognizes this also with their suite of models… called multiple history matching in my line of work) for prediction but are very useful for understanding what happened in the past.

    The only statement in what Lichanos has written that has been disproven is the one about the impact of the improperly designed measurement stations. Turns out these show less warming (makes sense… they are more controlled by anthropogenic sources, which are more constant). Otherwise, his arguments are valid.

    The Al Gore program has people scared. Well, exaggerate with a P(1) scenario and wait a while and see what happens. Folks don’t believe things. The weather in the US isn’t helping much either, but Mother Nature is fickle. You know most Americans… don’t really pay attention to the rest of the world, so if the US is going through a cold spell (and you can include GB in this too) then the Anglocentrism will lead to the conclusion that therefore it is true for the rest of the world.

    Anyway, China has the correct idea. Reduce energy intensity NOW. My FAMILY’s carbon footprint is 10 tonnes per year and everything I did will pay out handsomely except for the stupid solar water heating. Run, tackle, block.

    There is no reason why the United States cannot reduce energy use by 50%. No reason whatsoever.

    Or do you like foreign oil?

    Methinks that if that aspect were better highlighted than all the rest of the Al Gore nonsense we would make more progress.

    After all, a radical reduction in energy use through efficiency measures WILL reduce GHGs.

    (A conservative who accepts the logical conclusions of the Fourier and Arrhenius work)

  13. Hi Thom,
    I couldn’t agree more. Indeed, the problem isn’t that Lichanos was incorrect, it was that he was conveniently presenting those things in the absence of other key information, which paints a skewed picture. And yes, the other part of the problem is that many Americans hear that temperatures are colder than average almost half of the time, completely forgetting what an “average” is.

