AGW Bombshell? A new paper shows statistical tests for global warming fail to find statistically significant anthropogenic forcing

From the journal Earth System Dynamics, billed as “An Interactive Open Access Journal of the European Geosciences Union”, comes this paper, which suggests that the posited AGW forcing effect simply isn’t statistically significant in the observations, but other natural forcings are.

“…We show that although these anthropogenic forcings share a common stochastic trend, this trend is empirically independent of the stochastic trend in temperature and solar irradiance. Therefore, greenhouse gas forcing, aerosols, solar irradiance and global temperature are not polynomially cointegrated. This implies that recent global warming is not statistically significantly related to anthropogenic forcing. On the other hand, we find that greenhouse gas forcing might have had a temporary effect on global temperature.”

This is a most interesting paper, and potentially a bombshell, because they have taken virtually all of the significant observational datasets (including GISS and BEST), along with solar irradiance from Lean and Rind, plus CO2, CH4, N2O, aerosol, and even water vapor data, and put them all to statistical tests (including Lucia’s favorite, the unit root test) against forcing equations. Amazingly, it seems that they have almost entirely ruled out anthropogenic forcing in the observational data, but, allowing for the possibility that they could be wrong, they say:

“…our rejection of AGW is not absolute; it might be a false positive, and we cannot rule out the possibility that recent global warming has an anthropogenic footprint. However, this possibility is very small, and is not statistically significant at conventional levels.”

I expect folks like Tamino (aka Grant Foster) and other hotheaded statistics wonks to attack their premise and tests, but at the same time I look to other, less biased stats folks to weigh in and see how well it holds up. My sense is that the authors, Beenstock et al., have done a pretty good job of ruling out ways they may have fooled themselves. My thanks to Andre Bijkerk and Joanna Ballard for bringing this paper to my attention on Facebook.

The abstract and excerpts from the paper, along with a link to the full PDF, follow.

Polynomial cointegration tests of anthropogenic impact on global warming

M. Beenstock1, Y. Reingewertz1, and N. Paldor2
1Department of Economics, the Hebrew University of Jerusalem, Mount Scopus Campus, Jerusalem, Israel
2Fredy and Nadine Institute of Earth Sciences, the Hebrew University of Jerusalem, Edmond J. Safra campus, Givat Ram, Jerusalem, Israel

 Abstract. 

We use statistical methods for nonstationary time series to test the anthropogenic interpretation of global warming (AGW), according to which an increase in atmospheric greenhouse gas concentrations raised global temperature in the 20th century. Specifically, the methodology of polynomial cointegration is used to test AGW since during the observation period (1880–2007) global temperature and solar irradiance are stationary in 1st differences whereas greenhouse gases and aerosol forcings are stationary in 2nd differences. We show that although these anthropogenic forcings share a common stochastic trend, this trend is empirically independent of the stochastic trend in temperature and solar irradiance. Therefore, greenhouse gas forcing, aerosols, solar irradiance and global temperature are not polynomially cointegrated. This implies that recent global warming is not statistically significantly related to anthropogenic forcing. On the other hand, we find that greenhouse gas forcing might have had a temporary effect on global temperature.

Introduction

Considering the complexity and variety of the processes that affect Earth’s climate, it is not surprising that a completely satisfactory and accepted account of all the changes that occurred in the last century (e.g. temperature changes in the vast area of the Tropics, the balance of CO2 input into the atmosphere, changes in aerosol concentration and size and changes in solar radiation) has yet to be reached (IPCC, AR4, 2007). Of particular interest to the present study are those processes involved in the greenhouse effect, whereby some of the longwave radiation emitted by Earth is re-absorbed by some of the molecules that make up the atmosphere, such as (in decreasing order of importance): water vapor, carbon dioxide, methane and nitrous oxide (IPCC, 2007). Even though the most important greenhouse gas is water vapor, the dynamics of its flux in and out of the atmosphere by evaporation, condensation and subsequent precipitation are not understood well enough to be explicitly and exactly quantified. While much of the scientific research into the causes of global warming has been carried out using calibrated general circulation models (GCMs), since 1997 a new branch of scientific inquiry has developed in which observations of climate change are tested statistically by the method of cointegration (Kaufmann and Stern, 1997, 2002; Stern and Kaufmann, 1999, 2000; Kaufmann et al., 2006a,b; Liu and Rodriguez, 2005; Mills, 2009). The method of cointegration, developed in the closing decades of the 20th century, is intended to test for the spurious regression phenomenon in nonstationary time series (Phillips, 1986; Engle and Granger, 1987). Non-stationarity arises when the sample moments of a time series (mean, variance, covariance) depend on time. Regression relationships are spurious when unrelated nonstationary time series appear to be significantly correlated because they happen to have time trends.

The method of cointegration has been successful in detecting spurious relationships in economic time series data.

Indeed, cointegration has become the standard econometric tool for testing hypotheses with nonstationary data (Maddala, 2001; Greene, 2012). As noted, climatologists too have used cointegration to analyse nonstationary climate data (Kaufmann and Stern, 1997). Cointegration theory is based on the simple notion that time series might be highly correlated even though there is no causal relation between them. For the relation to be genuine, the residuals from a regression between these time series must be stationary, in which case the time series are “cointegrated”. Since stationary residuals mean-revert to zero, there must be a genuine long-term relationship between the series, which move together over time because they share a common trend. If, on the other hand, the residuals are nonstationary, the residuals do not mean-revert to zero, the time series do not share a common trend, and the relationship between them is spurious because the time series are not cointegrated. Indeed, the R2 from a regression between nonstationary time series may be as high as 0.99, yet the relation may nonetheless be spurious.
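The spurious-regression point is easy to reproduce numerically. The sketch below is only an illustration of the concept with invented toy data, not the paper's computation: it regresses one drifting random walk on another that is causally unrelated, gets a high R², and then shows that the residuals fail the stationarity requirement for cointegration.

```python
import numpy as np

# Two causally unrelated random walks given a similar upward drift,
# standing in for any pair of trending, nonstationary series.
# (Invented toy data -- not the paper's series.)
rng = np.random.default_rng(7)
n = 500
x = np.cumsum(0.4 + rng.normal(size=n))
y = np.cumsum(0.4 + rng.normal(size=n))

# OLS of y on x: R^2 comes out high purely because both series trend.
slope, intercept = np.polyfit(x, y, 1)
resid = y - (slope * x + intercept)
r2 = 1.0 - resid.var() / y.var()

# Cointegration check: for a genuine relation the residuals must be
# stationary (mean-reverting).  Here their lag-1 autocorrelation sits
# near 1: they wander like a random walk, so the regression is spurious.
rho = np.corrcoef(resid[:-1], resid[1:])[0, 1]
print(f"R^2 = {r2:.2f}, residual lag-1 autocorrelation = {rho:.2f}")
```

A formal test would apply a unit root test to `resid` (the Engle-Granger second step); the near-unit autocorrelation already signals nonstationarity.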

The method of cointegration originally developed by Engle and Granger (1987) assumes that the nonstationary data are stationary in changes, or first differences. For example, temperature might be increasing over time, and is therefore nonstationary, but the change in temperature is stationary. In the 1990s cointegration theory was extended to the case in which some of the variables have to be differenced twice (i.e. the time series of the change in the change) before they become stationary. This extension is commonly known as polynomial cointegration. Previous analyses of the non-stationarity of climatic time series (e.g. Kaufmann and Stern, 2002; Kaufmann et al., 2006a; Stern and Kaufmann, 1999) have demonstrated that global temperature and solar irradiance are stationary in first differences, whereas greenhouse gases (GHG, hereafter) are stationary in second differences. In the present study we apply the method of polynomial cointegration to test the hypothesis that global warming since 1850 was caused by various anthropogenic phenomena. Our results show that GHG forcings and other anthropogenic phenomena do not polynomially cointegrate with global temperature and solar irradiance. Therefore, despite the high correlation between anthropogenic forcings, solar irradiance and global temperature, AGW is not statistically significant. The perceived statistical relation between temperature and anthropogenic forcings is therefore a spurious regression phenomenon.
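The first-difference/second-difference distinction can be made concrete with invented toy series: one with a linear trend (loosely standing in for temperature) is trend-free after one difference, while one with a quadratic trend (loosely standing in for a GHG forcing) still trends after one difference and needs a second.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
t = np.arange(n, dtype=float)

# Toy stand-ins (not the paper's data): a linear-trend series and a
# quadratic-trend series, each with a random-walk component.
lin = 0.05 * t + np.cumsum(rng.normal(size=n))
quad = 0.01 * t**2 + np.cumsum(rng.normal(size=n))

def trend_slope(y):
    """OLS slope of y against time: a crude check for a remaining trend."""
    return np.polyfit(np.arange(len(y), dtype=float), y, 1)[0]

d1_lin = np.diff(lin)         # constant drift plus noise: trend-free
d1_quad = np.diff(quad)       # still trends upward (slope about 2 * 0.01)
d2_quad = np.diff(quad, n=2)  # trend-free after a second difference
print(trend_slope(d1_lin), trend_slope(d1_quad), trend_slope(d2_quad))
```

In the paper the classification is done with formal unit root tests (ADF, PP, DF-GLS, KPSS) rather than a fitted slope, but the logic (difference until the trend is gone) is the same.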

Data and methods

We use annual data (1850–2007) on greenhouse gas (CO2, CH4 and N2O) concentrations and forcings, as well as on forcings for aerosols (black carbon, reflective tropospheric aerosols). We also use annual data (1880–2007) on solar irradiance, water vapor (1880–2003) and global mean temperature (sea and land combined, 1880–2007). These widely used secondary data are obtained from NASA-GISS (Hansen et al., 1999, 2001). Details of these data may be found in the Data Appendix.

We carry out robustness checks using new reconstructions for solar irradiance from Lean and Rind (2009), for globally averaged temperature from Mann et al. (2008) and for global land surface temperature (1850–2007) from the Berkeley Earth Surface Temperature Study.

Key time series are shown in Fig. 1, where panels a and b show the radiative forcings for three major GHGs, while panel c shows solar irradiance and global temperature. All these variables display positive time trends. However, the time trends in panels a and b appear more nonlinear than their counterparts in panel c. Indeed, statistical tests reported below reveal that the trends in panel c are linear, whereas the trends in panels a and b are quadratic. The trend in solar irradiance has weakened since 1970, while the trend in temperature weakened temporarily in the 1950s and 1960s.

The statistical analysis of nonstationary time series, such as those in Fig. 1, has two natural stages. The first consists of unit root tests in which the data are classified by their order and type of nonstationarity. If the data are nonstationary, sample moments such as means, variances and covariances depend upon when the data are sampled, in which event least squares and maximum likelihood estimates of parameters may be spurious. In the second stage, these nonstationary data are used to test hypotheses using the method of cointegration, which is designed to distinguish between genuine and spurious relationships between time series. Since these methods may be unfamiliar to readers of Earth System Dynamics, we provide an overview of key concepts and tests.


Fig. 1. Time series of the changes that occurred in several variables that affect or represent climate changes during the 20th century. (a) Radiative forcings (rf, in units of W m−2) during 1880 to 2007 of CH4 (methane) and CO2 (carbon dioxide); (b) same period as in panel a but for nitrous oxide (N2O); (c) solar irradiance (left ordinate, units of W m−2) and annual global temperature (right ordinate, units of °C) during 1880–2003.

[…]

3 Results

3.1 Time series properties of the data

Informal inspection of Fig. 1 suggests that the time series properties of greenhouse gas forcings (panels a and b) are visibly different from those for temperature and solar irradiance (panel c). In panels a and b there is evidence of acceleration, whereas in panel c the two time series appear more stable. In Fig. 2 we plot rfCO2 in first differences, which confirms by eye that rfCO2 is not I(1), particularly since 1940. Similar figures are available for other greenhouse gas forcings. In this section we establish the important result that whereas the first differences of temperature and solar irradiance are trend free, the first differences of the greenhouse gas forcings are not. This is consistent with our central claim that anthropogenic forcings are I(2), whereas temperature and solar irradiance are I(1).


Fig. 2. Time series of the first differences of rfCO2.

What we see informally is borne out by the formal statistical tests for the variables in Table 1.


Although the KPSS and DF-type statistics (ADF, PP and DF-GLS) test different null hypotheses, we successively increase d until they concur. If they concur when d = 1, we classify the variable as I(1), or difference stationary. For the anthropogenic variables, concurrence occurs when d = 2. Since the DF-type tests and the KPSS tests reject that these variables are I(1) but do not reject that they are I(2), there is no dilemma here. Matters would have been different had the anthropogenic variables been I(1) according to the DF-type tests but I(2) according to KPSS.
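The classification procedure can be sketched in miniature. The code below is an assumption-laden illustration, not the authors' method: it uses a plain Dickey-Fuller regression with a constant and no augmentation, and a single fixed 5% critical value, where the paper combines augmented ADF, PP, DF-GLS and KPSS.

```python
import numpy as np

def df_tstat(y):
    """t-statistic on g in the Dickey-Fuller regression
    dy_t = c + g * y_{t-1} + e_t (constant term, no augmentation).
    Strongly negative values reject the unit-root null."""
    dy = np.diff(y)
    X = np.column_stack([np.ones(len(y) - 1), y[:-1]])
    beta, *_ = np.linalg.lstsq(X, dy, rcond=None)
    resid = dy - X @ beta
    s2 = resid @ resid / (len(dy) - X.shape[1])
    se = np.sqrt(s2 * np.linalg.inv(X.T @ X)[1, 1])
    return beta[1] / se

def order_of_integration(y, crit=-2.89, dmax=3):
    """Difference y until the unit-root null is rejected; the number of
    differences taken is d.  crit is roughly the 5% DF critical value
    for a regression with a constant and n ~ 100."""
    for d in range(dmax + 1):
        if df_tstat(y) < crit:
            return d
        y = np.diff(y)
    return dmax

rng = np.random.default_rng(3)
noise = rng.normal(size=300)
print(order_of_integration(noise))                        # stationary: d = 0
print(order_of_integration(np.cumsum(np.cumsum(noise))))  # doubly integrated: typically d = 2
```

The paper instead augments the ADF regression with lagged differences until an LM test finds serially independent residuals, and requires the DF-type tests and KPSS (whose null is reversed) to concur before assigning d.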

The required number of augmentations for ADF is moot. The frequently used Schwert criterion applies a standard formula based solely on the number of observations, which is inefficient because it may waste degrees of freedom. As mentioned, we prefer instead to augment the ADF test until its residuals become serially independent according to a Lagrange multiplier (LM) test. In most cases 4 augmentations are needed; however, in the cases of rfCO2, rfN2O and stratospheric H2O, 8 augmentations are needed. In any case, the classification is robust with respect to augmentations in the range of 2–10. Therefore, we do not think that the number of augmentations affects our classifications. The KPSS and Phillips–Perron statistics use the standard nonparametric Newey–West criteria for calculating robust standard errors. In practice we find that these statistics use about 4 autocorrelations, which is similar to our LM procedure for determining the number of augmentations for ADF.

[…]

Discussion

We have shown that anthropogenic forcings do not polynomially cointegrate with global temperature and solar irradiance. Therefore, data for 1880–2007 do not support the anthropogenic interpretation of global warming during this period. This key result is shown graphically in Fig. 3 where the vertical axis measures the component of global temperature that is unexplained by solar irradiance according to our estimates. In panel a the horizontal axis measures the anomaly in the anthropogenic trend when the latter is derived from forcings of carbon dioxide, methane and nitrous oxide. In panel b the horizontal axis measures this anthropogenic anomaly when apart from these greenhouse gas forcings, it includes tropospheric aerosols and black carbon. Panels a and b both show that there is no relationship between temperature and the anthropogenic anomaly, once the warming effect of solar irradiance is taken into consideration.

However, we find that greenhouse gas forcings might have a temporary effect on global temperature. This result is illustrated in panel c of Fig. 3 in which the horizontal axis measures the change in the estimated anthropogenic trend. Panel c clearly shows that there is a positive relationship between temperature and the change in the anthropogenic anomaly once the warming effect of solar irradiance is taken into consideration.


Fig. 3. Statistical association between (scatter plot of) the anthropogenic anomaly (abscissa) and the net temperature effect (i.e. temperature minus the estimated solar irradiance effect; ordinate). Panels (a)–(c) display the results of models 1 and 2 in Table 3 and Eq. (13), respectively. The anthropogenic trend anomaly sums the weighted radiative forcings of the greenhouse gases (CO2, CH4 and N2O). The net temperature effect (as defined above) is calculated by subtracting from the observed temperature in a given year the product of the solar irradiance in that year and the coefficient obtained from the regression of the particular model equation: 1.763 in the case of model 1 (a); 1.806 in the case of model 2 (b); and 1.508 in the case of Eq. (13) (c).

Currently, most of the evidence supporting AGW theory is obtained by calibration methods and the simulation of GCMs. Calibration shows, e.g. Crowley (2000), that to explain the increase in temperature in the 20th century, and especially since 1970, it is necessary to specify a sufficiently strong anthropogenic effect. However, calibrators do not report tests for the statistical significance of this effect, nor do they check whether the effect is spurious. The implication of our results is that the permanent effect is not statistically significant. Nevertheless, there seems to be a temporary anthropogenic effect. If the effect is temporary rather than permanent, a doubling, say, of carbon emissions would have no long-run effect on Earth’s temperature, but it would increase it temporarily for some decades. Indeed, the increase in temperature during 1975–1995 and its subsequent stability are in our view related in this way to the acceleration in carbon emissions during the second half of the 20th century (Fig. 2). The policy implications of this result are major, since an effect which is temporary is less serious than one that is permanent.

The fact that since the mid 19th century Earth’s temperature is unrelated to anthropogenic forcings does not contravene the laws of thermodynamics, greenhouse theory, or any other physical theory. Given the complexity of Earth’s climate, and our incomplete understanding of it, it is difficult to attribute to carbon emissions and other anthropogenic phenomena the main cause for global warming in the 20th century. This is not an argument about physics, but an argument about data interpretation. Do climate developments during the relatively recent past justify the interpretation that global warming was induced by anthropogenics during this period? Had Earth’s temperature not increased in the 20th century despite the increase in anthropogenic forcings (as was the case during the second half of the 19th century), this would not have constituted evidence against greenhouse theory. However, our results challenge the data interpretation that since 1880 global warming was caused by anthropogenic phenomena.

Nor does the fact that during this period anthropogenic forcings are I(2), i.e. stationary in second differences, whereas Earth’s temperature and solar irradiance are I(1), i.e. stationary in first differences, contravene any physical theory. For physical reasons it might be expected that over the millennia these variables should share the same order of integration; they should all be I(1) or all I(2), otherwise there would be persistent energy imbalance. However, during the last 150 yr there is no physical reason why these variables should share the same order of integration. The fact that they do not share the same order of integration over this period nonetheless means that scientists who make strong interpretations about the anthropogenic causes of recent global warming should be cautious. Our polynomial cointegration tests challenge their interpretation of the data.

Finally, all statistical tests are probabilistic and depend on the specification of the model. Type 1 error refers to the probability of rejecting a hypothesis when it is true (false positive) and type 2 error refers to the probability of not rejecting a hypothesis when it is false (false negative). In our case the type 1 error is very small because anthropogenic forcing is I (1) with very low probability, and temperature is polynomially cointegrated with very low probability. Also we have experimented with a variety of model specifications and estimation methodologies. This means, however, that as with all hypotheses, our rejection of AGW is not absolute; it might be a false positive, and we cannot rule out the possibility that recent global warming has an anthropogenic footprint. However, this possibility is very small, and is not statistically significant at conventional levels.

Full paper: http://www.earth-syst-dynam.net/3/173/2012/esd-3-173-2012.pdf

Data Appendix.



151 Responses to AGW Bombshell? A new paper shows statistical tests for global warming fail to find statistically significant anthropogenic forcing

  1. Jimmy Haigh says:

    This looks just like what those of us with common sense have been saying for donkey’s years: it’s basically got nothing to do with us.

  2. John Peter says:

    And the latest UAH measurements show a further steady decline for December to +0.20 degree C
    http://www.drroyspencer.com/2013/01/uah-v5-5-global-temperature-update-for-december-2012-0-20-deg-c/

  3. Stephen Richards says:

    “Indeed, the R2 from a regression between nonstationary time series may be as high as 0.99, yet the relation may nonetheless be spurious”

    So, two parameters may correlate to 99% but still not be cause and effect?

    I like this paper. They have detailed their method(s), the data, their caveats, their uncertainties and provided conclusions actually based on the body of the paper. It’s not like a Crimatology paper at all.

  4. mpainter says:

    Yes, hardly a bombshell, just a new statistical appraisal of the record with the obvious conclusion.

  5. Stephen Richards says:

    Panel c clearly shows that there is a positive relationship between temperature and the change in the anthropogenic anomaly once the warming effect of solar irradiance is taken into consideration

    That’s interesting because it seems to me to say that radiative equilibrium is a very fast process, but I still find it difficult to understand how the CO2 effect can simulate a ‘burst’ of heat energy which then fades at the same time that CO2 concentration is steadily increasing.

  6. Vince Causey says:

    Does look rather technical. I would be interested in what Steve McIntyre has to say on their methods – or anyone else with good stats knowledge.

  7. Stephen Richards says:

    mpainter says:

    January 3, 2013 at 9:15 am
    Yes, hardly a bombshell, just a new statistical appraisal of the record with the obvious conclusion.

    I’m not sure that all their conclusions are ‘obvious’. The paper is ‘quite’ unique in its analytical method and very thorough.

  8. Brent Buckner says:

    David Stockwell has been following this paper since earlier versions (most recently http://landshape.org/enm/agw-doesnt-cointegrate-beenstocks-challenging-analysis-published/ ).

    Thanks for drawing wider attention to it.

  9. Peter Miller says:

    So, in summary:

    1. There are too many unknowns to accurately model the Earth’s climate.

    2. Man may, or may not, have been responsible for part of the ~0.7 degrees C warming over the past century.

    3. The effect of man on our planet’s climate is small and temporary.

    Makes sense to me, but these reasonable concepts will give your average climate modeller and/or CAGW cult member a total hissy fit.

    I like the fact they use Mann as a reference – I am not sure in what context, but my guess would be to demonstrate statistical BS.

  10. Stephen Richards says:

    I like this paper. For the first time in years, since I studied Lamb’s work, I want to read, inwardly digest and understand a paper on climate. It sure beats the crap out of the BBC-Grauniad journal.

  11. Not only statistically insignificant, not significant by any other measure either. This is all still nothing more than masturbating within the error band. Since it demonstrates that, I guess us AGW atheists will all say amen.

  12. Juraj V. says:

    Common sense says that there is no difference at all between the 1910–1940 warming and the 1975–2005 warming, which was even smaller.

  13. BioBob says:

    @ Vince Causey
    Don’t expect to get the bottom line on stats because the reality is always ignored but brutally simple.

    Virtually ALL land temperature data prior to the age of electronics (and the vast majority since) is worthless for use in statistics, since it is all based on single non-random, non-replicated samples (n = 1 sample per day). N = 1 means data with unknown variance and unknown error, in which virtually all parametric statistical requirements are NOT met. Even a mean is worthless, since the shape of the distribution of real temperature values is unknown and each day’s temperature population is NOT the same as any other’s.

    You don’t need a statistics expert – you only need to look up the known requirements for any particular stats type to see the truth.

  14. philjourdan says:

    It is always a pleasure to see real science being conducted.

  15. Kevin Kilty says:

    …; it might be a false positive…

    The authors mean by this the tentative acceptance of the null hypothesis when it is in fact not true. So the meaning appears backward to my way of thinking–it sure is for medical diagnostic tests.

    I’m pretty sure there is an anthropogenic component, but it is so small it is lost in the other noise, and this belief animates my appraisal of the issue. Lots of pathological (Irving Langmuir’s term) scientific results originated in trying to interpret a tiny signal that was drowning in noise. By the way, AGW as presented publicly itself contains several elements from Langmuir’s list, most prominently the use of ad hoc argumentation to dismiss contrary results.

  16. PRD says:

    I heard Mike M. and James H. both say, “Ouch”.

  17. Jpatrick says:

    “…along with solar irradiance from Lean and Rind, and CO2, NH4, N20…”
    I think you mean CH4 instead of NH4

    REPLY: yup. fixed

  18. Jim Cripwell says:

    Is this not just another peer reviewed paper that the IPCC must ignore in the AR 5 if the overconfident conclusions of the other 4 reports are not going to be shown to be just plain wrong? After all, new science can be included in the AR 5 up to March 2013.

  19. richard telford says:

    It is a great shame that they did not test if their methods had any statistical power. This would have been easy to do using GCM output and would have greatly strengthened their paper. Without knowing what their type II error rate is, it is impossible to evaluate the paper properly.

  20. D Böehm says:

    Whenever I look at this chart, I question whether human emissions can have the global warming effect claimed by the climate alarmist crowd. There is obviously much we do not know, including all the sources and sinks of atmospheric CO2.

    As this paper makes clear, AGW has never been empirically measured. If it exists, it is simply too small to be measurable. At current concentrations, CO2 is only a tiny, bit player; a minuscule 3rd order forcing that is easily swamped by second- and first-order forcings.

    More CO2 will not have any effect; the radiative response was used up in the first few dozen ppm. Adding more CO2 is just painting the window again. As we now observe, adding another layer of paint has caused no further global warming.

  21. John Mathon says:

    This makes sense to me because I noticed some time ago that a large spike in temps occurred from 1910–1940, yet there was no significant increase in CO2 during this period. Clearly there were other phenomena at work, and it was unclear to me why the IPCC was so willing to discount whatever caused that warming. I realized that the hockey stick may have been the major factor affecting their opinion, because essentially they were denying that temperatures had varied in the past, and therefore natural causes of warming could be discounted. But since it is clear that the MWP and LIA and numerous other ups and downs occurred in the record, it’s apparent that these other forcings aren’t zero, don’t net out to zero and are nontrivial. Therefore the assumption that the only thing that could have caused the temperature change from 1978–1998 was anthropogenic is not as simple as thought. Judith Curry has pointed out that the attribution is not as clear as it was thought to be. Numerous studies have shown that this attribution is more complex than was contemplated. The temperature since 1997 has been statistically flat, whereas CO2 has been climbing. After 16 years of non-correlation, it’s clear that the previously assumed direct and clear association was a fluke, and therefore the basis for that attribution had to be questioned. This paper simply affirms that the attribution is clearly not as simple as was thought just 5 or 10 years ago. The fact that the association is I(2) implies that there is significant lag (at best) in the correlation. Since 1978–1998 showed no lag, it can be assumed that CO2 was not a significant cause of that temperature change. It may have been a factor, but then it is likely there is no long-term effect of CO2, and the contribution of CO2 to that rise was a small portion of that change and is temporary.

  22. MarkW says:

    “That’s interesting because it seems to me to say that radiative equilibrium is a very fast process, but I still find it difficult to understand how the CO2 effect can simulate a ‘burst’ of heat energy which then fades at the same time that CO2 concentration is steadily increasing.”

    Could indicate a system with significant negative feedbacks that are delayed in their effects.

  23. Rob Ricket says:

    Do the statements below make sense to anyone? Are the authors inferring there is a temporary forcing effect that magically disappears independent of solar irradiance levels? How should we account for this return to stasis… increased carbon sinking through plant growth?

    “Nevertheless, there seems to be a temporary anthropogenic effect. If the effect is temporary rather than permanent, a doubling, say, of carbon emissions would have no long-run effect on Earth’s temperature, but it would increase it temporarily for some decades.”
    “Indeed, the increase in temperature during 1975–1995 and its subsequent stability are in our view related in this way to the acceleration in carbon emissions during the second half of the 20th century.”

  24. Lubos Motl says:

    I feel sort of uncomfortable with the verb “fail” in the title because it suggests that finding a man-made fingerprint would be a “success” while finding it’s not there is a “failure”. In this way, the verb introduces a strange bias or lack of impartiality, either because the writer of the word “fail” is partial himself (not the case here) or because he suggests that the authors of the scientific papers are biased in the same way (I don’t see reasons for this accusation, either).

  25. ‘The perceived statistical relation between temperature and anthropogenic forcings is therefore a spurious regression phenomenon”

    Spurious: Not being what it purports to be; false or fake: “spurious claims” (of a line of reasoning) Apparently but not actually valid: “this spurious reasoning results in nonsense”.
    Synonyms: false – sham – counterfeit – bogus – mock – phony

    Regression: A return to an earlier stage of life or a supposed previous life, esp. through hypnosis or mental illness, or as a means of escaping.

    Phenomenon: A fact or situation that is observed to exist or happen, esp. one whose cause is in question.

    This paper hits the nail on the head.

  26. pdtillman says:

    Re: David Stockwell, http://landshape.org/enm/agw-doesnt-cointegrate-beenstocks-challenging-analysis-published/

    Stockwell thinks McKitrick was one of their peer reviewers. Plus, he’s no statistical slouch himself. I hope Wm. Briggs takes a look. And McIntyre too, of course. Interesting times.

  27. Steveta_uk says:

    I think Lubos is perhaps reading too much into the use of the word “fail” – perhaps a cultural difference is involved.

    If I try to find something, but fail, this says nothing about whether I wanted to find it or not. I might be trying to find some lost money (failure=bad) or trying to find lumps in tender parts of my body (failure=good).

  28. Matthew R Marler says:

    I am glad that they finally got it published, even though it is an online journal. They originally intended it for Nature (judging from an annotation at their website when the first draft was posted.)

  29. Michael Moon says:

    Amazing. “Correlation is not causation,” anyone? These guys have stuck their necks out far, far. Will they ever get funded again? Will whoever reviewed this paper ever get funded again? Will the Red Queens of Climatology scream “Off with their heads!!”? Will Michael Mann’s head spontaneously combust? Will Hansen chain himself to these guys’ doors?

  30. Kevin Kilty says:

    Perhaps someone better grounded in these non-stationary tests can confirm my interpretation. The direct effect of CO2 is logarithmic, so we should not expect that effect to be I(1); but including feedbacks from water vapor, the effect can appear linear over small temperature ranges and thus appear I(1). Yet the authors treat water vapor as a separate forcing, so why is there any statistical import in CO2 being I(2) and temperature being I(1), when it is the log of CO2 that matters?

    Is it because a logarithmic non-stationarity should appear sub-I(1)? It would certainly grow more slowly than a linear one. Am I seeing this correctly?
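A toy simulation (my own sketch, not from the paper) of the point at issue: if the log of a series follows a unit-root process with drift, i.e. roughly constant percentage growth per step, then the increments of the level grow over time while the increments of the log stay near the drift. This is why integration order can differ between a series and its log.

```python
import math
import random

random.seed(42)

# Sketch (not from the paper): simulate a series whose LOG is a random
# walk with drift, the way CO2 forcing enters logarithmically. The
# starting level and parameters are arbitrary illustration values.
n = 500
drift, sigma = 0.005, 0.002
log_x = [math.log(280.0)]
for _ in range(n - 1):
    log_x.append(log_x[-1] + drift + random.gauss(0.0, sigma))
x = [math.exp(v) for v in log_x]

def mean(v):
    return sum(v) / len(v)

# Increments of the LEVEL grow as the level grows ...
dx = [b - a for a, b in zip(x, x[1:])]
early_dx, late_dx = mean(dx[:100]), mean(dx[-100:])

# ... while increments of the LOG hover around the constant drift.
dlog = [b - a for a, b in zip(log_x, log_x[1:])]
early_dlog, late_dlog = mean(dlog[:100]), mean(dlog[-100:])

print(early_dx, late_dx)      # late level increments are much larger
print(early_dlog, late_dlog)  # both close to drift = 0.005
```

Nothing here settles the I(2)-vs-I(1) question; it only illustrates why testing the level and testing the log can give different orders of integration.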

  31. BrianJay says:

    Interesting, but they are Jews living in Israel. Guess what the first line of attack will be.

  32. Stephen Wilde says:

    If the increase in CO2 did have an effect with a delayed negative response then the shift in the global air circulation to a more zonal pattern and probably also more poleward climate zones and jet stream tracks (as was actually observed) would be a plausible solution because that would alter the rate of energy loss to space.

    I don’t put it down to anthropogenic causes at all. They would be barely discernible.

    Instead, more solar energy entered the oceans which altered the CO2 absorption / release balance driving atmospheric CO2 up and then the circulation pattern adjusted.

    However even that would be a trivial effect swamped by the faster water cycle which resulted from the solar forcing processes that I have described elsewhere.

    A natural air circulation response to solar variability has been in control all along subject only to modulation by internal oceanic variability.

    Anything else is miniscule in comparison.

    And we are now in a more meridional air circulation regime which suggests that the whole process has gone into reverse notwithstanding increasing CO2 emissions.

  33. BrianJay says:

    Moderators Spelling mistake above “br” should be “be”.

    [Please be more specific. — mod.]

    [Reply: Fixed. previous entry with Israel. -ModE]

  34. mpainter says:

    Rob Ricket says: January 3, 2013 at 9:55 am

    Do the statements below make sense to anyone? Are the authors implying there is a temporary forcing effect that magically disappears independent of solar irradiance levels? How should we account for this return to stasis… increased carbon sinking through plant growth?

    “Nevertheless, there seems to be a temporary anthropogenic effect. If the effect is temporary rather than permanent, a doubling, say, of carbon emissions would have no long-run effect on Earth’s temperature, but it would increase it temporarily for some decades.”
    “Indeed, the increase in temperature during 1975–1995 and its subsequent stability are in our view related in this way to the acceleration in carbon emissions during the second half of the 20th century.”
    ================================
    You have put your finger on the nub.
    “a temporary anthropogenic effect” concerning CO2 is a new idea to me. I see nothing more in this paper than a matching of the temperature record against CO2 emissions and observing that there is no correlation except in the 1975-95 trend, to which they baldly attribute a “temporary” effect. This is straining out gnats and swallowing the strainer. All of this heaved up in a fine flourish of improved statistical techniques.

  35. davidmhoffer says:

    Stephen Richards;
    That’s interesting because it seems to say that radiative equilibrium is a very fast process, but I still find it difficult to understand how the CO2 effect can simulate a ‘burst’ of heat energy which then fades at the same time that CO2 concentration is steadily increasing.
    >>>>>>>>>>>>>>

    That’s not my interpretation. What they are doing is heavy duty nasty statistics to classify different ripples on the pond. A rock thrown into the pond creates a set of ripples that die out, but then the pond level is pretty much unchanged. But an instantaneous increase in stream flow coming in, even a small one, would also create ripples across the pond surface, but when the ripples die out, the pond level would indeed be higher.

    So they’ve classified CO2 as a rock thrown into the pond. Perturbs the system as it is added, but makes very little long term difference once the ripples die out, which fits the observational evidence rather nicely.

    On the other hand, they’re trying to identify ripples from half a dozen different things all at once, and I’m not sure they actually have enough data to do that. I can stand on the shore of my favourite lake and tell you if a ripple on the surface came from a kid doing a cannon ball off the dock or a passing power boat. But I can’t look at the surface as a power boat blows by, a kid does a cannon ball off the dock, the dam at the far end gets raised, it starts to rain, wind changes direction and a sea gull takes a dump all at the same time and tell you which ripples belong to what.
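The rock-vs-stream-flow analogy maps onto a standard time-series contrast, which a toy script can show (my own sketch, not the paper's model): an AR(1) process with coefficient below one forgets a one-off unit shock, while a unit-root process keeps the full shock forever.

```python
# Toy contrast between the "rock" (transitory shock) and the "stream
# flow" (permanent shock): an AR(1) with phi < 1 forgets a unit impulse,
# while a unit-root process (phi = 1) retains it indefinitely.

def impulse_response(phi, steps):
    """Path of x_t = phi * x_{t-1} after a one-off unit shock at t = 0."""
    x, path = 1.0, []
    for _ in range(steps):
        path.append(x)
        x *= phi
    return path

ripples = impulse_response(0.7, 50)  # rock in the pond: dies out
pond = impulse_response(1.0, 50)     # higher inflow: level shift persists

print(ripples[-1] < 1e-6, pond[-1] == 1.0)  # True True
```

The paper's "temporary effect" claim is, in these terms, that CO2 shocks behave like the first path rather than the second.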

    Looking forward to commentary by rgbatduke, Leif, and SteveM….

  36. Gail Combs says:

    Stephen Richards says:
    January 3, 2013 at 9:17 am

    Panel c clearly shows that there is a positive relationship between temperature and the change in the anthropogenic anomaly once the warming effect of solar irradiance is taken into consideration

    That’s interesting because it seems to say that radiative equilibrium is a very fast process, but I still find it difficult to understand how the CO2 effect can simulate a ‘burst’ of heat energy which then fades at the same time that CO2 concentration is steadily increasing.
    >>>>>>>>>>>>>>>>>>>>
    I am not sure what the term in statistics is, but A may be correlated with B while it is B, not A, that actually drives C.

    In this instance we know there was a lot of inventing going on in the 1800s, but, at least in the USA, farms were still where the people were.
    1790 Farmers made up about 90% of labor force, population 5,308,483 (1800)
    by 1850, Farmers made up 64% of labor force, population 23,191,786
    by 1900 Farmers made up 38% of labor force, population 75,994,266
    by 1920 Farmers made up 27% of labor force, population 105,710,620
    by 1940 Farmers made up 18% of labor force, population 131,820,000
    by 1960 Farmers made up 8.3% of labor force, population 180,007,000
    by 1980 Farmers made up 3.4% of labor force, population 227,020,000
    by 1990 Farmers made up 2.6% of labor force, population 246,081,000

    This means not only did the population increase, but the population became concentrated in towns and cities where there were jobs. The U.S. in 1800 had a per-capita energy consumption of about 90 million Btu. In 1949, U.S. energy use per person stood at 215 million Btu, and now it is 335.9 million Btu.

    Therefore, what I think you are seeing is that the increase in temperature linked to CO2 came from humans becoming more ‘concentrated’ in one location and using more energy. This was the local UHI effect, because thermometers were moving from rural sites to cities and finally to airports (climate scientists classify airports as ‘rural’). For the last decade there just is no other fiddling available to make the temperatures increase, especially after the station drop-out (link), and the oceans are not cooperating.

    AIRPORTS:
    Digging in the clay: Location Location Location

    CHIEFIO: More Airports Hotter Than ‘nearby’ Stations

  37. aaron says:

    Lubos, that’s common language for these types of tests, used simply to indicate the sign of the result.

  38. tgmccoy says:

    Can someone smarter than me tell me why THIS: http://weather.unisys.com/surface/sfc_daily.php?plot=ssa&inv=0&t=cur isn’t showing cold oceans? Clearly, to me
    we seem to have a no-Niño condition….

  39. lsvalgaard says:

    Stephen Richards says:
    January 3, 2013 at 9:17 am
    Panel c clearly shows that there is a positive relationship between temperature and the change in the anthropogenic anomaly once the warming effect of solar irradiance is taken into consideration
    As long as they use an outdated [really Lean 2005] solar irradiance reconstruction, they can’t remove it in any meaningful way.

  40. Stephen Wilde says:

    ” A rock thrown into the pond creates a set of ripples that die out, but then the pond level is pretty much unchanged. But an instantaneous increase in stream flow coming in, even a small one, would also create ripples across the pond surface, but when the ripples die out, the pond level would indeed be higher.”

    That is a neat summary of the position that I have been trying to get across for some time.

    Increases in atmospheric mass, the strength of the gravitational field or the level of insolation would raise the ‘pond’ level because they change the amount of energy that the system can hold.

    Everything else including changes in radiative characteristics just create ripples that die away because they just serve to redistribute energy rather than adding anything to the total energy available.

    It is a matter of variable flow rates for multiple components in a single system, whereby timing is everything. At base, the time that matters is the time it takes for incoming energy to escape the atmosphere once it has arrived, and that time is a function only of mass and gravity, with insolation supplying the flow of energy that mass and gravity interfere with.

    As far as I know the science relating to gas clouds in space, suns and their formation and planetary gas giants gives no regard to radiative characteristics in determining internal temperatures. It is all a matter of mass and gravity. So it should also be for atmospheres around lumps of rock.

    If this paper brings that scenario to the fore it can only be to the good.

  41. Tony McGough says:

    I don’t understand any of the technicalities of the Israeli paper, but gather that it does not consider any physical effects at all – just examines correlations (and lack of correlations) between the seven or eight datasets they acquire (from sound sources). They then conclude that man-made sources have little or no effect on temperatures, provided that they have not found an anomalous false negative.

    It’s nice to have these apparently cool-headed statisticians saying what one can simplistically discern from a simple inspection of the CET (central England temperature) record – temperature rises ain’t nothing to do with us, guv.

    I would like some of our learned regular contributors to give the paper a shake-down, please: I have learned to appreciate their analyses.

  42. rgbatduke says:

    Good paper. Not sufficient, but a powerful argument. My one concern would be the same one I often use in the other direction — if the Earth’s instantaneous climate state is viewed as a point tracing a multidimensional orbit that, in the very loosest of terms, is “around” some center of quasi-stability — not a stationary Poincare attractor but perhaps a set of attractors in a rugged landscape — then one has to make certain assumptions in order to do the timeseries analysis they suggest. I’m not certain those assumptions are satisfied.

    The key question is indeed the one about residuals of first and second order differences, but those have a physical interpretation as being components of a gradient, a partial differential term in a model, or components of a higher order partial derivative.

    There are many statistical models that average to zero along any given axis so that there is no single variable linear trend but that have profound multivariate trends. The classic example is the “exclusive or” distribution, a distribution where A exclusive or B has weight 1, while A and B or not A and not B have weight 0. If you look at the distribution along the A axis (alone) it is uniform, and A looks like it is not a predictor. If you look at B alone ditto. Yet from a knowledge of A and B one can predict the outcome perfectly.
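The exclusive-or distribution described above is easy to verify numerically (a minimal sketch of the same example, my code, not rgb's):

```python
import random

random.seed(0)

# rgb's "exclusive or" distribution: outcome = A XOR B. Marginally,
# neither A nor B predicts the outcome; jointly they determine it exactly.
rows = [(a, b, a ^ b) for a, b in
        ((random.randint(0, 1), random.randint(0, 1)) for _ in range(10000))]

def frac_positive(subset):
    subset = list(subset)
    return sum(out for _, _, out in subset) / len(subset)

# Conditioning on A alone leaves the outcome near 50/50 ...
p_a1 = frac_positive(r for r in rows if r[0] == 1)
p_a0 = frac_positive(r for r in rows if r[0] == 0)

# ... but knowing both A and B predicts it perfectly.
p_a1_b0 = frac_positive(r for r in rows if r[0] == 1 and r[1] == 0)
p_a1_b1 = frac_positive(r for r in rows if r[0] == 1 and r[1] == 1)

print(round(p_a1, 2), round(p_a0, 2))  # both near 0.5
print(p_a1_b0, p_a1_b1)                # 1.0 0.0
```

Any single-variable test applied to A or B here reports "no relationship", which is exactly the separability worry being raised about the cointegration analysis.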

    This is only the simplest version of this difficulty (non-separability) involving two binary variables, but in statistical modeling it is profound and pervasive. It isn’t clear from their discussion whether or not they assumed separability. If not, the best that they could conclude from their result is that they do not find evidence of a separable (unconditional) contribution from CO_2 compared to e.g. insolation, not that it may not be an important causal factor. I also find it difficult to physically justify CO_2 as producing a local effect that is then neutralized over decades.

    So, interesting, powerful argument well made, but not a home run. It does open the way for the future, though.

    It also leaves open the question of “which solar data”. Once again, we await Leif, who will remind us that even the I(1) result for insolation depends on which proxy reconstruction of insolation one uses. Back in 1880 they weren’t doing electronics so much. I don’t know if any clever lad or lass used e.g. photographic film to infer solar intensity over the 30+ years before e.g. the photoelectric effect and the invention of tubes permitted some sort of direct electronic measure, and as we’ve been told repeatedly, sunspot counts are a remarkably inconsistent proxy that is very difficult to retroactively fix in the process of reconstructing solar state over even/only 1.5 centuries.

    So it could be that none of the things studied have a first order effect on the climate, it is basically in some sort of random walk tilted primarily by things yet unstudied. Shades of Koutsoyiannis! Those things could all be driving the climate vigorously back and forth across some quasi-equilibrium as second order stuff, while the primary driver is quietly being ignored.

    rgb

  43. DirkH says:

    richard telford says:
    January 3, 2013 at 9:47 am
    “It is a great shame that they did not test if their methods had any statistical power. This would have been easy to do using GCM output and would have greatly strengthened their paper.”

    Thanks, that made me laugh.
    Richard, have the computer kiddies in the modeling departments learned how to model convective fronts in the meantime?

    Cloud formation?

    The QBO?

    Oh. I thought so.

  44. tarpon says:

    Apparently no one reads or knows physics. If they did, they would know the heat-trapping ability of CO2 is logarithmic. It’s not possible for it to do what the warmists are trying to convince people it can.

  45. Gail Combs says:

    BrianJay says:
    January 3, 2013 at 10:37 am

    Interesting, but they are Jews living in Israel. Guess what the first line of attack will be.
    >>>>>>>>>>>>>>>>>>>>>>>>>>>>
    That does not work, because it is “Politically Incorrect” to use race. If they try to connect them to the banks, that again does not work, because the World Bank (and Robert Watson) is completely entangled with the IPCC and CAGW, including the World Bank’s last report.

  46. DeWitt Payne says:

    Not this again.

    This is a dud, not a bombshell. Cointegration tests were designed by economists to rule out spurious correlations between things when there is no known mechanism to relate them. We know that there must be a correlation between surface temperature and ghg concentration. (We do, in fact, know this. Clear sky radiative transfer is considered a solved problem in Physics. See scienceofdoom.com and http://people.su.se/~rcaba/teaching/PhysMetLectNotes.pdf for example.) What we don’t know is the magnitude.

    Testing for a trend without detrending the data first reduces the statistical power of the test. Goodness-of-fit calculations are done on the residuals after fitting, not the raw data. There will be more false negatives if you don’t detrend, especially when there may be low frequency oscillations like the AMO influencing the measured temperature (which will also make the raw data look like unit roots are present). There’s a good discussion of this in respect of the C. Franzke paper at The Blackboard.
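The fit-then-test-residuals point can be sketched in a few lines of plain Python (my illustration, not DeWitt Payne's code): fit and remove a linear trend by ordinary least squares, then work with the residuals rather than the raw series.

```python
import math

# Minimal OLS detrend (sketch): remove a fitted linear trend so that
# goodness-of-fit work happens on the residuals, not the raw data.
def detrend(y):
    n = len(y)
    t = list(range(n))
    t_bar, y_bar = sum(t) / n, sum(y) / n
    slope = (sum((ti - t_bar) * (yi - y_bar) for ti, yi in zip(t, y))
             / sum((ti - t_bar) ** 2 for ti in t))
    intercept = y_bar - slope * t_bar
    return [yi - (intercept + slope * ti) for ti, yi in zip(t, y)]

# Linear trend plus a low-frequency oscillation (an AMO-like stand-in):
series = [0.01 * i + 0.3 * math.sin(2 * math.pi * i / 60) for i in range(120)]
resid = detrend(series)

# The residuals keep the oscillation but drop the trend; with an
# intercept in the fit, their mean is zero by construction.
print(abs(sum(resid) / len(resid)) < 1e-9, max(resid) > 0.1)
```

A low-frequency component like the one left in `resid` is exactly the sort of signal that can make raw data look like it contains a unit root.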

  47. Paul Westhaver says:

    Would someone send a copy of this paper to that mouth breathing thug, Seth Borenstein at the AP?

    With some hope, and some education aids, (blocks, counting sticks, staking rings) he may be able to comprehend it. That goes for Martin Mittelstaedt, no longer doing chicken-little-the-end-of-the-world-is-coming articles for the Globe and Mail.

  48. Dr. Acula says:

    Being well-versed in the Austrian School of economics, I have a pretty low opinion of econometric techniques.

    Sorry, but this paper seems to be playing mathematical games to me. It’s not at all obvious why “cointegration tests” should be trusted. What empirical evidence is there to justify using cointegration tests? Why were certain tests used and not their alternatives?

    I’m guessing this isn’t really science, but rather the opinion of (perhaps seasoned) econometricists engaging in their art.

    It did not take me long to find troubling information about cointegration: http://www.capco.com/capco-institute/capco-journal/journal-32-applied-finance/the-failure-of-financial-econometrics-asses

    “This paper demonstrates that the results obtained by using different cointegration tests vary considerably and that they are not robust with respect to model specification. It is also demonstrated that, contrary to what is claimed, cointegration analysis does not allow distinction between spurious relations and genuine ones. Some of the pillars of cointegration analysis are not supported by the results presented in this study.”

  49. Kasuha says:

    Wow, just wow. Not only do they exclude an anthropogenic impact on temperature, they even provide indirect proof that feedbacks are negative and strong enough to compensate for it.
    This would be a great reference to add to the upcoming IPCC report, wouldn’t it? I believe it would be no problem to add, since the report already references papers which have not even been published yet…

  50. DeWitt Payne says:

    rgbatduke,

    Charles Greeley Abbot was the first to accurately estimate the value of the solar constant early in the twentieth century using a pyrheliometer mounted on a balloon at 25km altitude. He attempted for something like forty years to measure the variability with little success as his measurements had to be made from the surface. So accurate and precise direct measurement of the solar constant had to wait for satellites. Any inference of the variability of the solar constant using proxies is subject to all the problems of proxy measures in general.

  51. Billy Ruff'n says:

    This should give Steve McIntyre something to do when he gets home from Asia. It will be interesting to see what his take is on the statistical methods employed.

  52. Bob says:

    Anthony, do what you can to keep this paper from Mosher – he might just wax ineloquently.

  53. rgbatduke says:

    One of many problems I have with reconstructions of things like irradiance and temperature is that they never seem to come with an error analysis. The curves above (wherever they are from) are no exception — we see a simple jiggy line that is supposed to be “temperature”, or “irradiance”, or “CO_2 level” back to 1880, with no error bars at all.

    Yet those error bars have to exist, and have to be rather large for anything inferred by means of proxies or measured with crude instrumentation at the beginning of the 20th century or earlier.

    It would actually be really lovely to have honest error bars — with any reasonable interpretation of error — both in the figures and, one would expect, in use in the statistical analysis of the timeseries. Otherwise one has a compounding of assumptions and errors.

    Lacking both error bars and a single set of solar data that the entire community endorses within those error bars (so that the error bars reflect among other things disagreement within that community) it is going to be impossible to create a statistical study of solar state and global climate that means anything at all. This (lack of a) model then becomes an important Bayesian prior in further statistical analysis of possible causes, because the amount of warming one attributes to CO_2 clearly has to depend to some extent on how much you attribute to insolation, so if the latter is uncertain the former is even more uncertain (and vice versa). Errors tend to grow like SE = \sqrt{SE_1^2 + SE_2^2} after all, and that’s for simple linear models with favorable assumptions — it can be much worse.
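The quadrature rule rgb cites is a one-liner; a minimal sketch (my illustration, not from the comment) of combining two independent standard errors:

```python
import math

# Combining independent standard errors in quadrature, as in
# SE = sqrt(SE1^2 + SE2^2) for the sum of two independent estimates.
def combined_se(se1, se2):
    return math.sqrt(se1 ** 2 + se2 ** 2)

# The combined error exceeds either input but is less than their sum:
print(combined_se(3.0, 4.0))  # 5.0
```

The point being made: feeding an uncertain insolation estimate and an uncertain CO2 attribution into one model compounds the uncertainty rather than averaging it away.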

    Leif, is there a set of solar data that everybody in the solar community endorses (or at least, has their disagreements duly entered in the form of additional uncertainty in the claims)? Or is it really just a matter of flipping coins and grabbing a paper at random? Picking the paper (either way) that produces the conclusions you want to assert?

    It would be very useful to see this computation redone not for the 1880-2012 data, but only for the e.g. 1979-2012 data that is moderately reliable. The idea is actually a good one and I’ve looked at it myself — the problem can be reduced to looking at global temperature and the e.g. Mauna Loa CO_2 curve and comparing them — there are obviously lots of ways of mapping the one monotonic function (CO_2) into a linear (over this timescale) model for temperature plus noise, but the noise is then many times larger than the trend, the fit interval is short, and one ignores completely the dynamics of the noise. It is clear at a glance, however, that there is no short-run correlation between changes in temperature and CO_2 level — only the weak trend over the entire timeseries, which puts 2012 as the 9th warmest year in 33, remarkably close to both mean and median. I don’t need to do a Student’s t test to measure p to tell you that p for the null hypothesis is not going to be reassuringly low over the entire interval.

    They should also attempt a nonstationary timeseries analysis, treating the temperature like a Hurst-Kolmogorov variable with a possibly directed stochastic noise term with respect to the hypothesized drivers.

    rgb

  54. A.D. Everard says:

    Very thorough, a brilliant paper.

  55. Resourceguy says:

    It’s about time someone stepped up to do this kind of unit root test. There are thousands of such studies published all the time on other topics.

  56. Stephen Richards says:

    richard telford says:

    January 3, 2013 at 9:47 am
    It is a great shame that they did not test if their methods had any statistical power. This would have been easy to do using GCM output and would have greatly strengthened their paper

    You clown!!!

  57. Resourceguy says:

    To all the negative commentators above, I will remind you that ALL of the top research departments of the world’s central banks use this methodology and result format.

  58. lsvalgaard says:

    rgbatduke says:
    January 3, 2013 at 12:32 pm
    Leif, is there a set of solar data that everybody in the solar community endorses (or at least, has their disagreements duly entered in in the form of additional uncertainty in the claims)? Or is it really just a matter of flipping coins and grabbing a paper at random? Picking the paper (either way) that produces the conclusions you want to assert?
    There is no such set(s). I am leading two workshops [involving the foremost researchers in the solar community] to produce such sets:
    1) http://www.leif.org/research/Svalgaard_ISSI_Proposal_Base.pdf
    2) http://ssnworkshop.wikia.com/wiki/Home
    Our work is not finished yet, although we do have some preliminary findings [basically what I have been talking about for some time here on WUWT].
    Amazingly, there is some resistance among our ‘users’ to our attempt to create a vetted, agreed upon data set. It seems to be most convenient to have several [and wrong] sets to pick from to support everyone’s pet conclusions.

  59. Gunga Din says:

    This is a most interesting paper, and potentially a bombshell, because they have taken virtually all of the significant observational datasets (including GISS and BEST) along with solar irradiance from Lean and Rind, and CO2, CH4, N2O, aerosols, and even water vapor data and put them all to statistical tests (including Lucia’s favorite, the unit root test) against forcing equations. Amazingly, it seems that they have almost entirely ruled out anthropogenic forcing in the observational data, but allowing for the possibility they could be wrong, say:

    “…our rejection of AGW is not absolute; it might be a false positive, and we cannot rule out the possibility that recent global warming has an anthropogenic footprint. However, this possibility is very small, and is not statistically significant at conventional levels.”

    ==============================================================================
    I’ll remind you that I’m a layman here, but I wonder what they would have concluded if Watts et al had been included?

  60. EternalOptimist says:

    We are pretty sure. But we are not 100% sure,
    and if you want to prove us wrong – here’s how.

    How long have I waited to hear this. And, guys, I don’t even care if you are wrong; just that admission that you are not walking on water is fantastic. Reality at last.

  61. Other_Andy says:

    Resourceguy says:

    “To all the negative commentators above, I will remind you that ALL of the top research departments of the world’s central banks use this methodology and result format.”

    With the ‘banking collapse’ a few years ago and the state of the world economy at the moment, that should at least give you pause for thought.

  62. DeWitt Payne says:

    Resourceguy,

    I will remind you that ALL of the top research departments of the world’s central banks use this methodology and result format.

    And looking at the global economy, I would say it’s working really well. /sarc

    Just one fundamental flaw of many. Atmospheric CO2 concentration is not a random variable. It is almost completely deterministic. There is measurement error and year to year variability, but those factors are small compared to the deterministic change. We know where it comes from and how much is emitted each year. Applying a unit root test to this data without removing the deterministic trend is therefore invalid.

  63. Climate Ace says:

    Did they forget to polynomially cointegrate ocean heat?

  64. richardscourtney says:

    DeWitt Payne:

    At January 3, 2013 at 2:15 pm you assert

    Atmospheric CO2 concentration is not a random variable. It is almost completely deterministic. There is measurement error and year to year variability, but those factors are small compared to the deterministic change. We know where it comes from and how much is emitted each year. Applying a unit root test to this data without removing the deterministic trend is therefore invalid.

    You “know where it comes from”? Really? How?

    I don’t know if the cause of the recent rise in atmospheric CO2 concentration is entirely anthropogenic, or entirely natural, or a combination of anthropogenic and natural causes. But I want to know.

    At present it is not possible to know the cause of the recent rise in atmospheric CO2 concentration, and people who think they “know” the cause are mistaken because at present the available data can be modeled as being entirely caused by each of a variety of causes both anthropogenic and natural.
    (ref. Rorsch A, Courtney RS & Thoenes D, ‘The Interaction of Climate Change and the Carbon Dioxide Cycle’ E&E v16no2 (2005) ).

    The econometrics paper under discussion may or may not be found to contain many flaws (time will tell) but it displays a refreshing willingness to admit we don’t “know” anything about the climate issue with certainty.

    Richard

  65. DR says:

    SORCE Science Meeting, 18-19th September, 2012

    http://lasp.colorado.edu/sorce/news/2012ScienceMeeting/docs/presentations/S2-01_Ineson_sorce2012.pdf

    SIM measured a decline in ultraviolet from 2004-2007 that is a factor of 4 to 6 times larger than typical previous estimates

  66. rollsthepaul says:

    Could all of this, have only achieved a firm grasp of the obvious?

  67. Alan Millar says:

    Well, this is a good look at things from the statistical angle. The technique is robust but, as we all know, the period of reliable data is far too short to draw any conclusions.

    It is not proof of anything, but it adds to the debate and does demonstrate the uncertainty present in climate science at the moment.

    The result could have been predicted just by eyeballing the various data sets actually. There is clearly not a good correlation between CO2, temperature, aerosols and black carbon since 1880.

    However, it is always useful to have this backed up by robust methodology.

    Alan

  68. DeWitt Payne says:

    richardscourtney,

    E&E and you’re a co-author? Pull the other one.

    As long as we’re self-referencing: http://noconsensus.wordpress.com/2010/03/04/where-has-all-the-carbon-gone/

  69. R Taylor says:

    Anthony:

    Andre Bijkerk and Joanna Ballard might be two of the best, but perhaps credit for bringing Beenstock et al. to WUWT-land should go to the following contributor to Tips and Notes:

    Brian H says:
    December 9, 2012 at 10:19 pm

    h/t DirkH;
    http://economics.huji.ac.il/facultye/beenstock/Nature_Paper091209.pdf

    This means, crucially, that a doubling of greenhouse gas forcings does not permanently increase global temperature.

    From there, it was easy to track down the published paper, also as noted soon thereafter in Tips and Notes.

  70. Anthony Watts says:

    @R Taylor

    With me getting a veritable firehose of comments and email each day sometimes it simply is a matter of who gets my attention first. I regret I cannot read every comment and email I get.

  71. DirkH says:
    January 3, 2013 at 12:06 pm
    richard telford says:
    January 3, 2013 at 9:47 am
    “It is a great shame that they did not test if their methods had any statistical power. This would have been easy to do using GCM output and would have greatly strengthened their paper.”

    Thanks, that made me laugh.
    Richard, have the computer kiddies in the modeling departments learned how to model convective fronts in the meantime?
    ———————–
    I am glad I amused you, but how realistically the GCMs model climate is irrelevant for the type of analysis I am proposing. What is relevant is that there is a time series of global temperatures and a time series of CO2 and other forcings that generated this temperature series in the model. If Beenstock et al’s method cannot find the relationship between CO2 and temperature in the model, then it cannot be trusted when it fails to find the relationship in the real world.

  72. DeWitt Payne says:

    And then there’s Ferdinand Engelbeen’s page: http://www.ferdinand-engelbeen.be/klimaat/co2_measurements.html#The_mass_balance
    Also, The Carbon Cycle, T.M.L. Wigley and D.S. Schimel ed., Cambridge University Press, 2000.

    The evidence, as opposed to speculative assertions, is overwhelming that humans have caused the atmospheric CO2 level to increase from the preindustrial level of 280 ppmv.

  73. D Böehm says:

    DeWitt Payne,

    True. Human activity has added [harmless, beneficial] CO2 to the atmosphere. It is still a very tiny trace gas, and it will never be a significant part of the atmosphere.

    And yet, the planet has been up to 8ºC warmer repeatedly in the geologic past, without regard to CO2 levels. We are currently in a geologically cool period [top of the chart]. Ferdinand Engelbeen has also stated that the rise in CO2 is harmless. Thus, there is no need whatever to reduce CO2 to pre-industrial levels.

    On net balance, more CO2 is better for the biosphere, and there is no verifiable, scientifically testable global harm as a result of the rise in CO2. Really, it’s all good.

  74. Mervyn says:

    How many more such studies will it take before the United Nations instructs the IPCC to abandon its quest to prove anthropogenic global warming and to concentrate on natural variability and theories such as Henrik Svensmark’s cloud theory, which is backed by observational data and by experimentation?

  75. crosspatch says:

    The evidence, as opposed to speculative assertions, is overwhelming that humans have caused the atmospheric CO2 level to increase from the preindustrial level of 280 ppmv.

    I don’t think anyone is arguing that humans have or haven’t added CO2 to the atmosphere. The argument is over how sensitive the climate is to it. The IPCC climate sensitivity numbers are basically speculative and observations over time have shown them to be false. We get about 1C of warming for each doubling of CO2 in the atmosphere. Most of that warming from pre-industrial levels has already happened. CO2 impact is logarithmic and each ton of CO2 added to the atmosphere has LESS impact than the ton before it had. To get 1C of warming from today’s level, we would have to double atmospheric CO2 from today’s level. The “feedbacks” that the IPCC has speculated to exist haven’t turned up in real life. We are spending hundreds of billions and fleecing the populations of this planet based on predictions of a fairytale.
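    The logarithmic relationship described above is easy to sketch numerically. A minimal example, taking the commenter’s ~1 °C per doubling as an assumed sensitivity rather than an established value:

    ```python
    import math

    def warming(c_new, c_old, sensitivity_per_doubling=1.0):
        """Temperature change (deg C) for a CO2 change under a purely
        logarithmic response. 1.0 C per doubling is the commenter's
        figure, used here as an assumption, not an established value."""
        return sensitivity_per_doubling * math.log2(c_new / c_old)

    # Each successive doubling adds the same increment, so each added ton
    # matters less than the one before it:
    print(warming(560, 280))            # 280 -> 560 ppm: 1.0
    print(warming(800, 400))            # 400 -> 800 ppm: also 1.0
    print(round(warming(400, 280), 2))  # preindustrial to ~400 ppm: ~0.51
    ```

    This is only the shape of the curve; it says nothing about which sensitivity value is correct, which is the actual point of dispute.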

    Wake me up when we get to 1600 ppm of CO2 … but we will never make it that far. China currently has 30 nuclear power plants under construction in various phases of completion. US CO2 emissions are falling, and China’s emissions will begin to fall in about another 10 years as more of their nuclear generation comes online. Excess CO2 above Earth’s equilibrium amount begins to come out of the atmosphere as soon as the emissions are stopped. I doubt we will even get to 800 ppm, let alone 1600. But more importantly, nobody has shown a good reason to even reduce CO2 emissions. Why should we? Nobody has shown to my satisfaction what PORTION of warming since the end of the LIA is due to CO2. They are simply running around talking about what “could” happen in the future based purely on climate models programmed with speculative feedbacks to CO2 increases.

    It’s theft. It’s a racket. It is robbery.

  76. lsvalgaard says:

    Mervyn says:
    January 3, 2013 at 5:22 pm
    concentrate on natural variability and theories such as Henrik Svensmark’s cloud theory, which is backed by observational data
    Actually it is not: http://www.leif.org/EOS/swsc120049-GCR-Clouds.pdf
    “it is clear that there is no robust evidence of a widespread link between the cosmic ray flux and clouds” and
    “In this paper we have examined the evidence of a CR-cloud relationship from direct and indirect observations of cloud recorded from satellite- and ground-based measurement techniques. Overall, the current satellite cloud datasets do not provide evidence supporting the existence of a solar-cloud link. Although some positive evidence exists in ground-based studies, these are all from highly localized data and are suggested to operate via global electric circuit based mechanisms: the effects of which may depend on numerous factors and vary greatly from one location to the next. Consequently, it is unclear what the overall implications of these localized findings are. By virtue of a lack of strong evidence detected from the numerous satellite- and ground-based studies, it is clear that if a solar-cloud link exists the effects are likely to be low amplitude and could not have contributed appreciably to recent [anthropogenic] climate changes.”

  77. lsvalgaard says:

    Mervyn says:
    January 3, 2013 at 5:22 pm
    concentrate on natural variability and theories such as Henrik Svensmark’s cloud theory, which is backed by observational data
    Actually it is not: http://www.leif.org/EOS/swsc120049-GCR-Clouds.pdf

  78. Adam says:

    The paper is from Israel. So following the logic of the alarmists anybody who disagrees with it is a Holocaust denying anti Semite who loves Hitler.

  79. willb says:

    DeWitt Payne says:
    January 3, 2013 at 4:47 pm

    “And then there’s Ferdinand Engelbeen’s page:”

    I don’t know what Ferdinand Engelbeen is smoking, but the Mass Balance argument for attributing the increase in atmospheric carbon dioxide to anthropogenic causes is anything but overwhelming. The argument seems to be based on the premise that the natural CO2 fluxes into and out of the atmosphere remain unchanged regardless of atmospheric CO2 concentration. IMHO this is not particularly good logic. It also violates Le Chatelier’s principle.

    According to the carbon cycle theory, there are natural carbon fluxes into and out of the atmosphere that are on-going and continuous. The argument for Mass Balance goes something like this: When humans add X amount of CO2 to the atmosphere in any given year, these natural fluxes adjust themselves such that X/2 of the added amount is removed and X/2 remains (forever, or at least for a very long time). Because the amount of additional CO2 is less than the amount added by humans, Mass Balance says the increase must be due solely to anthropogenic CO2.

    Suppose in the following year there is no anthropogenic CO2 added to the atmosphere. What happens to the X/2 quantity of anthropogenic CO2 from the previous year that is still in the atmosphere? According to Mass Balance, this added CO2 remains as a permanent increase to the atmospheric CO2 concentration. What happens to the natural fluxes? They presumably stay in balance, as they were in the year before the anthropogenic CO2 addition. Mass Balance therefore seems to be saying the fluxes into and out of the atmosphere will remain the same as they were two years ago, before anthropogenic CO2 was added. This despite the fact that the atmospheric CO2 concentration has increased.

    So where is the equilibrium shift that, according to Le Chatelier’s principle, counteracts this increase in concentration? Or does Le Chatelier’s principle not apply in this case?

  80. John Mason says:

    The obvious gets a paper. Of course there ‘might’ have been a ‘temporary’ effect when the CO2 rise and the temperature rise happened to correspond.

    Any common sense observer has seen that our rise in temps in the later part of the 20th century continues a non-remarkable trend since the end of the little ice age.

    I expect many more papers like this, giving some face-saving way to back away from the prior positions on dangerous AGW. I still find it silly that even a paper like this couches its conclusion as ‘this doesn’t mean there isn’t a temporary AGW effect’ rather than simply saying the models have not been rigorously tested with statistics.

  81. David Jay says:

    Wait, can guys from Israel be “deniers”?

    I get so confused…

  82. Rob Ricket says:

    Friends, I’m just a layman with a limited understanding of advanced statistics, but it seems that the biggest bone of contention lies with the claim that carbon forcing has a limited shelf life. Does this not essentially affirm the warmist position regarding the warming effect of GHGs?

    If the findings are valid, then forcing will continue for forty years after GHGs stabilize. Alternatively, how can it be claimed on one hand that forcing occurs for a limited period, but does not cointegrate with global temp?

    If the statistical methodology is correct, then the paper essentially proves that one of the data sets is inaccurate. Obviously, the lowest-hanging fruit is Mann’s proxy data.

    Red, I’ll take Michael Mann for $250.

  83. Henry Clark says:

    Figure 1 uses untrue temperature data, including falsely depicting 2007 as 0.6 degrees Celsius warmer than 1980 when the actual figure from satellite data was not more than 0.3 degrees warmer. There was not more than half as much warming over that period. Partially correcting the graph improves the relationship of temperature with solar activity, turning the graph of figure 1 into http://s9.postimage.org/yrkytofyn/fixedplotb.jpg (click to enlarge).

    One of the most common fallacies in much of what gets called science these days is style over substance. Like GIGO, superficially formal writing and numbers can be only misleading illusions, falsely impressing viewers yet irrelevant when the basic data and assumptions are off. What happens is a little like the story of the Emperor’s New Clothes: Most people fear seeming unsophisticated and hesitate to criticize such.

    However, one of the most important things I ever learned (in another context) was not just to calculate but to state my assumptions before calculation, recognizing that the internal correctness of the math itself was simply irrelevant unless the starting assumptions were correct.

    Implicit unstated assumptions in this paper include (incorrectly) treating Hansen’s GISS as a trustworthy temperature source. Actually such is utterly compromised.*

    * (A simple example is http://www.giss.nasa.gov/research/briefs/hansen_07/fig1x.gif versus http://data.giss.nasa.gov/gistemp/graphs_v3/Fig.D.gif , where the former shows the 5-year mean of U.S. temperature in the high point of the 1980s was 0.4 degrees Celsius cooler than that of the 1930s but the latter is fudged to make the same less than 0.1 degrees Celsius apart).

    Get a statistics expert to analyze the paper in a conventional manner, and the illusion of a sophisticated criticism can be increased. But unless they are intelligent enough and sufficiently far from typical naivety (or bias) to criticize starting assumptions and input data rather than treating them as a given, such would be inferior to even my quick casual comments here. (Peer review can be junk due to narrow-minded analysis; an example in another context is how a paper was published and applauded for claiming a 40% decline in phytoplankton over the past several decades, when so much as a look at fish catches, let alone contradictions from other plankton measurements, would show such to be BS, a bit like Mann’s hockey stick was not properly flagged for blatant contradiction to about everything of relevance published before the era of politicized science).

    With that said, despite an incorrect sun versus temperature depiction, this paper happens to be correct on lack of correlation of temperature with CO2, though that can be seen in other ways more blatantly, like http://wattsupwiththat.com/2012/04/11/does-co2-correlate-with-temperature-history-a-look-at-multiple-timescales-in-the-context-of-the-shakun-et-al-paper/

    The paper may be helpful in a way. Much of the CAGW movement’s more naive follower population is comprised of people who utterly fall for superficial appeals to authority, so perhaps the contradiction with CAGW claims could help disrupt their mindsets. But it is style over substance, especially in the context of how multiple climate forcings combine (especially solar / GCR variation like http://s13.postimage.org/ka0rmuwgn/gcrclouds.gif and http://s10.postimage.org/l9gokvp09/composite.jpg though with El Nino echos of past ocean heat back to the atmosphere).

  84. policycritic says:

    @davidmhoffer says:
    January 3, 2013 at 10:50 am
    ________________

    Nice analogy

  85. DeWitt Payne says:

    willb,

    The argument for Mass Balance goes something like this: When humans add X amount of CO2 to the atmosphere in any given year, these natural fluxes adjust themselves such that X/2 of the added amount is removed and X/2 remains (forever, or at least for a very long time). Because the amount of additional CO2 is less than the amount added by humans, Mass Balance says the increase must be due solely to anthropogenic CO2.

    That’s not how it works. I suggest you read my article from The Air Vent to see how the Bern model works in practice. For example, the amount that remains in the atmosphere isn’t half. That’s the apparent value because anthropogenic emission is continually increasing. If human emissions were to cease instantly, the atmospheric CO2 concentration would decay to a level that would be the preindustrial level plus about 15% of the total amount emitted. The initial decay would be rapid, but it would take hundreds to thousands of years to reach a new steady state because the full mixing of the ocean takes that long. This graph is about what the CO2 level would have done if all human emissions ceased in 2005. The new steady state value would be about 320 ppmv. And you’re neglecting the isotope ratio and oxygen concentration data that is in agreement with the source of the increase being mostly fossil fuel combustion.
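    The decay DeWitt describes corresponds to a Bern-style impulse response: a constant airborne fraction plus a few decaying exponentials. A minimal sketch, using the coefficients quoted for the Bern2.5CC fit in IPCC AR4 (treat the exact numbers as an assumption on my part):

    ```python
    import math

    # Bern2.5CC impulse-response fit as quoted in IPCC AR4 (assumed here):
    # fraction of an emitted CO2 pulse still airborne after t years.
    A = [0.217, 0.259, 0.338, 0.186]   # constant term plus three decaying modes
    TAU = [None, 172.9, 18.51, 1.186]  # e-folding times in years (None: constant)

    def airborne_fraction(t):
        return A[0] + sum(a * math.exp(-t / tau) for a, tau in zip(A[1:], TAU[1:]))

    print(round(airborne_fraction(0), 3))    # 1.0 at the moment of emission
    print(round(airborne_fraction(100), 3))  # ~0.36 a century later
    ```

    Note the constant term of about 0.22: in this fit, roughly a fifth of a pulse stays airborne until the slow ocean mixing DeWitt mentions finally removes it.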

  86. DeWitt Payne says:

    That should be 22% not 15% so about 335 ppmv, not 320 ppmv, at steady state.

  87. Rob Ricket says:

    Good catch Henry.

  88. Henry Clark says:

    Rob Ricket: Thanks.

  89. Bart says:

    DeWitt Payne says:
    January 3, 2013 at 2:15 pm

    “Atmospheric CO2 concentration is not a random variable. It is almost completely deterministic. There is measurement error and year to year variability, but those factors are small compared to the deterministic change. We know where it comes from and how much is emitted each year.”

    And, we know that part (the anthropogenic input) has negligible impact on the overall concentration. The data show that atmospheric CO2 concentration is almost completely driven by surface temperatures. In this WoodForTrees plot, it is clear that CO2 is dominated by an affine-in-temperature differential equation of the form

    dCO2/dt = k*(T – To)

    where “k” is a coupling constant, “T” is the global temperature anomaly, and “To” is an equilibrium temperature. Here is another such comparison with GISTEMP. Any of the major temperature sets will generally do, as they are all more-or-less affinely related. Any time you have a continuous flow into and out of a system which can be modulated by a particular variable, you can get an integral relationship of this sort for the residual which gets left behind.

    Here is an example of what you get when you integrate the relationship. Clearly, there is very little room for human influence on the measured concentration. It is simply inconsistent with the data. Some quibble that the linear term is an artifact of the choice of “To”, and that provides most of the match in the integrated output. But, there has to be some value of “To”, because the temperature anomaly “T” is measured relative to an arbitrary baseline. More importantly, it has no effect on the slope of the CO2 rate of change, which matches the temperature slope quite well, when you choose “k” to match the variation, the peaks and valleys, between the time series. And, since the anthropogenic input rate itself has a slope, there is, again, no room for it to any level of significance.

    Since differentiation necessarily imparts a 90 degree phase advance, it follows that coincidence between the peaks and valleys in the temperature and the rate of change of CO2 implies that CO2 lags temperature, and therefore the direction of causality is temperature-to-CO2. Or, one may consider that on a more elementary level, it would be absurd to argue that the temperature depends on the rate at which CO2 is changing, and not the overall level and, again, we conclude the direction of causality is temperature-to-CO2.

    It also necessarily follows that the Earth’s mechanisms for sequestering CO2 have been grossly underestimated, the residence time conversely grossly overestimated, and the natural flows into the system grossly underestimated, as well. It is hardly surprising given that such estimates have been largely paper exercises without closed loop confirmation. This is what happens when you guess at an answer, and decide if it is right or not by taking a vote: Fiasco.
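    For what it’s worth, the affine relation Bart writes, dCO2/dt = k*(T − To), is straightforward to integrate numerically. A toy sketch with made-up k, To, and temperature series (purely illustrative, not a fit to any dataset):

    ```python
    # Toy integration of the relation Bart describes: dCO2/dt = k*(T - To).
    # k, To, and the temperature series are invented for illustration only.
    k, To = 2.0, -0.2                        # ppm/yr per deg C; assumed baseline
    T = [0.01 * y - 0.1 for y in range(60)]  # synthetic 60-year anomaly trend

    co2 = [315.0]                            # assumed starting level, ppm
    for t_anom in T:
        co2.append(co2[-1] + k * (t_anom - To))  # one-year Euler step

    print(round(co2[-1], 1))  # 362.4: a linear trend in T integrates into
                              # an accelerating rise in CO2
    ```

    The sketch shows only the mechanics of the claim; whether the real data single out this model over the conventional emissions-driven one is exactly what is disputed in this thread.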

  90. Mooloo says:

    If human emissions were to cease instantly, the atmospheric CO2 concentration would decay to a level that would be the preindustrial level plus about 15% of the total amount emitted. The initial decay would be rapid, but it would take hundreds to thousands of years to reach a new steady state because the full mixing of the ocean takes that long.

    So the sinks get rid of most of it quickly, but “know” to leave the other 15% for thousands of years? That is beyond ridiculous.

    If this method was correct, each time a serious volcanic eruption occurred in which there was a significant amount of surplus CO2 over normal then 15% of the excess would fail to be absorbed. Over time, then, CO2 would inexorably rise.

    That “it would take hundreds to thousands of years to reach a new steady state” is a red herring. The excess would be absorbed into quicker sinks and only then slowly equilibrate with the slower ones. That one sink is not yet in equilibrium has no bearing on whether the other ones are mopping up any significant excess.

    The only way the slowness of the oceans will have any effect is if the other sinks are full.

  91. Bart says:

    DeWitt Payne says:
    January 3, 2013 at 8:02 pm

    These fallacious arguments have been hammered out numerous times on these boards, often with the participation of Ferdinand. The “mass balance” argument begs the question – it only holds if you a priori assume that the source of the rise observed in the 20th century is attributable to humans. Here is a repeat of a previous response many, many moons ago to others:
    ———————————————————-

    Let

    M = measured concentration
    A = anthropogenic emissions
    N = natural emissions
    U = natural uptake

    We know ΔM = A + N – U, where ΔM is the annual change in the measured concentration. We measure ΔM. We calculate A. From that, we know N – U, and we know that A is approximately twice ΔM, so we know N – U is negative. As you say, it is a net sink.

    But, that’s all we know. We do not know N or U individually.

    The reservoirs expand in response to both natural and anthropogenic emissions. This is the nature of a DYNAMIC SYSTEM.

    Thus, we can take U as composed of two terms:

    UA = natural uptake of anthropogenic emissions
    UN = natural uptake of natural emissions

    So, we only know N – UA – UN. Suppose UA = A. Then the annual rise ΔM = N – UN; since ΔM is positive, N is greater than UN, and the rise is entirely natural. Equality would never be precisely the case, but it depends on the sequestration time. If that time is arbitrarily small, then it is possible, to within an arbitrarily small deviation, to have UA = A. We simply do not know. As the sequestration time increases, anthropogenic emissions induce a greater share of the measured concentration. But, we do not know the sequestration time.

    This is a DYNAMIC SYSTEM. It actively responds to changing inputs. You cannot do a static analysis on such a system and expect generally, or even usually, to get the right answer.

    ————————————————

    This was written prior to my discovery (Allan MacRae, who posts here occasionally, has claim to discovering it first, I hasten to say) of the strong and compelling correlation between the rate of CO2 and temperatures. We do, in fact, know more, with this added bit of information. We know that UA is quite close to A, and we know that atmospheric CO2 concentration is largely temperature driven.
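    Bart’s budget identity can be illustrated with toy numbers. The point of the sketch is only that the observed quantities constrain N − U, not N and U individually (all flux values below are invented for illustration):

    ```python
    # Two toy bookkeepings that both satisfy dM = A + N - U with the same
    # observed rise and the same human emissions (numbers in GtC/yr, invented):
    A, dM = 8.0, 4.0          # human emissions; observed atmospheric rise
    net_natural = dM - A      # N - U = -4.0: nature is a net sink either way

    scenarios = {
        "slow-cycling":  {"N": 100.0, "U": 104.0},
        "fast-cycling":  {"N": 150.0, "U": 154.0},
    }
    for name, s in scenarios.items():
        assert s["N"] - s["U"] == net_natural  # both fit the budget identity
        print(name, "N - U =", s["N"] - s["U"])
    ```

    Both bookkeepings reproduce the same observations, which is why the identity alone cannot settle how large the gross natural fluxes are.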

  92. davidmhoffer says:

    Bart says:
    January 3, 2013 at 8:35 pm
    >>>>>>>>>>>>>>>>>>

    Wow. Folks, anyone who skipped through Bart’s post because it was long and technical…. I highly recommend going back and looking at those graphs.

  93. Climate Ace says:

    Adam

    The paper is from Israel. So following the logic of the alarmists anybody who disagrees with it is a Holocaust denying anti Semite who loves Hitler.

    You should be disgusted with yourself, belittling the Holocaust and anti-Semitism like you do. The Holocaust is not, repeat NOT, a climate argument toy.

    I am disgusted that the moderator lets comments like that through.

    [Reply: we moderate with a light touch here. Heavier moderation leads to censorship. You have the right to respond to comments you disagree with. — mod.]

  94. Lubos Motl says:

    Dear aaron and Steveta_uk,

    be sure that I’ve noticed that the word “fail” is pretty much the only English word that is being used in these contexts these days. It’s cultural, indeed. But I am still convinced that it affects the listeners’ and readers’ thinking because they inevitably create an emotional association of the possible results with “good” and “evil” or “success” and “failure”.

    So yes, my comment was a recommendation to change the culture and favorite formulations in the English language… Incidentally, there are many contexts in which I think that exactly the opposite “emotional message” is appropriate and desirable. In those cases, I replace the word “fail” by “refuse”. For example, statistical tests refused to find a global warming smoking gun in this case. This sounds like someone wanted these tests to do a dirty job but these tests have some human rights and they just didn’t want to obey. ;-) They refused because there’s ultimately no empirically detectable CO2-caused global warming anywhere, after all.

    Aside from “fail” and “refuse”, there also exist more neutral verbs, obviously.

    All the best
    Lubos

  95. JazzyT says:

    Bart says:
    January 3, 2013 at 8:35 pm

    And, we know that part (the anthropogenic input) has negligible impact on the overall concentration. The data show that atmospheric CO2 concentration is almost completely driven by surface temperatures. In this WoodForTrees plot, it is clear that CO2 is dominated by an affine-in-temperature differential equation of the form

    dCO2/dt = k*(T – To)

    where “k” is a coupling constant, “T” is the global temperature anomaly, and “To” is an equilibrium temperature. Here is another such comparison with GISTEMP.

    The Keeling curve shows a steady rise with a roughly sinusoidal pattern superimposed on it. (It’s available in many places; here is one of them: http://scrippsco2.ucsd.edu/program_history/keeling_curve_lessons_3.html) This sinusoid has a period of one year and an amplitude of about 5 ppm. The standard interpretation is that there is a long-term rise in CO2 concentrations due to fossil fuel burning, and a series of short-term seasonal variations due to uptake of CO2 during spring and summer and release of CO2 from decaying plant matter in the fall and winter. It’s not surprising that these natural processes would be well correlated with temperature, both in terms of the seasons themselves and also for warmer or cooler years. For these, CO2 should follow Northern hemisphere temperature (there’s more land, and more plant life, in the NH) but with a 180 degree phase shift, and then perhaps some lag on top of that.

    Since differentiation necessarily imparts a 90 degree phase advance, it follows that coincidence between the peaks and valleys in the temperature and the rate of change of CO2 implies that CO2 lags temperature, and therefore the direction of causality is temperature-to-CO2. Or, one may consider that on a more elementary level, it would be absurd to argue that the temperature depends on the rate at which CO2 is changing, and not the overall level and, again, we conclude the direction of causality is temperature-to-CO2.

    The derivative of CO2 concentration will mostly follow the seasonal variations, but these are themselves averaged out by a 24-month running average in the graphs that Bart referenced. The 90-degree phase shift will show up (although for some higher frequencies, with period less than one year, there will be a 180 degree phase shift due to the 24-month average). However, these higher frequencies will actually show up as lower frequencies (see Nyquist-Shannon sampling theorem). A glance at the curve shows that these lower frequencies should not cause too much trouble, but it would be a good idea to check to be sure. As for the time domain, going from one point to the next in the series of the derivative of CO2 concentration represents moving the average, i.e., adding a data value a year in the future while dropping one a year in the past. Finally, the first graph uses Hadcrut4sh, which is a Southern hemisphere data set, and so adds a 180 degree phase shift in temperature, for comparison with the CO2 fluctuations, which are predominantly a Northern hemisphere phenomenon.

    It’s not obvious what to make of the resulting comparisons, in terms of fluctuations, phase shifts, etc. What is obvious, however, is that the shorter-term fluctuations that dominate the derivative of CO2 concentration (fluctuations in terms of a few years) should show the effects of similar variations in natural processes. They should also show short-term variations in anthropogenic CO2, if there actually are any. It’s to be expected that natural processes, governed at least partly by temperature, would show up in the derivative of CO2. But anthropogenic CO2 would show up mostly as a steady value (an offset) in the derivative of CO2, if it’s the relatively constant rise that people seem to think that it is, and that the Keeling curve indicates.
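    The point about what differentiation does to a trend-plus-seasonal series can be checked on a synthetic example (trend and amplitude values assumed, not real data): the steady rise appears in the derivative as a constant offset, while the sinusoid dominates the fluctuations around it.

    ```python
    import math

    # Synthetic Keeling-like series: a steady rise plus a one-year seasonal
    # sinusoid. The 1.5 ppm/yr trend and 2.5 ppm amplitude are assumptions
    # for illustration, not measured values.
    def co2(t):                       # t in years
        return 315.0 + 1.5 * t + 2.5 * math.sin(2 * math.pi * t)

    def dco2_dt(t, h=1e-4):           # central-difference derivative, ppm/yr
        return (co2(t + h) - co2(t - h)) / (2 * h)

    # Over one full year the seasonal term averages out of the derivative,
    # leaving the trend behind as a constant offset:
    vals = [dco2_dt(t / 100) for t in range(100)]
    mean = sum(vals) / len(vals)
    print(round(mean, 2))             # 1.5, the trend's offset in the derivative
    ```

    A roughly constant anthropogenic rise would thus sit in the derivative as an offset, not as the wiggles that the temperature comparison keys on.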

    DeWitt Payne says:
    January 3, 2013 at 8:02 pm

    “And you’re neglecting the isotope ratio and oxygen concentration data that is in agreement with the source of the increase being mostly fossil fuel combustion.”

    A lot of people will have a very hard time taking seriously any discussion about atmospheric CO2 concentration if it neglects isotope concentrations. Since plants take up C-12 in preference to C-13, either deforestation or burning of fossil fuels (ancient plants) tends to put more C-12 into the atmosphere, along with less C-13. Measurements of carbon isotopes in atmospheric CO2 show a decreasing concentration of C-13, consistent with the notion that increased CO2 arises from these anthropogenic sources.
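    The isotope dilution described here is a simple two-member mixing calculation. A sketch with rough, assumed numbers (atmospheric δ13C taken near −7 ‰ and fossil carbon near −28 ‰; both values and the masses are illustrative, not measurements):

    ```python
    # Toy delta-13C mixing: adding isotopically light fossil carbon pulls the
    # atmospheric delta-13C down. All numbers are rough illustrative figures.
    atm_mass, atm_d13 = 750.0, -7.0       # atmospheric carbon, GtC; per-mil
    fossil_add, fossil_d13 = 10.0, -28.0  # added fossil carbon, GtC; per-mil

    new_d13 = (atm_mass * atm_d13 + fossil_add * fossil_d13) / (atm_mass + fossil_add)
    print(round(new_d13, 2))  # -7.28: slightly more negative than before
    ```

    The measured direction of the δ13C trend matches this sign; how well the magnitude matches is what the richardscourtney/DeWitt exchange below argues about.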

  96. Michel says:

    Yes but…
    If a gas mixture that contains CO2 is irradiated by a source at, say, 15°C, it will absorb more or less energy depending on the CO2 concentration. To enable the release of this absorbed energy to the environment (outer space), the source temperature must change, e.g. increase by approx. 0.54°C if the CO2 concentration doubles. This is primary forcing, just physics.
    As it is undisputed that the CO2 concentration went up: where did the additional absorbed energy go if no global temperature change is correlated with it?
    The paper is very interesting because it points to a need for other interpretations of climate change (or no change) than the monomaniacal AGW theory.
    As the authors write in their conclusion: this is not an argument about physics, but an argument about data interpretation.

  97. DirkH says:

    richard telford says:
    January 3, 2013 at 4:47 pm
    “I am glad I amused you, but how realistically the GCMs model climate is irrelevant for the type of analysis I am proposing. What is relevant is that there is a time series of global temperatures and a time series of CO2 and other forcings that generated this temperature series in the model. If Beenstock et al.’s method cannot find the relationship between CO2 and temperature in the model, then it cannot be trusted when it fails to find the relationship in the real world.”

    Go ahead, show that this guy was wrong.
    http://en.wikipedia.org/wiki/Granger_causality

  98. LazyTeenager says:

    I don’t understand this paper, but my instincts are saying it’s fiddling with statistics that are divorced from the physics of the system.

    Might be illuminating to build a fake black box model that includes some degree of causality, originating from multiple sources and mushed up with random variation and multiple response time scales. Then apply this same kind of analysis to see if it correctly identifies the underlying multiple causes. If it can’t the methodology is broken.

  99. Steveta_uk says:

    Lubos, I love the idea of a test refusing to produce the required result.

    It’s similar to the use of the word “but” in logical statements. “A AND NOT B” gives exactly the same result as “A BUT NOT B”. But somehow using “BUT” implies a level of disappointment, like “Christmas but no presents.”

  100. Alan D McIntire says:

    “Nevertheless, there seems to be a temporary anthropogenic effect. If the effect is temporary rather than permanent, a doubling, say, of carbon emissions would have no long-run effect on Earth’s temperature, but it would increase it temporarily for some decades.”
    “Indeed, the increase in temperature during 1975–1995 and its subsequent stability are in our view related in this way to the acceleration in carbon emissions during the second half of the 20th century.”

    I suspect that the reason for this temporary “jump” is measurement bias. We measure temperatures where people LIVE and are producing energy, not in the uninhabited boondocks. When economic activity increases, we use more energy, which ultimately winds up as “waste” heat. That temporary jump is a result of measuring the increase in local waste heat produced. No increase in economic activity implies no increase in waste heat, therefore no increase in temperatures regardless of what CO2 does.

  101. richardscourtney says:

    DeWitt Payne:

    My post at January 3, 2013 at 2:48 pm asked you to justify your silly assertion that you knew the cause of recent rise in atmospheric CO2 concentration and it said

    I don’t know if the cause of the recent rise in atmospheric CO2 concentration is entirely anthropogenic, or entirely natural, or a combination of anthropogenic and natural causes. But I want to know.

    At present it is not possible to know the cause of the recent rise in atmospheric CO2 concentration, and people who think they “know” the cause are mistaken because at present the available data can be modeled as being entirely caused by each of a variety of causes both anthropogenic and natural.
    (ref. Rorsch A, Courtney RS & Thoenes D, ‘The Interaction of Climate Change and the Carbon Dioxide Cycle’ E&E v16no2 (2005) ).

    Your reply to me at January 3, 2013 at 4:32 pm says in total

    E&E and you’re a co-author? Pull the other one.
    As long as we’re self-referencing: http://noconsensus.wordpress.com/2010/03/04/where-has-all-the-carbon-gone/

    So, you ignore peer reviewed work because I contributed to it, and you cite your blog post which is twaddle.

    The isotope data shows a change in the correct direction for it to have been induced by the anthropogenic emission (there is a 50:50 chance that it would be in the correct direction) but it has the wrong magnitude by a factor of ~6 if it were induced by the anthropogenic emission. There is no reason to suppose that any of the isotope change was induced by the anthropogenic emission when most of it cannot have been.

    The facts are that the recent rise in atmospheric CO2 concentration can be modeled in a variety of ways as having a purely natural or a purely anthropogenic cause.

    Each of the models in our paper matches the available empirical data without use of any ‘fiddle-factor’ such as the ‘5-year smoothing’ the UN Intergovernmental Panel on Climate Change (IPCC) uses to get the Bern Model to agree with the empirical data.

    So, if one of the six models of our paper is adopted then there is a 5:1 probability that the choice is wrong. And other models are probably also possible. And the six models each give a different indication of future atmospheric CO2 concentration for the same future anthropogenic emission of carbon dioxide. Three of our models assumed a purely anthropogenic cause of the recent rise in atmospheric CO2 concentration and the other three assumed a purely natural cause.

    Data that fits all the possible causes is not evidence for the true cause. Data that only fits the true cause would be evidence of the true cause. But the findings in our paper demonstrate that there is no data that only fits either an anthropogenic or a natural cause of the recent rise in atmospheric CO2 concentration. Hence, the only factual statements that can be made on the true cause of the recent rise in atmospheric CO2 concentration are

    (a) the recent rise in atmospheric CO2 concentration may have an anthropogenic cause, or a natural cause, or some combination of anthropogenic and natural causes,

    but

    (b) there is no evidence that the recent rise in atmospheric CO2 concentration has a mostly anthropogenic cause or a mostly natural cause.

    Indeed, since you don’t want to read the paper, I will mention a volcanic possibility which the paper does not mention but which disproves the certainty with which you delude yourself.

    CO2 is in various compartments of the carbon cycle system, and it is exchanged between them. Almost all of the CO2 is in the deep oceans. Much is in the upper ocean surface layer. Much is in the biosphere. Some is in the atmosphere. And so on.

    The equilibrium state of the carbon cycle system defines the stable distribution of CO2 among the compartments of the system. And at any moment the system is adjusting towards that stable distribution. But the equilibrium state is not a constant: it varies at all time scales.

    Any change to the equilibrium state of the carbon cycle system induces a change to the amount of CO2 in the atmosphere. Indeed, this is seen as the ‘seasonal variation’ in the Mauna Loa data. However, some of the mechanisms for exchange between the compartments have rate constants of years and decades. Hence, it takes decades for the system to adjust to an altered equilibrium state.
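    The multi-decade adjustment argument above can be sketched numerically. The following toy single-reservoir model (my sketch, not anything from the comment; the 20-year rate constant and the 290→400 ppmv step are purely illustrative) shows how a rate constant of decades stretches the approach to a new equilibrium over several decades:

```python
# Toy sketch: one carbon-cycle compartment relaxing toward a shifted
# equilibrium with a first-order (e-folding) rate constant.
# All numbers are illustrative, not measured values.

def relax(c0, c_eq, tau_years, years):
    """Euler-step concentration c toward c_eq with e-folding time tau_years."""
    c = c0
    trajectory = [c]
    for _ in range(years):
        c += (c_eq - c) / tau_years
        trajectory.append(c)
    return trajectory

# Equilibrium steps from 290 to 400 ppmv; adjustment time 20 years.
traj = relax(c0=290.0, c_eq=400.0, tau_years=20.0, years=60)

# After one e-folding time (20 yr), only ~64% of the gap has closed.
gap_closed_20yr = (traj[20] - 290.0) / (400.0 - 290.0)
print(round(gap_closed_20yr, 2))  # → 0.64
```

    With a 20-year e-folding time, about 64% of the gap to the new equilibrium closes in the first 20 years, and under 5% of it still remains after 60 years, which is the sense in which "it takes decades for the system to adjust".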

    The observed increase of atmospheric CO2 over recent decades could be an effect of such a change to the equilibrium state. If so, then the cause of the change is not known.

    One such unknown variable is volcanic emission of sulphur ions below the sea decades or centuries ago.

    The thermohaline circulation carries ocean water through the deeps for centuries before those waters return to ocean surface. The water acquires sulphur ions as it passes undersea volcanoes and it carries that sulphur with it to the ocean surface layer decades or centuries later. The resulting change to sulphur in the ocean surface layer alters the pH of the layer.

    An alteration of ocean surface layer pH alters the equilibrium concentration of atmospheric CO2.

    A reduction to surface layer pH of only 0.1 (which is much too small to be detectable) would induce more than all of the change to atmospheric CO2 concentration, from 290 ppmv to ~400 ppmv, which has happened since before the industrial revolution.

    I don’t know if this volcanic effect has happened, and I doubt that it has. But it demonstrates how changed equilibrium conditions could have had the observed recent effect on atmospheric CO2 concentration whether or not there was a change in temperature and whether or not the anthropogenic CO2 emission existed.

    Simply, you are wrong. And it seems you are willfully wrong.

    Richard

  102. aaron says:

    “do not”

  103. richardscourtney says:

    davidmhoffer

    Your post at January 3, 2013 at 9:47 pm says

    Bart says:
    January 3, 2013 at 8:35 pm
    >>>>>>>>>>>>>>>>>>
    Wow. Folks, anyone who skipped through Bart’s post because it was long and technical…. I highly recommend going back and looking at those graphs.

    I strongly agree that Bart’s graphs are very informative – everybody needs to see them – but they don’t provide the complete ‘answer’ which Bart assumes.

    In my post at January 4, 2013 at 5:52 am I wrote

    The equilibrium state of the carbon cycle system defines the stable distribution of CO2 among the compartments of the system. And at any moment the system is adjusting towards that stable distribution. But the equilibrium state is not a constant: it varies at all time scales.

    Any change to the equilibrium state of the carbon cycle system induces a change to the amount of CO2 in the atmosphere. Indeed, this is seen as the ‘seasonal variation’ in the Mauna Loa data. However, some of the mechanisms for exchange between the compartments have rate constants of years and decades. Hence, it takes decades for the system to adjust to an altered equilibrium state.

    Bart’s graphs show how the short-term processes immediately respond to the altered system state induced by temperature change.

    At issue is the long-term trend in rising atmospheric CO2 concentration.
    The dynamics of the system show that the carbon cycle can easily sequester ALL the annual CO2 emission (both natural and anthropogenic) of each year, but the long-term rise shows that it doesn’t. At issue is why it doesn’t.

    The reason for the long-term rise in atmospheric CO2 is probably that some mechanisms of the climate system take decades to fully adjust to an altered system state. Indeed, the ice core records indicate that some mechanisms take centuries to adjust.

    There are many possible reasons why the equilibrium state of the carbon cycle has changed: most possibilities are natural phenomena, but the anthropogenic emission is one (improbable) possible reason.

    Richard

  104. Joe says:

    richard telford says:
    January 3, 2013 at 4:47 pm
    […] If Beenstock et al’s method cannot find the relationship between CO2 and temperature in the model, then it cannot be trusted if it cannot find the relationship in the real world.

    Sorry, Richard, but that’s a complete logical fallacy and displays a serious misunderstanding of the nature of scientific testing (whether statistical or physical). There are many valid tests which have asymmetric reliability for positive and negative results.

    That’s why two different types of error (Type I and Type II) exist. As long as the result a test gives is of the type for which the test is reliable, it doesn’t matter at all what the likelihood of false results of the other type is.

    In this case, what that means is that the analysis may well falsely indicate “a relationship” in random data but won’t (or is very unlikely to) indicate “no relationship” in data where causality does exist. So getting a result of “no relationship” in this case is a reliable indication that the data are NOT connected even though it would NOT have been reliable indication that they were connected if it had given a result of “relationship”.
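    Joe's point about asymmetric reliability can be illustrated with a quick Monte Carlo (my sketch, not anything from the paper): a naive correlation test applied to independent random walks flags a "relationship" very often (the classic spurious-regression Type I problem), yet it almost never misses a genuinely driven series, so its "no relationship" verdicts are the trustworthy direction:

```python
# Hedged illustration: a naive correlation test has a huge false-positive
# rate on independent random walks (spurious regression), but a near-zero
# false-negative rate when one series genuinely drives the other.
import random, math

def corr(x, y):
    """Sample Pearson correlation, hand-rolled to stay stdlib-only."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return sxy / (sx * sy)

def random_walk(n, rng):
    w, s = [], 0.0
    for _ in range(n):
        s += rng.gauss(0, 1)
        w.append(s)
    return w

rng = random.Random(42)
n, trials = 200, 300
r_crit = 1.96 / math.sqrt(n)   # i.i.d.-theory 5% threshold, ~0.14

# Type I direction: two INDEPENDENT random walks often "look" related.
false_pos = sum(
    abs(corr(random_walk(n, rng), random_walk(n, rng))) > r_crit
    for _ in range(trials)
) / trials

# Type II direction: a genuinely driven series is almost never missed.
missed = 0
for _ in range(trials):
    x = random_walk(n, rng)
    y = [xi + rng.gauss(0, 1) for xi in x]   # y truly depends on x
    if abs(corr(x, y)) <= r_crit:
        missed += 1
false_neg = missed / trials

print(false_pos, false_neg)  # false_pos is large; false_neg is ~0
```

    The asymmetry is exactly Joe's point: for this kind of data a "relationship found" verdict is unreliable, while a "no relationship" verdict remains informative.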

  105. Resourceguy says:

    To those negative comments on my methodological reference to the world’s central banks, perhaps you are also confused between regulatory and legislative loopholes in the financial sector and central bank operation. In that sense it is much like the assessment of climate variables in which there is disagreement on what happened even in hindsight. I stand by the soundness of the statistical technique in the paper and its common use in other research fields.

  106. Joe says:
    January 4, 2013 at 6:25 am
    richard telford says:
    January 3, 2013 at 4:47 pm
    […] If Beenstock et al’s method cannot find the relationship between CO2 and temperature in the model, then it cannot be trusted if it cannot find the relationship in the real world.

    Sorry, Richard, but that’s a complete logical fallacy and displays a serious misunderstanding of the nature of scientific testing (whether statistical or physical). There are many valid tests which have asymmetric reliability for positive and negative results.
    —————
    Since there is nothing wrong with what you wrote, and I don’t say anything about asymmetrical reliability, I can only assume that you misunderstood what I wrote.

    Beenstock et al. did not explore the Type II error rate of their method. Therefore, when they find no relationship, how sure can we be that there really is no relationship, rather than that the apparent absence of a relationship is due to their method having little statistical power? I would not be in the least surprised if their method had little power. They would not be the first people to proclaim an important negative result while using a low-powered method.

    I am simply proposing a means by which the Type II error rate of their method could be established. If they could demonstrate that their method had high power on artificial data, more credibility could be given to their analysis on real data.

  107. DeWitt Payne says:

    richardscourtney,
    E&E will publish pretty much anything. Saying your paper is peer reviewed does not put it in the same league as the papers in, for example, Wigley and Schimel. Gerlich & Tscheuchner’s falsification paper and Miskolczi’s papers were similarly peer reviewed. They’re still wrong. In the end, many peer reviewed papers in the mainstream journals will turn out to be wrong. Sturgeon’s Law (or Revelation) is that 90% of everything is crud. I haven’t read your paper, but I’m betting that you cite Beck and/or Jaworowski. If that is the case, then your paper is definitely in the 90% category.

    The fact is that human emissions of CO2 are more than enough to explain the increase in atmospheric CO2. And the model using only human CO2 emission fits the observed levels very well. Any natural process would have to alter that relationship. The only alteration observed is the so-called missing sink. That caused a reduction in the rate of atmospheric CO2 concentration increase, not an increase in the rate.

  108. Joe says:

    DeWitt Payne says:
    January 4, 2013 at 7:31 am

    The fact is that human emissions of CO2 are more than enough to explain the increase in atmospheric CO2. And the model using only human CO2 emission fits the observed levels very well.

    The model of the world being flat was more than enough to explain the observation that sailors never returned from over the horizon. Didn’t make it right though!

  109. DeWitt Payne says:

    Joe,

    In this case, what that means is that the analysis may well falsely indicate “a relationship” in random data but won’t (or is very unlikely to) indicate “no relationship” in data where causality does exist. So getting a result of “no relationship” in this case is a reliable indication that the data are NOT connected even though it would NOT have been reliable indication that they were connected if it had given a result of “relationship”.

    I suggest you research the various unit root tests and cointegration theory. Beenstock et al. do not find that there is no relationship. They find that any relationship must be spurious because of the structure of the time series. But time series testing for unit roots has problems when there is a non-linear deterministic trend in the data. The tests will find unit roots when none are actually present. Worse, there is no consensus on whether or how to remove deterministic trends before testing.

    In fact, there is good reason to believe that the unforced temperature series cannot have d > 0.5, the limit for long-term persistence. Thus, finding values of d ~ 1 should have been a red flag that the tests were being improperly applied and/or that the series was probably being forced. But Beenstock et al. are economists, not physical scientists.
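    The deterministic-trend hazard can be demonstrated directly. The sketch below is mine (a hand-rolled Dickey-Fuller regression with a constant only, not the battery of tests Beenstock et al. actually used): applied to a series that is just i.i.d. noise around a quadratic trend, it fails to reject a unit root even though none is present.

```python
# Hedged sketch: Dickey-Fuller regression diff(y)_t = alpha + rho*y_{t-1} + e_t.
# A constant-only version "finds" a unit root in a trend-stationary series
# whose only non-stationarity is a quadratic deterministic trend.
import numpy as np

rng = np.random.default_rng(0)

def df_tstat(y):
    """t-statistic on rho in: diff(y)_t = alpha + rho*y_{t-1} + e_t."""
    dy = np.diff(y)
    ylag = y[:-1]
    X = np.column_stack([np.ones_like(ylag), ylag])
    beta, *_ = np.linalg.lstsq(X, dy, rcond=None)
    resid = dy - X @ beta
    s2 = resid @ resid / (len(dy) - 2)
    cov = s2 * np.linalg.inv(X.T @ X)
    return beta[1] / np.sqrt(cov[1, 1])

t = np.arange(200.0)
noise = rng.normal(0, 1, 200)      # i.i.d. noise: NO unit root anywhere
y_trend = 0.001 * t**2 + noise     # stationary about a quadratic trend

DF_CRIT_5PCT = -2.86               # approx. constant-only 5% critical value

print(df_tstat(y_trend))   # > -2.86: fails to reject a (spurious) unit root
print(df_tstat(noise))     # << -2.86: rejects decisively once detrended
```

    The trending series fails to reject a unit root even though none is present, while the same noise with the trend removed rejects decisively, which is exactly why testing without first handling the deterministic trend is hazardous.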

    You really should read the article that was linked earlier. It’s a complete rejection of cointegration theory in no uncertain terms.

  110. richardscourtney says:

    DeWitt Payne:

    Your post at January 4, 2013 at 7:31 am consists solely of more unsubstantiated assertion from you.

    Clearly, facts and evidence have no possibility of breaking through the armour you have put around your beliefs. You are entitled to believe whatever you want, but I prefer to consider the science and what it indicates.

    You say

    The fact is that human emissions of CO2 are more than enough to explain the increase in atmospheric CO2. And the model using only human CO2 emission fits the observed levels very well.

    Yes, the “human emissions of CO2 are more than enough to explain the increase in atmospheric CO2”: I said that. And I also stated the fact that many natural effects also explain the increase in atmospheric CO2 much better, but you ignore that fact because it does not fit with what you want to believe.

    And you don’t say which model you mean when you say “the model using only human CO2 emission fits the observed levels very well”. If you mean the Bern Model then it doesn’t fit the observed levels: it requires unjustifiable smoothing of the data to make it fit.

    Our paper provides three models which each uses only human CO2 emission and each fits the observed levels perfectly within the measurement errors and with no smoothing. But so what? Our paper also provides three models which each has the change induced by a different natural cause and they each also fit the observed levels perfectly within the measurement errors and with no smoothing.

    As I said,

    Data that fits all the possible causes is not evidence for the true cause.

    I want to know the true cause(s).

    Reality is what it is, and your beliefs cannot change reality whatever it is.

    Richard

  111. richardscourtney says:

    DeWitt Payne:

    This is an addendum to my reply to your post at January 4, 2013 at 7:31 am as substantiation of my claim concerning your beliefs.
    You say to me

    I haven’t read your paper, but I’m betting that you cite Beck and/or Jaworowski. If that is the case, then your paper is definitely in the 90% category.

    Our paper cites neither Beck nor Jaworowski.
    You would have been able to assess the paper if you had read it.

    Your words I quote here are an example of you ‘making stuff up’ in a fallacious attempt to justify your fallacious assertions.

    If you had an argument worth making then you would make it instead of inventing things in your mind as self-serving justification of your assertions. Those assertions can only be beliefs because they are based solely on assertions justified by untrue assumptions.

    Richard

  112. Philip Shehan says:

    Quoting from the paper:

    “3.1 Time series properties of the data

    Informal inspection of Fig. 1 suggests that the time series properties of greenhouse gas forcings (panels a and b) are visibly different to those for temperature and solar irradiance (panel c). In panels a and b there is evidence of acceleration, whereas in panel c the two time series appear more stable.”

    Informal inspection of the temperature data of panel c does show acceleration, matching that of the greenhouse gas forcing plots in panels a and b. The temperature rise appears less dramatic due to the different scaling factors used in the three plots, but the acceleration of the temperature in the last 40 years compared to the previous 80 is clear to the naked eye. This is confirmed by a formal fit of the temperature data to a nonlinear equation.

  113. Philip Shehan says:

    Apologies for not including the nonlinear plot in the previous post.

    http://www.skepticalscience.com/pics/AMTI.png

  114. brians356 says:

    A little late to the party here, but my friend in DC (must remain anonymous, but is an energy division lead economist for a prominent three-letter agency) says:

    “This is interesting. I have no idea what climate change modelers have done but if their claims of causality in an empirical sense have not taken tests for stationarity of the underlying time series then the regression results would be possibly meaningless. A big if. This is not new statistics and I doubt folks have ignored it. But, like I said, I do not know what the empirical climate models have done.”

    Any germane comments welcome.

  115. Joe says:

    DeWitt Payne says:
    January 4, 2013 at 7:55 am

    lots of irrelevant stuff

    In case you hadn’t noticed, my post was nothing to do with the validity or otherwise of the paper’s findings. I’ll leave that up to people far more qualified than me (or, likely, you) to determine.

    But Richard Telford’s original post (in which I ignored the fallacy) and his follow-up contained a very basic logical fallacy: that a test can’t be any good unless it provides reliable results in both directions. Hence his concern about Type II error levels in the original, when Type II errors play no part in the validity regarding Type I.

    I considered building a nice analogy to demonstrate the flaw in his reasoning but decided it was easier, and more relevant, to explain it in terms of the paper under discussion. To explain the logical flaw in requiring tests to have equal (or even known) errors of both types didn’t require any discussion about whether or not the analysis in the paper is appropriate. Indeed, introducing such discussion would only obfuscate the point I was explaining to Mr Telford.

    Perhaps you should try to fully understand what people are saying before you expect them to accept your own points. After all, given your apparent miscomprehension of my post, one does wonder how much you might actually comprehend (as opposed to simply repeating from somewhere) the far more technical matters that you’re using to criticise the paper.

  116. MattS says:

    richardscourtney,

    “Those assertions can only be beliefs because they are based solely on assertions justified by untrue assumptions.”

    They would still only be beliefs even if they were backed by true assumptions. It only matters that what backs the assertion is an assumption rather than evidence.

    :)

  117. richardscourtney says:

    MattS:

    re your post addressed to me at January 4, 2013 at 9:07 am.

    Yes, of course you are right. I stand corrected. Thank you.

    Richard

  118. Gary Pearse says:

    “While much of the scientific research into the causes of global warming has been carried out using calibrated general circulation models (GCMs), since 1997 a new branch of scientific inquiry has developed in which observations of climate change are tested statistically by the method of cointegration.”

    Gee, a new branch of inquiry based on observations – this is the Achilles heel that the hockey team will exploit in debunking this upstart idea.

  119. Bart says:

    JazzyT says:
    January 4, 2013 at 12:12 am

    “there will be a 180 degree phase shift due to the 24-month average”

    The WoodForTrees site automatically shifts the moving average to have zero phase offset.

    “But anthropogenic CO2 would show up mostly as a steady value (an offset) in the derivative of CO2, if it’s the relatively constant rise that people seem to think that it is, and that the Keeling curve indicates.”

    Anthropogenic CO2 would show up as a trend in the CO2 derivative, because production has been steadily increasing. There is no room for such an additional term, because the slope is already accounted for by the temperature relationship.

    “Measurements of carbon isotopes in atmospheric CO2 show a decreasing concentration of C-13, consistent with the notion that increased CO2 arises from these anthropogenic sources.”

    “Consistent with” is not proof. The derivative relationship I have shown reveals that the consistency is spurious happenstance.

    richardscourtney says:
    January 4, 2013 at 6:21 am

    “…but they don’t provide the complete ‘answer’ which Bart assumes.”

    We’ve been over this many times and are not going to agree. But, for the record, I do not assume, I observe. The match is virtually perfect and seamless across the observable frequency spread. It is clear that temperature is in the driver’s seat.
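    Bart's derivative relationship can be made concrete with a toy construction (mine, with made-up numbers; the rate constant k and baseline T0 below are illustrative, not his fitted values): if dCO2/dt = k·(T − T0), then integrating a temperature-anomaly series yields a rising CO2-like curve whose first difference recovers the temperature shape exactly. Note this demonstrates only consistency, not the direction of causality:

```python
# Toy sketch of the claimed derivative relationship dCO2/dt = k*(T - T0).
# Integrate a synthetic temperature anomaly into a CO2-like curve, then
# difference it to recover the temperature shape.  Illustrative only.
import math

# Synthetic temperature anomaly: slow rise plus a cycle (made-up numbers).
T = [0.005 * t + 0.1 * math.sin(2 * math.pi * t / 60) for t in range(600)]

k, T0 = 0.2, -0.1                  # hypothetical rate constant and baseline
co2 = [290.0]                      # ppmv; illustrative starting level
for temp in T:
    co2.append(co2[-1] + k * (temp - T0))

# Differencing the integrated series recovers k*(T - T0) exactly.
dco2 = [b - a for a, b in zip(co2, co2[1:])]
recovered_T = [d / k + T0 for d in dco2]

max_err = max(abs(a - b) for a, b in zip(T, recovered_T))
print(max_err)  # ~0: the derivative matches temperature by construction
```

    Because the match holds by construction whenever the integral relationship holds, a good derivative fit on its own cannot settle which variable is "in the driver's seat"; that is precisely what the two sides of this exchange dispute.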

    DeWitt Payne says:
    January 4, 2013 at 7:31 am

    “The fact is that human emissions of CO2 are more than enough to explain the increase in atmospheric CO2.”

    The fact is, this tells you nothing about whether it is responsible for it, only whether it could be.

    “And the model using only human CO2 emission fits the observed levels very well.”

    It fits very poorly in the fine detail. As I show, the model using temperature only fits the observed levels very well, too. But across all frequencies, not just in the quadratic term.

    “Any natural process would have to alter that relationship.”

    The relationship is spurious. It is happenstance. And, it is not at all an unlikely thing to have two increasing time series match a low order polynomial when you can add an arbitrary offset and scaling.

    MattS says:
    January 4, 2013 at 9:07 am

    “richardscourtney,

    “Those assertions can only be beliefs because they are based solely on assertions justified by untrue assumptions.”

    And, so we reach a state in which an erroneous conclusion propagates from an initial erroneous conclusion which gets all but forgotten, and is always referred to but never reexamined. A review of Feynman’s recounting of the measurement of the electron charge might be in order. Nobody wanted to go too far from Millikan’s value. Scientists are social animals, too, and they often seek safety in the herd.

  120. Tom in Indy says:

    Philip

    Given your claim that temperature change has been accelerating over the last 40 years, the lack of acceleration over the last 15 years (nearly 40% of the period in question) contradicts your claim. Try fitting a linear trend, a concave trend and your convex trend to the data for the period 1970–2012 and report back with the R² values. I have a hunch that your convex trend will produce the poorest fit.

  121. richardscourtney says:

    Bart:

    I am replying to a comment in your post at January 4, 2013 at 9:58 am for the information of others. You say

    richardscourtney says:
    January 4, 2013 at 6:21 am

    …but they don’t provide the complete ‘answer’ which Bart assumes.

    We’ve been over this many times and are not going to agree. But, for the record, I do not assume, I observe. The match is virtually perfect and seamless across the observable frequency spread. It is clear that temperature is in the driver’s seat.

    Yes, we have “been over this many times” and it is clear that we “are not going to agree”.

    I have quoted your view here and my view is explained in my post at January 4, 2013 at 6:21 am which you cite.

    There are those (e.g. DeWitt Payne) who state certainty that the recent rise in atmospheric CO2 concentration has an anthropogenic cause. And there are others (e.g. yourself) who state certainty that the recent rise in atmospheric CO2 concentration has a natural cause.

    I remain ‘on the fence’ about the causality until I see data which convinces me to ‘get off the fence’ on one side. Your data convinces you but not me that I should ‘get off the fence’ on your side.

    Richard

  122. newcanf says:

    DeWitt Payne.

    You are jumping the shark here. Cointegration is a longstanding and mainstream method; Granger and Engle won the Nobel prize for their work in pioneering the field. To rebut this, you repeatedly link to a single paper in a minor finance journal (as a financial professional, I have never even heard of the journal). According to Google Scholar, the paper has been cited a total of three times – all three by the author himself! Given the hundreds of papers published on cointegration in any year, this is not a very impressive achievement. So you disparage E&E, but somehow place significant reliance on this fringe paper in a fringe journal. Not very consistent.

  123. Philip Shehan says:

    Sorry if this is a repost but I think I messed up the first attempt.

    Tom,

    I was commenting on the authors’ statement about their figure presenting their data from 1880 to the present. That is their chosen data set.

    As a scientist I am used to looking at such graphs, but believe that even “informal” examination by the untrained eye can discern an accelerating trend in the data. In case some people were having trouble, I simply suggested concentrating on the last 40 years of data compared to the previous 80 (actually 90) and “informally” making a linear fit with the mind’s eye. This impression can be confirmed by actual linear fits to the temperature data for the periods 1880 to the present, 1880 to 1969, and 1970 to the present.

    http://www.woodfortrees.org/plot/gistemp-dts/from:1880/to:2013/plot/gistemp-dts/from:1970/to:2013/trend/plot/gistemp-dts/from:1880/to:1969/trend/plot/gistemp-dts/to:1880/to:2013/trend.

    The fit for all the data is clearly inferior to the nonlinear fit. (Unfortunately the r-squared values for the linear fits are not given, nor is the function for the nonlinear plot, but it appears to be a second-order polynomial or exponential.)

    http://www.skepticalscience.com/pics/AMTI.png

    With regard to a nonlinear fit for the past fifteen years, temperature data is much noisier than greenhouse gas concentration, as the former is also dependent on factors such as solar output, volcanic eruptions, and El Niño and La Niña events, to name some of the most significant. Temperature trends must be analysed over multidecadal time periods. The noisy data means that the linear function from 1970 to the present is reasonable but is inferior to the nonlinear fit over the longer period.

  124. D Böehm says:

    Philip Shehan says:

    “Informal inspection of the temperature data of panel c does show acceleration…”

    Wrong.

    But I knew this would happen. As Werner Brozek repeatedly shows in great detail and based on extensive data sets, there is no recent acceleration of global temperatures. Faced with that undeniable fact, the alarmist crowd has one of two choices:

    1. Admit that despite the rise in CO2, there has been no acceleration of global temperatures, and reassess their failed conjecture, or…

    2. Lie about it.

    Global temperatures are not accelerating. In fact, as the WFT chart shows, global warming has stopped for the past decade and a half. Claiming that global temperatures are “accelerating” when the data shows otherwise is pure mendacity.

  125. Philip Shehan says:

    D.Boehm,

    All I can do is redirect you to my 11.37 PM post.

    Again, I am specifically analysing the data, and the claims made for it from 1880 by the paper’s authors. I have tried to be polite, but since you are implying I am a liar: only wilful self-delusion, ignorance or dishonesty can lead you to ignore mathematical analysis of the entire data set and cherry-pick a 5% segment of the total data, carefully selected to begin with the extreme El Niño southern summer of 1997-98, which in no way invalidates the 130 year trend.

    Why didn’t you pick the 15 year period between 1940 and 1955 to prove that temperatures from 1880 to present have been dropping?

    http://www.woodfortrees.org/plot/gistemp-dts/from:1940/to:1955/plot/gistemp-dts/from:1940/to:1955/trend

  126. Matthew R Marler says:

    When the paper was first put up on Beenstock’s web page I bought a couple of books on the topic of non-linear co-integrated vector autoregressive (VAR) processes. Linear co-integrated VAR processes have been studied for decades. Except for the possibility of programming errors (and I hope that the authors follow a recently and widely but not universally promoted standard of putting all of their code, data, intermediate results, etc on line), I have two criticisms of the paper:

    1. The standard: it is really hard to infer causation from vector time series without interventions (interventions can be conducted in chemical process control, where VAR processes have been used with success). All they have shown, with that caveat in mind, is that it is possible, contrary to a claim by IPCC AR4, to create and estimate a reasonable model for 20th century temperature change that gives little or no weight to CO2 changes. In a sense, this is a complicated counterpoise to Vaughan Pratt’s modeling of a few weeks ago, in which he showed that, assuming a functional form for the CO2 effect, he could estimate a filter to reveal that functional form. In each case, by enlarging the total field of functions under consideration, you can generally get a model to justify any a priori chosen conclusion.

    2. I would like to have seen more graphs displaying the estimated non-linear relationships between the measured variables at each time point and the full set of variables at each lag: model and data. This is among the things that I hope they provide on line, but if they put up their data, model and results it will be possible for others (maybe I will) to produce those plots.

    I think the paper is a solid contribution to the topic of modeling multivariate climate data. Now that their model has been published, its predictions can be updated as new data on CO2 concentrations and solar indices become available, and we can see how well it does on “1-year ahead”, “5-year ahead” and “10-year ahead” predictions without changing model parameters. As with all of the other models, if the mean square prediction error is small enough, we may begin to rely on its predictions.

  127. Matthew R Marler says:

    Richard Telford: Beenstock et al. did not explore the Type II error rate of their method. Therefore, when they find no relationship, how sure can we be that there really is no relationship, rather than that the apparent absence of a relationship is due to their method having little statistical power? I would not be in the least surprised if their method had little power. They would not be the first people to proclaim an important negative result while using a low-powered method.

    I agree with you.

    I do not consider that in evaluating whether the paper is good, because it is already a long and technically dense paper, and I think that topic can be addressed later. However, I also think that it is possible to elaborate modeling sufficiently to achieve almost any desired result on extant data, so I believe that the potential Type I and Type II error rates for particular hypotheses are effectively 1 – not for a particular test, but for the procedure of multiple model fitting and multiple testing. Nobody is naive any more; the authors had already thought about potential models and what might produce null results for particular tests (I would bet) even before they started modeling. With so many people having already done so much modeling on so much data, the only hope for model comparisons and hypothesis tests must depend on future data.

  128. DeWitt Payne says:

    richardscourtney,

    I’m feeling the need for amusement, but there’s no way I’m going to spend £18 to purchase a copy of your article. Put a pdf online somewhere and post a link.

  129. Philip Shehan says:

    Pardon the error above. 15 out of 130 years is 11%

  130. RACookPE1978 says:

    Philip Shehan says:
    January 4, 2013 at 1:02 pm

    I have tried to be polite, but since you are implying I am a liar: only wilful self-delusion, ignorance or dishonesty can lead you to ignore mathematical analysis of the entire data set and cherry-pick a 5% segment of the total data, carefully selected to begin with the extreme El Niño southern summer of 1997-98, which in no way invalidates the 130 year trend.

    Why didn’t you pick the 15 year period between 1940 and 1955 to prove that temperatures from 1880 to present have been dropping?

    Because, you see, we are trying to determine why the 360-year trend (1650–2000) of slowly rising temperatures – all from natural causes unrelated to a CO2 increase between 1950 and 2013 – is being ignored by those whose funding, power, future promotions and employment are threatened by the evidence; while you are trying to force a rising-CO2 and rising-temperature relationship, valid ONLY from 1973 to 1998, onto a 130 year period.

    Further, why is a single 25 year period (1973 – 1998) “valid, critical, and worth destroying the world’s economy over” (while killing millions), while the 15 year period 1997 – 2012 is ignored? The warming slope STOPPED in 1998. Why can you not recognize that fact? Whether it will begin again we do not know – but you cannot pretend that warming, an increase in temperatures, is continuing. The solar experts are predicting a 1.5 degree temperature drop over the next solar cycles … are you mentally ready for that?

    Don’t blame aerosols either – the Mauna Loa atmospheric visibility index has remained unchanged since before 1950 (other than two volcanic peaks).

    Is your funding, your life, your health threatened by the 15 year period? Are you content urging the absolute, immediate and assured death of millions because of that one 25 year period out of 350 that you fear “might” cause minor problems 90 years from now? I could have saved millions from poverty, hunger, and disease through better water, better transportation, better heat, better food preparation and storage, and better shelter and clothing for a fraction of what YOU wasted on CAGW trips and politics alone. Today’s worldwide economic crisis BEGAN in the energy policies demanded by those in power in academia and the media and politics who used YOUR typical AGW propaganda to destroy energy production and movement.

    In the meantime, while you fear a 1/10 of 1% chance of a “might be a problem” for 20,000, you guarantee disaster now for billions more.

  131. DeWitt Payne says:

    Here’s a Ph.D. dissertation on using VAR-ML to model temperature vs CO2, among other things. Its conclusions include that there is two-way Granger causality between temperature and CO2, particularly including the glacial epochs. Somebody, not me, might want to point this out to Willis E., who seems to think that Granger causality in this case is only one way.
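    For readers unfamiliar with the term: X “Granger-causes” Y if past values of X improve forecasts of Y beyond what Y’s own past already provides. A minimal sketch on synthetic data follows; the AR coefficients and the restricted-vs-unrestricted residual-sum-of-squares comparison are my own illustration of the idea, not the dissertation’s VAR-ML machinery.

```python
import random

random.seed(42)

# Two synthetic AR(1) series: x drives y with a one-step lag,
# but y does not feed back into x (one-way causality by construction).
n = 2000
x, y = [0.0], [0.0]
for _ in range(n):
    x.append(0.5 * x[-1] + random.gauss(0, 1))
    y.append(0.5 * y[-1] + 0.4 * x[-2] + random.gauss(0, 1))  # x[-2] is x at t-1

def rss(dep, preds):
    """Residual sum of squares of an OLS fit with one or two regressors
    (no intercept; the simulated series are zero-mean)."""
    if len(preds) == 1:
        (p,) = preds
        beta = sum(a * d for a, d in zip(p, dep)) / sum(a * a for a in p)
        resid = [d - beta * a for a, d in zip(p, dep)]
    else:
        p1, p2 = preds
        s11 = sum(a * a for a in p1)
        s22 = sum(a * a for a in p2)
        s12 = sum(a * b for a, b in zip(p1, p2))
        c1 = sum(a * d for a, d in zip(p1, dep))
        c2 = sum(a * d for a, d in zip(p2, dep))
        det = s11 * s22 - s12 * s12          # solve the 2x2 normal equations
        b1 = (s22 * c1 - s12 * c2) / det
        b2 = (s11 * c2 - s12 * c1) / det
        resid = [d - b1 * a - b2 * b for a, b, d in zip(p1, p2, dep)]
    return sum(r * r for r in resid)

# Granger's question: does adding the *other* series' lag shrink the
# prediction error beyond what the series' own lag achieves?
y_t, y_lag, x_t, x_lag = y[2:], y[1:-1], x[2:], x[1:-1]
fwd = rss(y_t, [y_lag]) / rss(y_t, [y_lag, x_lag])  # x -> y: ratio well above 1
bwd = rss(x_t, [x_lag]) / rss(x_t, [x_lag, y_lag])  # y -> x: ratio close to 1
print(f"x->y improvement ratio {fwd:.3f}, y->x improvement ratio {bwd:.3f}")
```

    Because causality here runs only from x to y by construction, the forward ratio comes out well above 1 while the backward ratio stays near 1; a real two-way finding, as in the dissertation, would show both ratios significantly above 1.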

  132. richardscourtney says:

    DeWitt Payne:

    Your post at January 4, 2013 at 1:23 pm says to me in total

    I’m feeling the need for amusement, but there’s no way I’m going to spend £18 to purchase a copy of your article. Put a PDF online somewhere and post a link.

    You make two points and I address each of them.

    If you want “amusement” then I suggest you read the posts you have made in this thread, because they may give you the belly laugh which they give me.

    If you don’t want to obtain a copy of our paper then watch the lecture by Murray Salby at
    http://www.youtube.com/watch?v=ZVCps_SwD5w&feature=autoplay&list=PLILd8YzszWVTp8s1bx2KTNHXCzp8YQR1z&playnext=2

    Salby’s lecture says the same as the analysis in our paper except that
    (a) Salby does not conduct the attribution studies which we included to demonstrate the findings,
    (b) we did not make Salby’s assessment of soil moisture effects, and
    (c) Salby concludes that because natural changes can be the sole cause, they are the cause of the recent rise in atmospheric CO2; we were not willing to accept that because – although it is unlikely – anthropogenic emissions could also be the cause.

    Indeed, Salby uses some very similar words to paragraphs in our paper. N.b. this is NOT an accusation of plagiarism: a clear statement of the same facts is likely to use the same or similar words.

    A summary of Salby’s lecture with his main slides is at
    http://hockeyschtick.blogspot.co.uk/2012/09/climate-scientist-dr-murry-salby.html

    Watch the lecture, check its facts, and you may learn something despite your prejudice.

    Richard

  133. willb says:

    DeWitt,

    I read your article (“Where Has All the Carbon Gone?”) at the Air Vent. It’s an informative article, with interesting results and interesting comments. I’m not sure it really provides much insight or evidence for Ferdinand’s Mass Balance argument, though. The Bern Carbon Cycle model you investigated in that article is for the most part an empirical model, is it not? I believe it is constructed under the assumption that humans are the cause of the recent increase in atmospheric CO2. I don’t think you can use an empirical model (which is tuned to provide a good fit to the data) as evidence to support one of its own input assumptions.

    Regarding isotope ratio, the Earth has been sequestering carbon through the biosphere for billions of years. IMHO just about any reasonable terrestrial source for increased atmospheric CO2 will have been ultimately filtered by plants and would therefore likely have an isotopic ratio similar to that of burning fossil fuels.

    The oxygen depletion evidence is also speculative and far from overwhelming. Correlation of oxygen depletion with fossil fuel burning is not that great and IMHO it is just as likely (and just as speculative to conclude) that the depletion is due to land use changes, or perhaps simply due to natural changes in the biosphere as temperature and CO2 rise.

  134. Philip Shehan says:

    RACookPE1978 says:
    January 4, 2013 at 1:26 pm….

    Once again.

    I am commenting on the claims made in the paper for the temperature record from 1880. The authors’ claim that there is no accelerated warming over that period is not supported by examination of linear and non-linear fits for that period.

    My rhetorical question to D Boehm about why the 15 year period is any more indicative of a 130 year trend than the period 1945–50 is intended to show that cherry-picking data gives no indication whatsoever of the long term trend.

    And I do not recognise the fact that warming has stopped since 1998 because it has not.

    You are not just cherry picking a 15 year period to arrive at that conclusion, you are cherry picking the extreme southern el nino summer of 1997/1998. One summer does not a trend make.

    Since 1996 there has been a warming trend of 0.1 C per decade. Since 1999 there has been a warming trend of 0.1 degree per decade. And this even with the cooling La Niña weather pattern of the last two years.

    http://www.woodfortrees.org/plot/wti/from:1995/to:2013/plot/wti/from:1996/to:2013/trend/plot/wti/from:1998/to:2013/trend/plot/wti/from:1999/to:2013/trend

    So explain to me how there can have been no warming since 1998, but warming since 1996 and warming since 1999.

    By the way, I am writing from Australia, where the Bureau of Meteorology is debating whether the end of the El Niño period means we are reverting to normal warmer and drier conditions or whether we are in for above average heat.

    As far as tipping future temperature trends goes, it is unscientific to put too much reliance on individual weather events. That said, the entire continent is in the grip of a heat wave. It was 106 F here in Melbourne yesterday (way down south, the cool part) while bushfires are burning in Tasmania (the island state further south). Last night I was unable to sleep in the heat, so I am typing out these pearls of wisdom in the night. So, unscientific or not, my money is on a hot 2013.

  135. John Whitman says:

    From the Discussion section of ‘Polynomial cointegration tests of anthropogenic impact on global warming’ by M. Beenstock, Y. Reingewertz, and N. Paldor, published in the journal Earth System Dynamics:

    {all emphasis by me, John Whitman}

    The fact that since the mid 19th century Earth’s temperature is unrelated to anthropogenic forcings does not contravene the laws of thermodynamics, greenhouse theory, or any other physical theory. Given the complexity of Earth’s climate, and our incomplete understanding of it, it is difficult to attribute to carbon emissions and other anthropogenic phenomena the main cause for global warming in the 20th century. This is not an argument about physics, but an argument about data interpretation. Do climate developments during the relatively recent past justify the interpretation that global warming was induced by anthropogenics during this period? Had Earth’s temperature not increased in the 20th century despite the increase in anthropogenic forcings (as was the case during the second half of the 19th century), this would not have constituted evidence against greenhouse theory. However, our results challenge the data interpretation that since 1880 global warming was caused by anthropogenic phenomena.

    Nor does the fact that during this period anthropogenic forcings are I(2), i.e. stationary in second differences, whereas Earth’s temperature and solar irradiance are I(1), i.e. stationary in first differences, contravene any physical theory. For physical reasons it might be expected that over the millennia these variables should share the same order of integration; they should all be I(1) or all I(2), otherwise there would be persistent energy imbalance. However, during the last 150 yr there is no physical reason why these variables should share the same order of integration. However, the fact that they do not share the same order of integration over this period means that scientists who make strong interpretations about the anthropogenic causes of recent global warming should be cautious. Our polynomial cointegration tests challenge their interpretation of the data.

    Finally, all statistical tests are probabilistic and depend on the specification of the model. Type 1 error refers to the probability of rejecting a hypothesis when it is true (false positive) and type 2 error refers to the probability of not rejecting a hypothesis when it is false (false negative). In our case the type 1 error is very small because anthropogenic forcing is I(1) with very low probability, and temperature is polynomially cointegrated with very low probability. Also we have experimented with a variety of model specifications and estimation methodologies. This means, however, that as with all hypotheses, our rejection of AGW is not absolute; it might be a false positive, and we cannot rule out the possibility that recent global warming has an anthropogenic footprint. However, this possibility is very small, and is not statistically significant at conventional levels.

    - – - – - – - – - -

    Their conclusion’s significance lies in demonstrating that the observations (time series data) cannot with significant confidence be interpreted as supporting a strong case for anthropogenic causes of recent global warming. Considerable caution is warranted about any interpretation that supports a strong AGW causation.

    And I find they are bending over pretty far backwards to show how they might be wrong. Good for them. As part of that, they directly allow for physical interpretation as well as data interpretation. I think the paper is a positive step in the right direction toward more analysis, based on both its achievements and its shortcomings. I look forward to papers relating to it, whether they support it or not.

    Some might criticize them about the data sets they chose. But doing exactly what they did with different data sets is made much easier by their willingness to have the paper completely open and transparent in all respects.

    It would interest me if the very same kind of study could be done over both the Holocene and also over the last 2000 yrs. Does anyone know whether anyone has done similar statistical studies, or is in the process of doing them?

    John
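    A quick way to see what the quoted I(1)/I(2) terminology means: a series is I(d) if it must be differenced d times before it looks stationary. A toy pure-Python illustration with synthetic random walks (no connection to the actual forcing or temperature series):

```python
import random
import statistics
from itertools import accumulate

random.seed(1)

def diff(s):
    """First difference of a series."""
    return [b - a for a, b in zip(s, s[1:])]

# White noise is I(0).  Its running sum (a random walk) is I(1): it
# wanders far from zero, but its first difference is stationary.
# Summing again gives an I(2) series, which needs differencing twice.
noise = [random.gauss(0, 1) for _ in range(5000)]
i1 = list(accumulate(noise))   # I(1): a random walk
i2 = list(accumulate(i1))      # I(2): a cumulated random walk

sd = statistics.pstdev
print("I(1) spread:", round(sd(i1), 1), "-> after one difference:", round(sd(diff(i1)), 2))
print("I(2) spread:", round(sd(i2), 1), "-> after two differences:", round(sd(diff(diff(i2))), 2))
```

    The relevance to the paper: two series with different orders of integration, such as an I(2) forcing and an I(1) temperature, cannot be cointegrated in the ordinary sense, which is why the authors resort to polynomial cointegration tests.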

  136. D Böehm says:

    Philip Shehan,

    Your comment regarding ‘accelerating’ temperatures was pretty self-explanatory. However, temperatures are not accelerating. The only way to show they are is with a false artifact using a cherry-picked chart.

    The longer the warming trend shown, the better. Here is a chart going back to 1850 and showing steadily rising global temperatures as the planet recovers from the LIA.

    Note that there is no acceleration of global warming. None. The planet has been warming along the same trend line [the declining green line] for hundreds of years, and global warming has not accelerated. In fact, rather than accelerating, global warming has stopped for the past decade and a half.

    There are any number of records that show that long term global warming has remained on the same trend line, and within well defined parameters. That trend has not changed despite the ≈40% rise in CO2.

    Conclusion: the effect of CO2 is vastly overstated.

    In fact, there is no evidence that the rise in CO2 does not have a cooling effect. The only empirical evidence we have shows that ∆CO2 follows ∆T. There are no empirical measurements showing AGW. Thus, AGW is merely a conjecture, and until/unless it is reliably measured, no more public money should be wasted on such ‘climate studies’. Sorry if that gores your ox.

  137. DirkH says:

    Philip Shehan says:
    January 4, 2013 at 2:35 pm
    “As far as tipping future temperature trends goes, it is unscientific to put too much reliance on individual weather events. ”

    I disagree completely. What the CO2AGW scientists need to show is evidence for the positive water vapor feedback thermal runaway. For this, they need the following conditions:
    No wind
    High humidity
    Elevated CO2
    Insolation, no clouds

    These conditions are the testbed. We should see a local thermal runaway within minutes, as radiative energy exchange is a fast process. This would be a local weather event not possible in the past, and proof for the thermal runaway conjecture. Specifically, we should see an absolute all time high temperature record for the continent on which it happens.

    Good luck finding one.

  138. John Whitman says:

    From the Abstract section of ‘Polynomial cointegration tests of anthropogenic impact on global warming’ by M. Beenstock, Y. Reingewertz, and N. Paldor, published in the journal Earth System Dynamics:

    This implies that recent global warming is not statistically significantly related to anthropogenic forcing. On the other hand, we find that greenhouse gas forcing might have had a temporary effect on global temperature.

    From the Discussion section of ‘Polynomial cointegration tests of anthropogenic impact on global warming’ by M. Beenstock, Y. Reingewertz, and N. Paldor, published in the journal Earth System Dynamics:

    However, we find that greenhouse gas forgings might have a temporary effect on global temperature.

    The implication of our results is that the permanent effect is not statistically significant. Nevertheless, there seems to be a temporary anthropogenic effect. If the effect is temporary rather than permanent, a doubling, say, of carbon emissions would have no long-run effect on Earth’s temperature, but it would increase it temporarily for some decades.

    - – - – - – - -

    The possibility of a relatively ‘temporary’ anthropogenic effect on temperature rather than a relatively ‘permanent’ effect has suggestive implications:

    a – Reconsideration is needed of the adequacy of the carbon cycle being promoted via IPCC assessment reports. This paper is a challenge to the IPCC-endorsed carbon cycle that supports the case for an anthropogenic cause of warming.

    b – Wondering what it could mean if it is verified that a change in the rate of change of CO2 concentration may cause a ‘temporary’ change in temperature but a change in CO2 concentration may not cause any temperature change.

    c – Is (b) possibly the CO2 lagged increase from naturally caused heating/mineralization of soil and heating of oceans?

    John

    ["Forcings" instead of "forgings" in their quoted text? Mod]

  139. DeWitt Payne says:

    richardscourtney,

    Salby’s slides were indeed as amusing as I expected. You do know that the seasonal and annual variability is imposed on a much larger trend, don’t you? Removing that trend is rather like ignoring an elephant in the room.

    But if the world ocean is a source rather than a sink, how come the CO2 concentration at the South Pole station is lower than at Mauna Loa, which is lower than Barrow, while the SH has a much lower percent land cover than the NH? The ocean as a sink is a much better explanation for the decrease in seasonal variability from Barrow to Mauna Loa to the South Pole, as well as for the concentration gradient.

    And you still haven’t explained where the 32 Gt of CO2 emitted in 2011, for example, goes. CO2 at Mauna Loa went up by 1.82 ppmv in 2011; that’s equal to about 15 Gt of CO2. I know some of you think mass balance is disreputable black magic, but it’s a standard tool in Chemistry and Chemical Engineering. Short of a nuclear reaction, matter must be conserved. The simplest explanation is that 32 Gt of CO2 went into the atmosphere while the biosphere and the oceans absorbed 17 Gt from the atmosphere. And the rate of removal is indeed proportional to the increase in atmospheric concentration, as one would expect from an equilibrium process. For example: CO2 emissions in 1965 were about 12 Gt while the increase in CO2 concentration was 1.34 ppmv and the concentration was 321 ppmv. That’s about 1 Gt absorbed and 11 Gt left in the atmosphere.

    Also, since a lot of you believe that the global temperature hasn’t changed since 1998, how come the CO2 concentration is still going up? The lag time of the ocean isn’t a few years, btw; it’s closer to 2000 years, and ocean temperature hasn’t gone up as fast as the land temperature. The dissertation I linked to above had the sensitivity of CO2 to temperature as 2.3 ppmv/degree. Of course CO2 is going up: global emissions of CO2 have been increasing.
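    The mass-balance arithmetic in the comment above can be written out explicitly. In the sketch below, the ppmv-to-gigatonne conversion factor is a standard approximation I am supplying (about 2.13 Gt of carbon, hence about 7.8 Gt of CO2, per ppmv), not a number from the thread, so the results land near, but not exactly on, the comment’s figures:

```python
# Mass-balance check on the figures quoted in the comment.
# Assumed conversion: 1 ppmv atmospheric CO2 ~ 2.13 Gt C ~ 2.13 * 44/12 Gt CO2.
GT_CO2_PER_PPMV = 2.13 * 44.0 / 12.0   # about 7.8 Gt CO2 per ppmv

def sink_uptake(emitted_gt, delta_ppmv):
    """Whatever was emitted but did not stay airborne must have been
    absorbed by natural sinks (oceans + biosphere): conservation of mass."""
    airborne_gt = delta_ppmv * GT_CO2_PER_PPMV
    return emitted_gt - airborne_gt

# 2011: ~32 Gt CO2 emitted, Mauna Loa rose 1.82 ppmv.
uptake_2011 = sink_uptake(32.0, 1.82)
# 1965: ~12 Gt CO2 emitted, concentration rose 1.34 ppmv.
uptake_1965 = sink_uptake(12.0, 1.34)

print(f"2011: sinks took up about {uptake_2011:.0f} Gt CO2")
print(f"1965: sinks took up about {uptake_1965:.0f} Gt CO2")
```

    With this conversion the 2011 sink uptake comes out near 18 Gt rather than the comment’s 17 Gt, and 1965 near 1.5 Gt rather than 1 Gt; the small differences reflect the rounded airborne figures used in the comment, not a flaw in the mass-balance reasoning itself.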

  140. Philip Shehan says:

    D Böehm says:
    January 4, 2013 at 2:46 pm…

    Oh please. Now I really don’t want to be rude, but presenting your temperature data with offsets, for no other reason than to stretch the y axis and apparently flatten the real data, is a cheap and ridiculously obvious attempt at deception that just gets my scientist dander up.

    Once again, the data presented realistically, with linear and non-linear fits, both showing acceleration of the warming trend for the period under discussion:

    http://www.woodfortrees.org/plot/gistemp-dts/from:1880/to:2013/plot/gistemp-dts/from:1970/to:2013/trend/plot/gistemp-dts/from:1880/to:1969/trend/plot/gistemp-dts/to:1880/to:2013/trend

    http://www.skepticalscience.com/pics/AMTI.png

  141. E.M.Smith says:

    I see the C isotope wars have broken out again.

    http://chiefio.wordpress.com/2009/02/25/the-trouble-with-c12-c13-ratios/

    We simply can’t know the source of CO2 from isotope ratios. Long list of problems, but one simple one is that we don’t know the isotope ratio in the fuels that were already burned ( it varies from fuel deposit to fuel deposit) and we have no clue what the ratio is from large quantities vented from geologic processes such as the mid-ocean vents, and even liquid BLOBS of CO2 seeping from the ocean at depths where it stays a liquid. And a whole lot more.

    Oh, and CO2 just oozes from the ground all over the planet due to geological processes. In enough concentration to sporadically kill a lot of people and animals. Yellowstone, Mammoth, and many others have CO2 “issues” and have had to close off areas or had animal kills. So unless you know the reason and amount of volcanic / geologic cyclicals and the variations in isotope ratios all over the world (including under the oceans), you can’t say squat. Similarly, do you know the degree to which plankton blooms live and die? That cycles a load of CO2 as well. How about “Fish gut rocks”? Nobody even knew they existed a few years ago. Turns out many ocean fish excrete carbonate deposits and poop them to the ocean floor. Now we’ve hauled vast quantities of fish out of the global oceans, so there are a lot fewer rock-poopers “doing their thing”, which means less CO2 sequestration. Numbers? Hey, folks just figured out lately this was happening at all… but it’s big… and more…

    Then there are the MASSIVE quantities of carbonate washed into the ocean every year from erosion of the rocks of the continents. Care to guess what they do? Yes, guess. Not going to have any actual way to say since it is highly variable and subject to a lot of estimation:

    http://chiefio.wordpress.com/2011/12/12/ocean-carbonate-from-rocks/

    Got any idea what the isotope ratios are for all the rock sources on the planet? Didn’t think so…

    (It will vary, as some are ancient and some are freshly made, like those gut rocks or clam shell middens the Native Americans piled all over Florida.)

    You can start to get an idea of the problem by looking at just some of the sources and sinks:
    http://chiefio.wordpress.com/2010/10/17/where-co2-goes/

    But since there are ‘lakes’ of liquid CO2 on the bottom of the ocean, I think that’s going to be hard to do:
    http://chiefio.wordpress.com/2011/12/10/liquid-co2-on-the-ocean-bottom/

    A team of scientists based in Japan and Germany has found an unusual “lake” of liquid carbon dioxide beneath the ocean floor.

    Shallow Lake

    Inagaki’s team found the lake while studying hydrothermal vents—undersea volcanic hot spots—in the East China Sea off the coast of Taiwan (map of Taiwan).

    The lake’s presence was unexpected, because the seamount lies only 4,600 feet (1400 meters) below sea level. At that depth, liquid CO2 is lighter than water and will slowly rise, eventually bubbling into the air as gas.

    There’s also a bit of video with a shellfish (shrimp) playing with a bubble of liquid CO2, so “acidification” from CO2 concentration clearly isn’t an issue for him…

    The simple fact is that asserting the CO2 rise is from fossil fuels is an assumption and a guess. It can’t be anything else, as the needed data are missing. Yes, we release the CO2. What happens to it after that is anybody’s guess, and subject to gigantic natural processes that completely swamp it in scale.

    Per the paper:

    It would be nice to see the same treatment of “tide raising forces”. They have a cycle that matches the temperature history nicely. Tides are not just a monthly cycle. Since the moon has longer orbital changes, there are 60 year cycles (sound familiar?) and 1800 year cycles and several others.

    Tides account for more than half of the vertical ocean mixing, so can easily account for moving cold water to the surface. That, then, can shift CO2 absorption and air temperatures and influence rainfall. As orbital resonance will keep lunar changes ‘in sync’ with planetary positions and solar motions (and potentially solar sunspot state if the match of sunspots to solar motion is valid) the lunar-tidal link can also explain some of the apparent solar correlation. They correlate, but via common orbital mechanics and tides.

    http://www.appinsys.com/GLobalWarming/SixtyYearCycle.htm

    cites: http://www.agu.org/pubs/crossref/2012/2012GL052885.shtml

    We find that there is a significant oscillation with a period around 60-years in the majority of the tide gauges examined during the 20th Century, and that it appears in every ocean basin.

    with this nice graph:
    http://www.appinsys.com/GLobalWarming/SixtyYearCycle_files/image002.jpg

    http://www.pnas.org/content/97/8/3814.full

    We propose that variations in the strength of oceanic tides cause periodic cooling of surface ocean water by modulating the intensity of vertical mixing that brings to the surface colder water from below. The tides provide more than half of the total power for vertical mixing, 3.5 terawatts (4), compared with about 2.0 terawatts from wind drag (3), making this hypothesis plausible. Moreover, the tidal mixing process is strongly nonlinear, so that vertical mixing caused by tidal forcing must vary in intensity interannually even though the annual rate of power generation is constant (3). As a consequence, periodicities in strong forcing, that we will now characterize by identifying the peak forcing events of sequences of strong tides, may so strongly modulate vertical mixing and sea-surface temperature as to explain cyclical cooling even on the millennial time-scale.

    I would also suggest that the degree of ‘mixing’ will influence CO2 absorption rates.

    More detail here:
    http://chiefio.wordpress.com/2013/01/04/lunar-cycles-more-than-one/

    The tidal cycles match temperature history on the 60 year, 1800 year and other periods as well. IMHO, it’s a strong contender and possible “smoking gun” for natural variability. At present we are at a dead bottom of mixing. Going forward, we ought to be getting much more. That ought to give cooling ocean surfaces, less CO2 out gassing and more absorbing, and a generally colder aspect to temperatures.

    Graph of mixing power:
    http://www.pnas.org/content/97/8/3814/F1.large.jpg

    Peak in 1974, trough in 1990s, peak in 1787 (LIA), trough in 1920-30. etc etc.

    So maybe CO2 and cold / hot cycle together because they come from the same ocean pot and are subjected to the same pot stirring…

  142. Joe says:

    richard telford says:
    January 4, 2013 at 7:31 am
    Since there is nothing wrong with what you wrote, and I don’t say anything about asymmetrical reliability, I can only assume that you misunderstood what I wrote.

    Beenstock et al. do not explore the Type II error rate of their method. Therefore, when they find no relationship, how sure can we be that there is no relationship, and not that the apparent absence of a relationship is because their method has little statistical power?

    —————————————————————

    That’s what i was trying to explain, Richard. There are two possible types of error when you’re testing for the presence of something.

    Your test might tell you that something exists when it doesn’t, or it might tell you it doesn’t exist when it does. Those are type 1 (a false positive result) and type 2 (a false negative result) respectively. The chance of each type of error will not usually be the same for any given test.

    If you’re testing for “the absence of something” (as they are in this paper) then “NO CORRELATION” is the “something” that you’re looking for. So a type 1 error would mean finding “NO CORRELATION” when there is one (i.e. falsely finding what you’re looking for). A type 2 error in this case would be finding “CORRELATION” when one doesn’t actually exist.

    So, in this case, finding “NO CORRELATION” is a POSITIVE result from the test whereas, had they found “CORRELATION”, that would have been a NEGATIVE test result. It takes a little getting your head round that “there’s nothing there” can be the “positive” result, but it’s only really a matter of perspective: if you’re looking for some solid ground to build on, finding a hole is a “negative” result; if you’re looking for holes to turn into swimming pools, then finding solid ground is a negative.

    Assuming for now that the method is appropriate to the data, that the data itself is reliable, and so on, the only error consideration then is “how likely was it to incorrectly give the POSITIVE result we obtained?” – or “what is the chance of a false positive?”.

    The chance of a false positive is given entirely by the type 1 error rate, so the type 2 rate is irrelevant.
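    The type 1 / type 2 distinction being argued over here is easy to make concrete with a small Monte Carlo experiment on an ordinary two-sided z-test. The sample size and effect size below are arbitrary illustrations of the general idea, nothing to do with the paper’s cointegration tests:

```python
import random

random.seed(0)

N, TRIALS = 50, 10000
# Two-sided 5% test of H0: true mean = 0, for N standard-normal observations.
CRIT = 1.96 / N ** 0.5   # reject H0 when |sample mean| exceeds this

def reject(mu):
    """Draw N observations with true mean mu and run the test once."""
    m = sum(random.gauss(mu, 1) for _ in range(N)) / N
    return abs(m) > CRIT

# Type 1 error: H0 is true (mu = 0) but the test rejects anyway.
type1 = sum(reject(0.0) for _ in range(TRIALS)) / TRIALS

# Type 2 error: H0 is false (mu = 0.2) but the test fails to reject.
type2 = sum(not reject(0.2) for _ in range(TRIALS)) / TRIALS

print(f"type 1 rate ~ {type1:.3f} (nominal 0.05), type 2 rate ~ {type2:.3f}")
```

    For this particular test the two rates come out wildly different (roughly 0.05 versus roughly 0.7), which is exactly the point above that the chance of each type of error “will not usually be the same for any given test”: controlling the false-positive rate says nothing by itself about the false-negative rate.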

  143. D Böehm says:

    Philip Shehan,

    Thank you for your chart, in which you have cherry-picked a short term trend artifact.

    As I have repeatedly explained, the longer the time span of the chart, the more accurate the long term rising trend. Here is a chart showing what you are doing.

    Rather than using a proper long term trend chart, you are cherry-picking a short time frame that supports your belief system. You may not even realize what you are doing, because your mind is already made up and closed air-tight. You believe that global temperatures are accelerating, so you cherry-pick a short time frame and say, “Aha!! Acceleration!”

    But it is not so. The planet is recovering from the LIA along the same long term trend line, and it does not matter whether CO2 is low or high. In other words, CO2 does not matter. It is irrelevant. Sorry about your ox.

  144. Bart says:

    DeWitt Payne says:
    January 4, 2013 at 3:17 pm

    “Also, since a lot of you believe that the global temperature hasn’t changed since 1998, how come the CO2 concentration is still going up?”

    I gave you the equation above, and explained how it can arise. It’s very simple, but apparently over your head.

  145. richardscourtney says:

    DeWitt Payne:

    I am acknowledging your post at January 4, 2013 at 3:17 pm so you know I have not ignored your twaddle.

    I suggested that you watch Salby’s lecture and check his facts. But your post says you have not. Instead you continue to adhere to your irrational prejudice.

    E.M.Smith gives detailed information at January 4, 2013 at 3:24 pm and he makes the only possibly valid conclusion when he says

    The simple fact is that asserting the CO2 rise is from fossil fuels is an assumption and a guess. It can’t be anything else, as the needed data are missing. Yes, we release the CO2. What happens to it after that is anybody’s guess, and subject to gigantic natural processes that completely swamp it in scale.

    And our paper assessed the entire carbon cycle not merely its parts which he mentions so I could add to his comments, but I see no reason to bother when you have repeatedly expressed your prejudice so clearly.

    Be content with your prejudice if it makes you happy, but I will continue to adhere to the empirical evidence and it rejects your irrational beliefs.

    Richard

  146. richardscourtney says:

    DeWitt Payne:

    My reply to your post at January 4, 2013 at 3:17 pm did not address one specific question you asked, although I had repeatedly answered it in posts whose contents you have ignored.

    In retrospect, my failure to answer your question could imply that I have avoided it. The question was

    Also, since a lot of you believe that the global temperature hasn’t changed since 1998, how come the CO2 concentration is still going up?

    My shortest answer to that was in my post at January 4, 2013 at 6:21 am addressed to Davidmhoffer.

    Please note that you would not have asked your question if you were willing to learn from this thread instead of asserting your irrational beliefs.

    Richard

  147. Joe says:
    January 4, 2013 at 3:25 pm

    Your test might tell you that something exists when it doesn’t, or it might tell you it doesn’t exist when it does. Those are type 1 (a false positive result) and type 2 (a false negative result) respectively. The chance of each type of error will not usually be the same for any given test.

    If you’re testing for “the absence of something” (as they are in this paper) then “NO CORRELATION” is the “something” that you’re looking for. So a type 1 error would mean finding “NO CORRELATION” when there is one (i.e. falsely finding what you’re looking for). A type 2 error in this case would be finding “CORRELATION” when one doesn’t actually exist.
    ————————–
    Please would you kindly give a reference in a statistical textbook for this.

  148. D Böehm says:

    Philip Shehan,

    As usual you are dissembling, by avoiding the clear points I made. The chart I posted is no different in principle from the one you posted under it. They both show that there has been no acceleration of global warming — the central issue. No acceleration. What is it about “no acceleration” that you can’t get your head around? Global warming has not accelerated, despite your desperate cherry-picked artifacts. In fact, global warming has stopped for the past decade and a half. You seem to be the only jamoke who can’t understand that plain fact.

    And as I have pointed out before, your S.S. chart has no provenance; it is an invented fabrication with no connection to reality. No doubt a John Cook cartoon.

    Once more for the thick-headed: there is no acceleration in the long term global warming recovery from the LIA.

    Sorry about your ox. I never liked him anyway.

  149. JP Miller says:

    Philip Shehan says:
    January 4, 2013 at 4:42 pm

    That data set also shows ~5 ~32 year half-cycles. ~1848-1880 warmer, ~1880-1912 cooler, ~1912-1944 warmer, ~1944-1976 cooler, ~1976-2008 warmer, ~2008-?? cooler. sarc/

    Anyone care to “fit” some other pattern….

  150. D Böehm says:

    JP Miller,

    You’ve got to understand something about Shehan: he feeds at the public trough, so he feels he needs to spout his alarmist propaganda. But it’s only pseudo-science.
