Misplaced Pages

Hockey stick graph (global temperature)

Article snapshot taken from Wikipedia, distributed under the Creative Commons Attribution-ShareAlike license.

A mean is a quantity representing the "center" of a collection of numbers, intermediate to the extreme values of the set. There are several kinds of means (or "measures of central tendency") in mathematics, especially in statistics. Each attempts to summarize or typify a given group of data, illustrating the magnitude and sign of the data set. Which of these measures is most illuminating depends on what is being measured, and on context and purpose.


Hockey stick graphs present the global or hemispheric mean temperature record of the past 500 to 2000 years as shown by quantitative climate reconstructions based on climate proxy records. These reconstructions have consistently shown a slow long-term cooling trend changing into relatively rapid warming in the 20th century, with the instrumental temperature record by 2000 exceeding earlier temperatures. The term hockey stick graph

A Medieval Warm Period from around 900 to 1300, followed by the Little Ice Age. This was the basis of a "schematic diagram" featured in the IPCC First Assessment Report of 1990, beside cautions that the medieval warming might not have been global. The use of indicators to get quantitative estimates of the temperature record of past centuries was developed, and by the late 1990s a number of competing teams of climatologists found indications that recent warming

A color wheel: there is no mean to the set of all colors. In these situations, you must decide which mean is most useful. You can do this by adjusting the values before averaging, or by using a specialized approach for the mean of circular quantities. The Fréchet mean gives a manner for determining the "center" of a mass distribution on a surface or, more generally, a Riemannian manifold. Unlike many other means,
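One standard specialized approach for circular quantities is to average the unit vectors corresponding to each angle and take the direction of the resulting vector. A minimal sketch in Python (the function name is illustrative):

```python
import math

def circular_mean(angles_deg):
    """Mean of circular quantities: average the corresponding unit
    vectors, then take the angle of the resulting vector."""
    s = sum(math.sin(math.radians(a)) for a in angles_deg)
    c = sum(math.cos(math.radians(a)) for a in angles_deg)
    return math.degrees(math.atan2(s, c)) % 360

# The naive arithmetic mean of 350 deg and 10 deg is 180 deg, pointing
# the opposite way; the circular mean is (numerically close to) 0 deg.
print(circular_mean([350, 10]))
```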

A principal component analysis step to summarise these proxy networks, but from 2001 Mann stopped using this method and introduced a multivariate Climate Field Reconstruction (CFR) technique based on the RegEM method, which did not require this PCA step. In May 2002 Mann and Scott Rutherford published a paper on testing methods of climate reconstruction which discussed this technique. By adding artificial noise to actual temperature records or to model simulations they produced synthetic datasets which they called "pseudoproxies". When
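The pseudoproxy idea can be illustrated in a few lines of Python; the series and noise level below are invented stand-ins, not the actual data or settings of Mann and Rutherford:

```python
import random

random.seed(42)

# Stand-in "true" temperature anomaly series, e.g. from a model simulation.
true_temps = [0.01 * year for year in range(100)]

# A pseudoproxy: the true signal degraded with white noise, mimicking an
# imperfect proxy record (the noise amplitude here is arbitrary).
pseudoproxy = [t + random.gauss(0, 0.25) for t in true_temps]

# Reconstruction methods can then be tested against the known "truth".
print(len(pseudoproxy))
```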

A random variable having that distribution. If the random variable is denoted by X, then the mean is also known as the expected value of X, denoted E(X). For a discrete probability distribution, the mean is given by ∑ x P(x), where
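For instance, the expected value of a fair six-sided die follows directly from that sum; a minimal sketch:

```python
# Expected value of a discrete distribution: sum of value times probability.
def expected_value(dist):
    return sum(x * p for x, p in dist.items())

# A fair six-sided die assigns probability 1/6 to each face.
die = {x: 1 / 6 for x in range(1, 7)}
print(expected_value(die))  # ≈ 3.5
```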

A truncated mean. It involves discarding given parts of the data at the top or the bottom end, typically an equal amount at each end, and then taking the arithmetic mean of the remaining data. The number of values removed is indicated as a percentage of the total number of values. The interquartile mean is a specific example of a truncated mean. It is simply the arithmetic mean after removing
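A short sketch of both ideas (the helper name is illustrative): discarding 25% of the sorted values at each end gives the interquartile mean, which is unaffected by the extreme outlier in the sample.

```python
def truncated_mean(values, proportion):
    """Arithmetic mean after discarding `proportion` of the sorted
    values at each end (0.25 gives the interquartile mean)."""
    xs = sorted(values)
    k = int(len(xs) * proportion)
    kept = xs[k:len(xs) - k]
    return sum(kept) / len(kept)

data = [1, 3, 4, 5, 6, 7, 9, 100]  # 100 is an extreme outlier
print(truncated_mean(data, 0.25))  # → 5.5, the mean of [4, 5, 6, 7]
```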

A 1991 study and a 1991 book showing methodology and examples of how to produce maps showing climate developments in North America over time. These methods had been used for regional reconstructions of temperatures, and of other aspects such as rainfall. As part of his PhD research, Michael E. Mann worked with seismologist Jeffrey Park on developing statistical techniques to find long-term oscillations of natural variability in

A climate model to generate a series of annual temperature maps for the world over the past several centuries. They then added white noise to the proxy data and applied the methods used in MBH98, a variation of principal component analysis, to the computed temperature maps, and found that the amount of variation was considerably reduced. In April 2006, Science published a comment authored by Wahl and collaborators, asserting errors in

A co-author on the paper, had confirmed agreement with these points, and a later paper by Cook, Esper and D'Arrigo reconsidered the earlier paper's conclusions along these lines. Lonnie Thompson published a paper on "Tropical Glacier and Ice Core Evidence of Climate Change" in January 2003, featuring Figure 7 showing graphs based on ice cores closely resembling a graph based on the MBH99 reconstruction, combined with thermometer readings from Jones et al. 1999. In March 2001 Tapio Schneider published his regularized expectation–maximization (RegEM) technique for analysis of incomplete climate data. The original MBH98 and MBH99 papers avoided undue representation of large numbers of tree ring proxies by using

A comparison with the thermometer record to check that recent proxy data were valid. Jones thought the study would provide important comparisons with the findings of climate modeling, which showed a "pretty reasonable" fit to proxy evidence. A commentary on MBH98 by Jones was published in Science on 24 April 1998. He noted that it used almost all the available long-term proxy climate series, "and if

A decadal basis rather than showing individual years, and produced a single time series, so did not show a spatial pattern of relative temperatures for different regions. The IPCC Second Assessment Report (SAR) of 1996 featured Figure 3.20 showing the Bradley & Jones 1993 decadal summer temperature reconstruction for the northern hemisphere, overlaid with a 50-year smoothed curve and with


A focus of dispute for those opposed to the strengthening scientific consensus that late 20th century warmth was exceptional. In 2003, as lobbying over the 1997 Kyoto Protocol intensified, a paper claiming greater medieval warmth was quickly dismissed by scientists in the Soon and Baliunas controversy. Later in 2003, Stephen McIntyre and Ross McKitrick published McIntyre & McKitrick 2003b disputing

A function f(x). Intuitively, the mean of a function can be thought of as calculating the area under a section of a curve and then dividing by the length of that section. This can be done crudely by counting squares on graph paper, or more precisely by integration. The integration formula is f̄ = 1/(b − a) · ∫ f(x) dx, with the integral taken over the interval [a, b]. In this case, care must be taken to make sure that
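Numerically, that integral can be approximated on a grid; a small sketch using the midpoint rule (the function name is illustrative):

```python
def mean_of_function(f, a, b, n=100_000):
    """Approximate (1/(b - a)) * integral of f over [a, b]
    using the midpoint rule with n subintervals."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) / n

# The mean value of f(x) = x^2 on [0, 1] is 1/3.
print(mean_of_function(lambda x: x * x, 0, 1))  # ≈ 0.333333
```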

A future ice age, but after 1976 he supported the emerging view that greenhouse gas emissions caused by humanity would cause detectable global warming "by about A.D. 2000". The first quantitative reconstruction of Northern Hemisphere (NH) annual mean temperatures was published in 1979 by Brian Groveman and Helmut Landsberg. They used "a short-cut method" based on their earlier paper, which showed that 9 instrumental stations could adequately represent an extensive gridded instrumental series, and reconstructed temperatures from 1579 to 1880 on

A high-resolution global reconstruction. To relate this data to measured temperatures, they used principal component analysis (PCA) to find the leading patterns, or principal components, of instrumental temperature records during the calibration period from 1902 to 1980. Their method was based on separate multiple regressions between each proxy record (or summary) and all of the leading principal components of
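The PCA step can be sketched with random stand-in data. This is a generic illustration of extracting the leading principal component of a gridded field by power iteration on its covariance matrix, not MBH98's actual code, data, or grid:

```python
import random

random.seed(0)

# Stand-in gridded instrumental record: 79 years (1902-1980) x 10 grid cells.
years, cells = 79, 10
field = [[random.gauss(0, 1) for _ in range(cells)] for _ in range(years)]

# Remove the time mean from each grid cell (anomalies).
means = [sum(row[j] for row in field) / years for j in range(cells)]
anoms = [[row[j] - means[j] for j in range(cells)] for row in field]

# Covariance matrix between grid cells.
cov = [[sum(a[i] * a[j] for a in anoms) / (years - 1)
        for j in range(cells)] for i in range(cells)]

# Leading spatial pattern via power iteration; the leading principal
# component is the projection of each year's anomalies onto that pattern.
v = [1.0] * cells
for _ in range(200):
    w = [sum(cov[i][j] * v[j] for j in range(cells)) for i in range(cells)]
    norm = sum(x * x for x in w) ** 0.5
    v = [x / norm for x in w]

pc1 = [sum(a[j] * v[j] for j in range(cells)) for a in anoms]
print(len(pc1))  # one value per year
```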

A sample x1, x2, …, xn, usually denoted by x̄, is the sum of the sampled values divided by the number of items in the sample. For example, the arithmetic mean of the five values 4, 36, 45, 50, 75 is (4 + 36 + 45 + 50 + 75)/5 = 42. The geometric mean
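The calculation is one line in Python:

```python
values = [4, 36, 45, 50, 75]

# Arithmetic mean: sum of the values divided by their count.
arithmetic_mean = sum(values) / len(values)
print(arithmetic_mean)  # → 42.0
```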

A separate curve plotting instrumental thermometer data from the 1850s onwards. It stated that in this record, warming since the late 19th century was unprecedented. The section proposed that "The data from the last 1000 years are the most useful for determining the scales of natural climate variability". Recent studies, including the 1994 reconstruction by Hughes and Diaz, questioned how widespread

A simplified figure for the cover of the short annual World Meteorological Organization report, which lacks the status of the more important IPCC reports. Two fifty-year smoothed curves going back to 1000 were shown, from MBH99 and Jones et al. (1998), with a third curve to 1400 from Briffa's new paper, combined with modern temperature data bringing the lines up to 1999: in 2010 the lack of clarity about this change of data

A study of 1,000 years of tree ring data from Tasmania which, like similar studies, did not allow for possible overestimation of warming due to increased CO₂ levels having a fertilisation effect on tree growth. It noted the suggestion of Bradley et al. 1991 that instrumental records in specific areas could be combined with paleoclimate data for increased detail back to the 18th century. Archives of climate proxies were developed: in 1993 Raymond S. Bradley and Phil Jones composited historical records, tree-rings and ice cores for

A temperature drop of almost 0.5 °C during the Little Ice Age, and increased solar output might explain the rise in early 20th century temperatures. A reconstruction of Arctic temperatures over four centuries by Overpeck et al. 1997 reached similar conclusions, but both these studies came up against the limitations of the climate reconstructions at that time, which only resolved temperature fluctuations on

Hans von Storch is a German climate scientist. He is a professor at the Meteorological Institute of the University of Hamburg and (since 2001) Director of the Institute for Coastal Research at the Helmholtz Research Centre (previously: GKSS Research Center) in Geesthacht, Germany. He is a member of the advisory board of the Journal of Climate. He worked at the Max Planck Institute for Meteorology from 1986 to 1995 and headed


Is a certain amount of skill. We can actually say something, although there are large uncertainties." In considering the 1998 Jones et al. reconstruction, which went back a thousand years, Mann, Bradley and Hughes reviewed their own research and reexamined 24 proxy records which extended back before 1400. Mann carried out a series of statistical sensitivity tests, removing each proxy in turn to see

Is an average that is useful for sets of positive numbers that are interpreted according to their product (as is the case with rates of growth) and not their sum (as is the case with the arithmetic mean). For example, the geometric mean of the five values 4, 36, 45, 50, 75 is the fifth root of 4 × 36 × 45 × 50 × 75, which equals 30. The harmonic mean is an average which is useful for sets of numbers which are defined in relation to some unit, as in
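Both means can be computed directly for the same five values; a brief sketch:

```python
import math

values = [4, 36, 45, 50, 75]

# Geometric mean: n-th root of the product (positive numbers only).
geometric_mean = math.prod(values) ** (1 / len(values))
print(geometric_mean)  # ≈ 30.0

# Harmonic mean: count divided by the sum of reciprocals.
harmonic_mean = len(values) / sum(1 / v for v in values)
print(harmonic_mean)  # ≈ 15.0
```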

Is below and half is above. The mode income is the most likely income and favors the larger number of people with lower incomes. While the median and mode are often more intuitive measures for such skewed data, many skewed distributions are in fact best described by their mean, including the exponential and Poisson distributions. The mean of a probability distribution is the long-run arithmetic average value of

Is quite a bit of work to be done in reducing these uncertainties." Climatologist Tom Wigley welcomed the progress made in the study, but doubted if proxy data could ever be wholly convincing in detecting the human contribution to changing climate. Phil Jones of the UEA Climatic Research Unit told the New York Times he was doubtful about adding the 150-year thermometer record to extend the proxy reconstruction, comparing this with putting together apples and oranges; Mann et al. said they used

Is the probability density function. In all cases, including those in which the distribution is neither discrete nor continuous, the mean is the Lebesgue integral of the random variable with respect to its probability measure. The mean need not exist or be finite; for some probability distributions the mean is infinite (+∞ or −∞), while for others the mean is undefined. The generalized mean, also known as
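The generalized mean (power mean) with exponent p unifies the classical means: p = 1 gives the arithmetic mean, p = −1 the harmonic mean, and the limit p → 0 the geometric mean. A small sketch:

```python
import math

def power_mean(values, p):
    """Generalized (power) mean with exponent p; the p = 0 case is the
    limiting geometric mean, computed via logarithms."""
    if p == 0:
        return math.exp(sum(math.log(v) for v in values) / len(values))
    return (sum(v ** p for v in values) / len(values)) ** (1 / p)

values = [4, 36, 45, 50, 75]
print(power_mean(values, 1))   # arithmetic mean, 42.0
print(power_mean(values, 0))   # geometric mean, ≈ 30.0
print(power_mean(values, -1))  # harmonic mean, ≈ 15.0
```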

Is the arithmetic average of the values; however, for skewed distributions, the mean is not necessarily the same as the middle value (median), or the most likely value (mode). For example, mean income is typically skewed upwards by a small number of people with very large incomes, so that the majority have an income lower than the mean. By contrast, the median income is the level at which half the population
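The income example can be made concrete with Python's statistics module; the numbers below are invented purely for illustration:

```python
import statistics

# A small, skewed "income" sample: most values modest, one very large.
incomes = [20, 25, 25, 30, 35, 40, 500]

print(statistics.mean(incomes))    # pulled far above most values
print(statistics.median(incomes))  # 30: half earn less, half earn more
print(statistics.mode(incomes))    # 25: the most common value
```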

Is the sum of the values divided by the number of values. The arithmetic mean of a set of numbers x1, x2, ..., xn is typically denoted using an overhead bar, x̄. If the numbers are from observing a sample of a larger group, the arithmetic mean is termed the sample mean (x̄) to distinguish it from

The IPCC Third Assessment Report (TAR) included a subsection on multi-proxy synthesis of recent temperature change. This noted five earlier large-scale palaeoclimate reconstructions, then discussed the Mann, Bradley & Hughes 1998 reconstruction going back to 1400 AD and its extension back to 1000 AD in Mann, Bradley & Hughes 1999 (MBH99), while emphasising the substantial uncertainties in

The National Oceanic and Atmospheric Administration's Geophysical Fluid Dynamics Laboratory, Jerry Mahlman nicknamed the graph the "hockey stick", with the slow cooling trend the "stick" and the anomalous 20th century warming the "blade". Briffa and Tim Osborn critically examined MBH99 in a May 1999 detailed study of the uncertainties of various proxies. They raised questions later adopted by critics of Mann's work, including

The Synthesis Report - Questions. The Working Group 1 scientific basis report was agreed unanimously by all member government representatives in January 2001 at a meeting held in Shanghai, China. A large poster of the IPCC illustration based on the MBH99 graph formed the backdrop when Sir John T. Houghton, as co-chair of the working group, presented the report in an announcement shown on television, leading to wide publicity. The Huang, Pollack & Shen 2000 borehole temperature reconstruction covering


The arithmetic mean (AM), the geometric mean (GM), and the harmonic mean (HM). These means were studied with proportions by Pythagoreans and later generations of Greek mathematicians because of their importance in geometry and music. The arithmetic mean (or simply mean or average) of a list of numbers is the sum of all of the numbers divided by their count. Similarly, the mean of

The group mean (or expected value) of the underlying distribution, denoted μ or μx. Outside probability and statistics, a wide range of other notions of mean are often used in geometry and mathematical analysis; examples are given below. In mathematics, the three classical Pythagorean means are

The instrumental temperature record of global surface temperatures over the last 140 years; Mann & Park 1993 showed patterns relating to the El Niño–Southern Oscillation, and Mann & Park 1994 found what was later termed the Atlantic multidecadal oscillation. They then teamed up with Raymond S. Bradley to use these techniques on the dataset from his Bradley & Jones 1993 study, with

The polar regions and the tropics, they used principal component analysis (PCA) to produce PC summaries representing these large datasets, and then treated each summary as a proxy record in their CFR analysis. Networks represented in this way included the North American tree ring network (NOAMER) and Eurasia. The primary aim of CFR methods was to provide the spatially resolved reconstructions essential for coherent geophysical understanding of how parts of

The "Composite Plus Scaling" (CPS) method, which was subsequently used by most large-scale climate reconstructions of hemispheric or global average temperatures. In this method, selected climate proxy records were standardized before being averaged (composited), and then centred and scaled to provide a quantitative estimate of the target temperature series for
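A toy sketch of the composite-plus-scale recipe described above: standardize each proxy, average year by year, then centre and scale the composite to the instrumental target. The data and helper name are invented for illustration, not any published reconstruction's actual procedure:

```python
import statistics

def composite_plus_scale(proxies, target):
    """Standardize each proxy series, average them year by year, then
    centre and scale the composite to match the target's mean and
    standard deviation over the overlap period."""
    standardized = [
        [(x - statistics.mean(p)) / statistics.stdev(p) for x in p]
        for p in proxies
    ]
    composite = [statistics.mean(year) for year in zip(*standardized)]
    c_mean, c_sd = statistics.mean(composite), statistics.stdev(composite)
    t_mean, t_sd = statistics.mean(target), statistics.stdev(target)
    return [t_mean + (c - c_mean) * (t_sd / c_sd) for c in composite]

# Two invented proxy records and an invented instrumental target.
proxies = [[0.1, 0.2, 0.4, 0.7], [1.0, 1.1, 1.3, 1.8]]
target = [13.9, 14.0, 14.2, 14.5]
reconstruction = composite_plus_scale(proxies, target)
```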

The "Medieval Climatic Optimum" from the "late tenth to early thirteenth centuries (about AD 950-1250)", followed by a cooler period of the Little Ice Age which ended only in the middle to late nineteenth century. The report discussed the difficulties with proxy data, "mainly pollen remains, lake varves and ocean sediments, insect and animal remains, glacier termini", but considered tree ring data

The 1950s) down to a cooler Little Ice Age before rising sharply in the 20th century. Thermometer data shown with a dotted line overlapped the reconstruction for a calibration period from 1902 to 1980, then continued sharply up to 1998. A shaded area showed uncertainties to two standard error limits, in medieval times rising almost as high as recent temperatures. When Mann gave a talk about the study to

The 1960s that accurate use of tree rings as climate proxies for reconstructions was pioneered by Harold C. Fritts. In 1965 Hubert Lamb, a pioneer of historical climatology, generalised from temperature records of central England by using historical, botanical and archeological evidence to popularise the idea of a Medieval Warm Period from around 900 to 1300, followed by a cold epoch culminating between 1550 and 1700. In 1972 he became

The 1988 formation of the Intergovernmental Panel on Climate Change to produce reports subject to detailed approval by government delegates. The IPCC First Assessment Report in 1990 noted evidence that the Holocene climatic optimum around 5,000-6,000 years ago had been warmer than the present (at least in summer) and that in some areas there had been exceptional warmth during "a shorter Medieval Warm Period (which may not have been global)",

The 1990s was the warmest decade and 1998 the warmest year" in the past 1,000 years. Versions of these graphs also featured less prominently in the short Synthesis Report Summary for Policymakers, which included a sentence stating that "The increase in surface temperature over the 20th century for the Northern Hemisphere is likely to have been greater than that for any other century in the last thousand years", and


The 19th century physicists John Tyndall and Svante Arrhenius, who found the greenhouse gas effect of carbon dioxide (CO₂) in the atmosphere to explain how past ice ages had ended. From 1919 to 1923, Alfred Wegener did pioneering work on reconstructing the climate of past eras in collaboration with Milutin Milanković, publishing Die Klimate der geologischen Vorzeit ("The Climates of

The 2004 paper, stating that "their conclusion was based on incorrect implementation of the reconstruction procedure", together with a reply from von Storch and colleagues disputing this. In the reply, von Storch and his team maintained that the caveats raised in the Wahl comment did not invalidate their original conclusion. The inadequacy of the MBH98 methodology for climate reconstructions was later independently confirmed in other publications, for instance by Lee, Zwiers and Tsao, 2008 and by Christiansen et al., 2009. In 2010, Storch received

The 20th century. After this, around a decade elapsed before Gordon Jacoby and Rosanne D'Arrigo produced the next quantitative NH reconstruction, published in 1989. This was the first based entirely on non-instrumental records, and used tree rings. They reconstructed northern hemisphere annual temperatures since 1671 on the basis of boreal North American tree ring data from 11 distinct regions. From this, they concluded that recent warming

The Asian monsoon region, the El Niño–Southern Oscillation region and the Atlantic region. Areas where more data was needed were to be identified, and there was a need for improved data exchange, with computer-based archiving and translation to give researchers access to worldwide paleoclimate information. The IPCC supplementary report, 1992, reviewed progress on various proxies. These included

The Fréchet mean is defined on a space whose elements cannot necessarily be added together or multiplied by scalars. It is sometimes also known as the Karcher mean (named after Hermann Karcher). In geometry, there are thousands of different definitions for the center of a triangle that can all be interpreted as the mean of a triangular set of points in the plane. This is an approximation to

The Geological Past"), together with Wladimir Köppen, in 1924. In the 1930s Guy Stewart Callendar compiled temperature records to look for changes. Wilmot H. Bradley showed that annual varves in lake beds showed climate cycles, and A. E. Douglass found that tree rings could track past climatic changes, but these were thought to only show random variations in the local region. It was only in

The German Waldsterben (forest dieback) hype of the 1980s. On 20 June 2013 Storch stated: "So far, no one has been able to provide a compelling answer to why climate change seems to be taking a break. We're facing a puzzle. Recent CO₂ emissions have actually risen even more steeply than we feared. As a result, according to most climate models, we should have seen temperatures rise by around 0.25 degrees Celsius (0.45 degrees Fahrenheit) over

The IMSC achievement award at the International Meetings on Statistical Climatology in Edinburgh, to "recognize his key contributions to statistical downscaling, reconstruction of temperature series, analyses of climatic variability, and detection and attribution of climate change". In 1977, Hans von Storch co-founded a 100-member Donald Duck Club, defending Donald Duck against accusations of indecent behavior. Between 1976 and 1985 he

The MBH98 methodology to extend their study back to 1000. A version of the MBH99 graph was featured prominently in the 2001 IPCC Third Assessment Report (TAR), which also drew on Jones et al. 1998 and three other reconstructions to support the conclusion that, in the Northern Hemisphere, the 1990s was likely to have been the warmest decade and 1998 the warmest year during the past 1,000 years. The graph became

The McIntyre and McKitrick methodology. Political disputes led to the formation of a panel of scientists convened by the United States National Research Council, whose North Report in 2006 supported Mann's findings with some qualifications, agreeing that there were some statistical failings but that these had little effect on the result. More than two dozen reconstructions, using various statistical methods and combinations of proxy records, support


The Medieval Warm Period had been at any one time, thus it was not possible "to conclude that global temperatures in the Medieval Warm Period were comparable to the warm decades of the late 20th century." The SAR concluded, "it appears that the 20th century has been at least as warm as any century since at least 1400 AD. In at least some areas, the recent period appears to be warmer than has been

The Northern Hemisphere from 1400 up to the 1970s to produce a decadal reconstruction. Like later reconstructions, including the MBH "hockey stick" studies, the Bradley & Jones 1993 reconstruction indicated a slow cooling trend followed by an exceptional temperature rise in the 20th century. Their study also used the modern instrumental temperature record to evaluate how well the regions covered by proxies represented

The Statistical Analysis and Modelling research group there. Storch said in testimony to the U.S. House of Representatives in 2006 that anthropogenic climate change exists. He is also known for an article in Der Spiegel he co-wrote with Nico Stehr. In December 2009, he expressed concern about the credibility of science and criticized some publicly visible scientists for simplifying and dramatizing their communications. He pointed to

The University of East Anglia (UEA) had "violated a fundamental principle of science" by refusing to share data with other researchers. "They play science as a power game," he said. In 2003, with effect from 1 August, Hans von Storch was appointed as editor-in-chief of the journal Climate Research, after having been on its editorial board since 1994. A few months before, a controversial article (Soon and Baliunas 2003) had raised questions about

The aim of finding long-term oscillations of natural variability in global climate. The resulting reconstruction went back to 1400, and was published in November as Mann, Park & Bradley 1995. They were able to detect that the multiple proxies were varying in a coherent oscillatory way, indicating both the multidecadal pattern in the North Atlantic and a longer-term oscillation of roughly 250 years in

The aspect that got the most attention. Their original draft ended in 1980, as most reconstructions only went that far, but an anonymous peer reviewer of the paper suggested that the curve of instrumental temperature records should be shown up to the present to include the considerable warming that had taken place between 1980 and 1998. The Mann, Bradley & Hughes 1998 (MBH98) multiproxy study on "Global-scale temperature patterns and climate forcing over

The back of an envelope", a "rather dodgy bit of hand-waving". In Bradley 1991, a working group of climatologists including Raymond S. Bradley, Malcolm K. Hughes, Jean Jouzel, Wibjörn Karlén, Jonathan Overpeck and Tom Wigley proposed a project to improve understanding of natural climatic variations over the last two thousand years, so that their effect could be allowed for when evaluating human contributions to climate change. Climate proxy temperature data

The basis of their compilation of 20 time-series. These records were largely instrumental but also included some proxy records, including two tree-ring series. Their method used nested multiple regression to allow for records covering different periods, and produced measures of uncertainty. The reconstruction showed a cool period extending beyond the Maunder Minimum, and warmer temperatures in

The borehole reconstruction published by Pollack, Huang and Shen gave independent support to the conclusion that 20th century warmth was exceptional for the past 500 years. Jones, Keith Briffa, Tim P. Barnett and Simon Tett had independently produced a "Composite Plus Scale" (CPS) reconstruction extending back for a thousand years, comparing tree ring, coral layer, and glacial proxy records, but not specifically estimating uncertainties. Jones et al. 1998

The broad consensus shown in the original 1998 hockey-stick graph, with variations in how flat the pre-20th century "shaft" appears. The 2007 IPCC Fourth Assessment Report cited 14 reconstructions, 10 of which covered 1,000 years or longer, to support its strengthened conclusion that it was likely that Northern Hemisphere temperatures during the 20th century were the highest in at least the past 1,300 years. Further reconstructions, including Mann et al. 2008 and PAGES 2k Consortium 2013, have supported these general conclusions. Paleoclimatology influenced


The case for a thousand or more years". Tim Barnett of the Scripps Institution of Oceanography was working towards the next IPCC assessment with Phil Jones, and in 1996 told journalist Fred Pearce, "What we hope is that the current patterns of temperature change prove distinctive, quite different from the patterns of natural variability in the past". A divergence problem affecting some tree ring proxies after 1960 had been identified in Alaska by Taubes 1995 and Jacoby & d'Arrigo 1995. Tree ring specialist Keith Briffa's February 1998 study showed that this problem

The case of speed (i.e., distance per unit of time). For example, the harmonic mean of the five values 4, 36, 45, 50, 75 is 15. If we have five pumps that can empty a tank of a certain size in respectively 4, 36, 45, 50, and 75 minutes, then the harmonic mean of 15 minutes tells us that these five different pumps working together will pump at the same rate as five pumps that can each empty

The claim that all of the warming took place between 1920 and 1935, before increased human greenhouse gas emissions. The George C. Marshall Institute alleged that MBH98 was deceptive in only going back to 1400, and so not covering the Medieval Warm Period, which predated industrial greenhouse gas emissions. The same criticisms were made by Willie Soon and Sallie Baliunas. In October 1998

The climate of the region or hemisphere over time. This method was implemented in various ways, including different selection processes for the proxy records, and averaging could be unweighted, or could be weighted in relation to an assessment of reliability or of area represented. There were also different ways of finding the scaling coefficient used to scale the proxy records to the instrumental temperature record. John A. Eddy had earlier tried to relate

The climate system varied and responded to radiative forcing, so hemispheric averages were a secondary product. The CFR method could also be used to reconstruct Northern Hemisphere mean temperatures, and the results closely resembled the earlier CPS reconstructions, including Bradley & Jones 1993. Mann describes this as the least scientifically interesting thing they could do with the rich spatial patterns, but also

The conclusion that peak Medieval warmth only occurred during two or three short periods of 20 to 30 years, with temperatures around 1950s levels, refuting claims that 20th century warming was not unusual. An analysis by Crowley published in July 2000 compared simulations from an energy balance climate model with reconstructed mean annual temperatures from MBH99 and Crowley & Lowery (2000). While earlier reconstructed temperature variations were consistent with volcanic and solar irradiation changes plus residual variability, very large 20th-century warming closely agreed with

The controversy, the publisher of Climate Research upgraded Hans von Storch from editor to editor-in-chief, but von Storch decided that the Soon and Baliunas paper was seriously flawed and should not have been published as it was. He proposed a new editorial system, and though the publisher of Climate Research agreed that the paper should not have been published uncorrected, he rejected von Storch's proposals to improve

The data used in the MBH98 paper. In 2004 Hans von Storch published criticism of the statistical techniques as tending to underplay variations in earlier parts of the graph, though this was disputed and he later accepted that the effect was very small. In 2005 McIntyre and McKitrick published criticisms of the principal components analysis methodology as used in MBH98 and MBH99. Their analysis was subsequently disputed by published papers, including Huybers 2005 and Wahl & Ammann 2007, which pointed to errors in

The debate, and Broecker's criticism that MBH99 did not show a clear MWP. They concluded that the MWP was likely to have been widespread in the extratropical northern hemisphere, and seemed to have approached late 20th century temperatures at times. In an interview, Mann said the study did not contradict MBH as it dealt only with extratropical land areas, and stopped before the late 20th century. He reported that Edward R. Cook,

The dominant climate forcing during the 20th century. In a review in the same issue, Gabriele C. Hegerl described their method as "quite original and promising", which could help to verify model estimates of natural climate fluctuations and was "an important step towards reconstructing space–time records of historical temperature patterns". Release of the paper on 22 April 1998 was given exceptional media coverage, including questioning as to whether it proved that human influences were responsible for climate change. Mann would only agree that it

The earlier period. The MBH99 conclusion that the 1990s were likely to have been the warmest decade, and 1998 the warmest year, of the past millennium in the Northern Hemisphere, with "likely" defined as "66–90% chance", was supported by reconstructions by Crowley & Lowery 2000 and by Jones et al. 1998 using different data and methods. The Pollack, Huang & Shen 1998 reconstruction covering

The editorial process, and von Storch with three other board members resigned. Senator James M. Inhofe stated his belief that "manmade global warming is the greatest hoax ever perpetrated on the American people", and a hearing of the United States Senate Committee on Environment and Public Works which he convened on 29 July 2003 heard the news of the resignations.

Mean

The arithmetic mean, also known as "arithmetic average",

The effect its removal had on the result. He found that certain proxies were critical to the reliability of the reconstruction, particularly one tree ring dataset collected by Gordon Jacoby and Rosanne D'Arrigo in a part of North America Bradley's earlier research had identified as a key region. This dataset only extended back to 1400, and though another proxy dataset from the same region (in the International Tree-Ring Data Bank) went further back and should have given reliable proxies for earlier periods, validation tests only supported their reconstruction after 1400. To find out why, Mann compared

The first eigenvector-based climate field reconstruction (CFR). This showed global patterns of annual surface temperature, and included a graph of average hemispheric temperatures back to 1400 with shading emphasising that uncertainties (to two standard error limits) were much greater in earlier centuries. Jones et al. 1998 independently produced a CPS reconstruction extending back for a thousand years, and Mann, Bradley & Hughes 1999 (MBH99) used

The founding director of the Climatic Research Unit (CRU) at the University of East Anglia (UEA), which aimed to improve knowledge of climate history in both the recent and far distant past, monitor current changes in global climate, identify processes causing changes at different timescales, and review the possibility of advising about future trends in climate. During the cold years of the 1960s, Lamb had anticipated that natural cycles were likely to lead over thousands of years to

The globe, including a rich resource of tree ring networks for some areas and sparser proxies such as lake sediments, ice cores and corals, as well as some historical records. Their global reconstruction was a major breakthrough in evaluation of past climate dynamics, and the first eigenvector-based climate field reconstruction (CFR) incorporating multiple climate proxy data sets of different types and lengths into

The important step of validation calculations, which showed that the reconstructions were statistically meaningful, or skillful. A balance was required over the whole globe, but most of the proxy data came from tree rings in the Northern mid latitudes, largely in dense proxy networks. Since using all of the large number of tree ring records would have overwhelmed the sparse proxies from

The instrumental record from the proxy evidence and emphasising the increasing range of possible error in earlier times, which MBH said would "preclude, as yet, any definitive conclusions" about climate before 1400. The reconstruction found significant variability around a long-term cooling trend of −0.02 °C per century, as expected from orbital forcing, interrupted in the 20th century by rapid warming which stood out from

The instrumental record. The least squares simultaneous solution of these multiple regressions used covariance between the proxy records. The results were then used to reconstruct large-scale patterns over time in the spatial field of interest (defined as the empirical orthogonal functions, or EOFs) using both local relationships of the proxies to climate and distant climate teleconnections. Temperature records for almost 50 years prior to 1902 were analysed using PCA for
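The eigenvector (EOF) decomposition described here can be illustrated with a small synthetic example; the data, seed, and dimensions below are invented for the sketch and are not MBH98's actual inputs or code:

```python
import numpy as np

# Synthetic (time x location) temperature anomaly field: a shared signal
# seen at every site with site-specific weights, plus local noise.
rng = np.random.default_rng(0)
n_years, n_sites = 100, 20
signal = rng.standard_normal(n_years)
field = np.outer(signal, rng.uniform(0.5, 1.5, size=n_sites))
field += 0.3 * rng.standard_normal((n_years, n_sites))

# EOF analysis: remove per-site means, then factor the field with SVD.
# Rows of Vt are the spatial patterns (EOFs); s**2 gives each pattern's
# share of the total variance.
anomalies = field - field.mean(axis=0)
U, s, Vt = np.linalg.svd(anomalies, full_matrices=False)
explained = s**2 / np.sum(s**2)
leading_eof = Vt[0]
print(f"leading EOF explains {explained[0]:.0%} of the variance")
```

Because the synthetic field is dominated by one shared signal, the leading EOF captures most of the variance; real proxy networks split variance across many more patterns.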

The integral converges. But the mean may be finite even if the function itself tends to infinity at some points. Angles, times of day, and other cyclical quantities require modular arithmetic to add and otherwise combine numbers. In all these situations, there will not be a unique mean. For example, the times an hour before and after midnight are equidistant to both midnight and noon. It is also possible that no mean exists. Consider
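The midnight example can be made concrete with a circular (angular) mean, which maps times onto a circle before averaging; this is a minimal sketch, not a library API:

```python
import math

def circular_mean_hours(hours):
    """Mean time of day on a 24-hour circle, returned in [0, 24)."""
    angles = [h * 2 * math.pi / 24 for h in hours]
    s = sum(math.sin(a) for a in angles)
    c = sum(math.cos(a) for a in angles)
    return (math.atan2(s, c) * 24 / (2 * math.pi)) % 24

# 23:00 and 01:00 average to midnight on the circle, whereas the naive
# arithmetic mean (23 + 1) / 2 = 12 would wrongly give noon.
print(circular_mean_hours([23, 1]))
```

Note that for points spread evenly around the circle the sine and cosine sums both vanish, which is the "no mean exists" case mentioned above.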

The journal Science which tested multiproxy methods such as those used by Mann, Bradley, and Hughes 1998, often called MBH98, or Mann and Jones, to obtain the global temperature variations in the past 1000 years. The test suggested that the method used in MBH98 would inherently underestimate large variations had they occurred; but this was subsequently challenged: see hockey stick graph for more detail. To reach this conclusion, Storch et al. used

The journal's decentralised review process, with no editor-in-chief, and about the editorial policy of one editor, Chris de Freitas. Storch drafted and circulated an editorial on the new regime, reserving the right as editor-in-chief to reject articles proposed for acceptance by one of the editors. Following the publisher's refusal to publish the editorial unless all editors serving on the board endorsed

The last 1,000 years. In the Soon and Baliunas controversy, two scientists cited in the papers said that their work was misrepresented, and the Climate Research paper was criticised by many other scientists, including several of the journal's editors. On 8 July Eos featured a detailed rebuttal of both papers by 13 scientists including Mann and Jones, presenting strong evidence that Soon and Baliunas had used improper statistical methods. Responding to

10625-411: The lowest and the highest quarter of values. assuming the values have been ordered, so is simply a specific example of a weighted mean for a specific set of weights. In some circumstances, mathematicians may calculate a mean of an infinite (or even an uncountable ) set of values. This can happen when calculating the mean value y avg {\displaystyle y_{\text{avg}}} of

10750-422: The mean and size of sample i {\displaystyle i} respectively. In other applications, they represent a measure for the reliability of the influence upon the mean by the respective values. Sometimes, a set of numbers might contain outliers (i.e., data values which are much lower or much higher than the others). Often, outliers are erroneous data caused by artifacts . In this case, one can use

10875-447: The mean for a moderately skewed distribution. It is used in hydrocarbon exploration and is defined as: where P 10 {\textstyle P_{10}} , P 50 {\textstyle P_{50}} and P 90 {\textstyle P_{90}} are the 10th, 50th and 90th percentiles of the distribution, respectively. Hans von Storch Hans von Storch (born 13 August 1949)

11000-424: The millennium, with 1998 the warmest year so far." Bradley was quoted as saying "Temperatures in the latter half of the 20th century were unprecedented", while Mann said "As you go back farther in time, the data becomes sketchier. One can't quite pin things down as well, but, our results do reveal that significant changes have occurred, and temperatures in the latter 20th century have been exceptionally warm compared to

11125-399: The new multivariate method of relating these series to the instrumental data is as good as the paper claims, it should be statistically reliable." He discussed some of the difficulties, and emphasised that "Each paleoclimatic discipline has to come to terms with its own limitations and must unreservedly admit to problems, warts and all." The study was disputed by contrarian Pat Michaels with

11250-465: The new policy, Storch resigned four days before he was due to take up his new position. Four other editors later left the journal. Storch later told the Chronicle of Higher Education that " climate science skeptics " “had identified Climate Research as a journal where some editors were not as rigorous in the review process as is otherwise common.” In late 2004, Storch's team published an article in

11375-499: The northern hemisphere average, and compared the instrumental record with the proxy reconstruction over the same period. It concluded that the "Little Ice Age" period was complex, with evidence suggesting the influence of volcanic eruptions. It showed that temperatures since the 1920s were higher than earlier in the 500-year period, an indication of other factors which could most probably be attributed to human caused changes increasing levels of greenhouse gases . This paper introduced

11500-517: The parameter m , the following types of means are obtained: This can be generalized further as the generalized f -mean and again a suitable choice of an invertible f will give The weighted arithmetic mean (or weighted average) is used if one wants to combine average values from different sized samples of the same population: Where x i ¯ {\displaystyle {\bar {x_{i}}}} and w i {\displaystyle w_{i}} are

11625-518: The past 10 years. That hasn't happened. In fact, the increase over the last 15 years was just 0.06 degrees Celsius (0.11 degrees Fahrenheit) -- a value very close to zero. This is a serious scientific problem that the Intergovernmental Panel on Climate Change (IPCC) will have to confront when it presents its next Assessment Report late next year." Hans von Storch, who also concurs with the mainstream view on global warming, said that

11750-428: The past 500 years gave independent support for this conclusion, which was compared against the independent (extra-tropical, warm-season) tree-ring density NH temperature reconstruction of Briffa 2000 . Its Figure 2.21 showed smoothed curves from the MBH99, Jones et al. and Briffa reconstructions, together with modern thermometer data as a red line and the grey shaded 95% confidence range from MBH99. Above it, figure 2.20

11875-428: The past five centuries supported the conclusion that 20th century warming was exceptional. In a perspective commenting on MBH99, Wallace Smith Broecker argued that the Medieval Warm Period (MWP) was global. He attributed recent warming to a roughly 1500-year cycle which he suggested related to episodic changes in the Atlantic's conveyor circulation . A March 2002 tree ring reconstruction by Jan Esper et al. noted

12000-407: The past millennium: inferences, uncertainties, and limitations to emphasise the increasing uncertainty involved in reconstructions of the period before 1400 when fewer proxies were available. A University of Massachusetts Amherst news release dated 3 March 1999 announced publication in the 15 March issue of Geophysical Research Letters , "strongly suggesting that the 1990s were the warmest decade of

12125-532: The past six centuries" was submitted to the journal Nature on 9 May 1997, accepted on 27 February 1998 and published on 23 April 1998. The paper announced a new statistical approach to find patterns of climate change in both time and global distribution, building on previous multiproxy reconstructions. The authors concluded that "Northern Hemisphere mean annual temperatures for three of the past eight years are warmer than any other year since (at least) AD1400", and estimated empirically that greenhouse gases had become

12250-477: The point that bristlecone pines from the Western U.S. could have been affected by pollution such as rising CO 2 levels as well as temperature. The temperature curve was supported by other studies, but most of these shared the limited well dated proxy evidence then available, and so few were truly independent. The uncertainties in earlier times rose as high as those in the reconstruction at 1980, but did not reach

12375-511: The power mean or Hölder mean, is an abstraction of the quadratic , arithmetic, geometric, and harmonic means. It is defined for a set of n positive numbers x i by x ¯ ( m ) = ( 1 n ∑ i = 1 n x i m ) 1 m {\displaystyle {\bar {x}}(m)=\left({\frac {1}{n}}\sum _{i=1}^{n}x_{i}^{m}\right)^{\frac {1}{m}}} By choosing different values for

12500-415: The preceding 900 years. Though substantial uncertainties exist in the estimates, these are nonetheless startling revelations." While the reconstruction supported theories of a relatively warm medieval period, Hughes said "even the warmer intervals in the reconstruction pale in comparison with mid-to-late 20th-century temperatures." The New York Times report had a colored version of the graph, distinguishing

12625-422: The predicted effects of greenhouse gas emissions. Reviewing twenty years of progress in palaeoclimatology, Jones noted the reconstructions by Jones et al. (1998), MBH99, Briffa (2000) and Crowley & Lowery (2000) showing good agreement using different methods, but cautioned that use of many of the same proxy series meant that they were not independent, and more work was needed. The Working Group 1 (WG1) part of

12750-426: The rarity of sunspots during the Maunder Minimum to Lamb's estimates of past climate, but had insufficient information to produce a quantitative assessment. The problem was reexamined by Bradley in collaboration with solar physicists Judith Lean and Juerg Beer , using the findings of Bradley & Jones 1993 . The Lean, Beer & Bradley 1995 paper confirmed that the drop in solar output appeared to have caused

12875-457: The reconstruction procedure was used with these pseudoproxies, the result was then compared with the original record or simulation to see how closely it had been reconstructed. The paper discussed the issue that regression methods of reconstruction tended to underestimate the amplitude of variation. While the IPCC Third Assessment Report (TAR) drew on five reconstructions to support its conclusion that recent Northern Hemisphere temperatures were

13000-503: The reconstruction useful for investigating natural variability and long-term oscillations as well as for comparisons with patterns produced by climate models. The CFR method made more use of climate information embedded in remote proxies, but was more dependent than CPS on assumptions that relationships between proxy indicators and large-scale climate patterns remained stable over time. Related rigorous statistical methods had been developed for tree ring data, with Harold C. Fritts publishing

13125-431: The reconstructions have been taken up by fossil fuel industry funded lobbying groups attempting to cast doubt on climate science. Paleoclimatology dates back to the 19th century, and the concept of examining varves in lake beds and tree rings to track local climatic changes was suggested in the 1930s. In the 1960s, Hubert Lamb generalised from historical documents and temperature records of central England to propose

13250-565: The region to produce a corrected version of this dataset. Their reconstruction using this corrected dataset passed the validation tests for the extended period, but they were cautious about the increased uncertainties. The Mann, Bradley and Hughes reconstruction covering 1,000 years (MBH99) was submitted in October 1998 to Geophysical Research Letters which published it in March 1999 with the cautious title Northern Hemisphere temperatures during

13375-467: The sum is taken over all possible values of the random variable and P ( x ) {\displaystyle P(x)} is the probability mass function . For a continuous distribution , the mean is ∫ − ∞ ∞ x f ( x ) d x {\displaystyle \textstyle \int _{-\infty }^{\infty }xf(x)\,dx} , where f ( x ) {\displaystyle f(x)}

13500-544: The surrounding region. Their study did not calibrate these proxy patterns against a quantitative temperature scale, and a new statistical approach was needed to find how they related to surface temperatures in order to reconstruct past temperature patterns. For his postdoctoral research Mann joined Bradley and tree ring specialist Malcolm K. Hughes to develop a new statistical approach to reconstruct underlying spatial patterns of temperature variation combining diverse datasets of proxy information covering different periods across

13625-419: The tank in 15 {\displaystyle 15} minutes. AM, GM, and HM satisfy these inequalities: Equality holds if all the elements of the given sample are equal. In descriptive statistics , the mean may be confused with the median , mode or mid-range , as any of these may incorrectly be called an "average" (more formally, a measure of central tendency ). The mean of a set of observations

13750-497: The temperatures of later thermometer data. They concluded that although the 20th century was almost certainly the warmest of the millennium, the amount of anthropogenic warming remains uncertain." With work progressing on the next IPCC report, Chris Folland told researchers on 22 September 1999 that a figure showing temperature changes over the millennium "is a clear favourite for the policy makers' summary". Two graphs competed: Jones et al. (1998) and MBH99. In November, Jones produced

13875-526: The two datasets and found that they tracked each other closely from 1400 to 1800, then diverged until around 1900 when they again tracked each other. He found a likely reason in the CO 2 " fertilisation effect " affecting tree rings as identified by Graybill and Idso, with the effect ending once CO 2 levels had increased to the point where warmth again became the key factor controlling tree growth at high altitude. Mann used comparisons with other tree ring data from

14000-405: The warmest in the past 1,000 years, it gave particular prominence to an IPCC illustration based on the MBH99 paper. The hockey stick graph was subsequently seen by mass media and the public as central to the IPCC case for global warming, which had actually been based on other unrelated evidence. From an expert viewpoint the graph was, like all newly published science, preliminary and uncertain, but it

14125-433: The whole period, with the 1990s "the warmest decade, and 1998 the warmest year, at moderately high levels of confidence." This was illustrated by the time series line graph Figure 2(a) which showed their reconstruction from AD 1000 to 1980 as a thin line, wavering around a thicker dark 40-year smoothed line. This curve followed a downward trend (shown as a thin dot-dashed line) from a Medieval Warm Period (about as warm as

was criticised as misleading. Briffa's paper as published in the January 2000 issue of Quaternary Science Reviews showed the unusual warmth of the last century, but cautioned that the impact of human activities on tree growth made it subtly difficult to isolate a clear climate message. In February 2000 Thomas J. Crowley and Thomas S. Lowery's reconstruction incorporated data not used previously. It reached

was "highly suggestive" of that inference. He said that "Our conclusion was that the warming of the past few decades appears to be closely tied to emission of greenhouse gases by humans and not any of the natural factors". Most proxy data are inherently imprecise, and Mann said "We do have error bars. They are somewhat sizable as one gets farther back in time, and there is reasonable uncertainty in any given year. There

was "not yet sufficiently easy to assess nor sufficiently integrated with indications from other data to be used in this report." A "schematic diagram" of global temperature variations over the last thousand years has been traced to a graph based loosely on Lamb's 1965 paper, nominally representing central England, modified by Lamb in 1982. Mike Hulme describes this schematic diagram as "Lamb's sketch on

was adapted from MBH99. Figure 5 in WG1 Technical Summary B (as shown to the right) repeated this figure without the linear trend line declining from AD 1000 to 1850. This iconic graph adapted from MBH99 was featured prominently in the WG1 Summary for Policymakers under a graph of the instrumental temperature record for the past 140 years. The text stated that it was "likely that, in the Northern Hemisphere,

was anomalous over the 300-year period, and went as far as speculating that these results supported the hypothesis that recent warming had human causes. Publicity over the concerns of scientists about the implications of global warming led to increasing public and political interest, and the Reagan administration, concerned in part about the political impact of scientific findings, successfully lobbied for

was exceptional. Bradley & Jones 1993 introduced the "Composite Plus Scaling" (CPS) method which, as of 2009, was still being used by most large-scale reconstructions. Their study was featured in the IPCC Second Assessment Report of 1995. In 1998 Michael E. Mann, Raymond S. Bradley and Malcolm K. Hughes developed new statistical techniques to produce Mann, Bradley & Hughes 1998 (MBH98),

was more widespread at high northern latitudes, and warned that it had to be taken into account to avoid overestimating past temperatures. Variations on the "Composite Plus Scale" (CPS) method continued to be used to produce hemispheric or global mean temperature reconstructions. From 1998 this was complemented by Climate Field Reconstruction (CFR) methods which could show how climate patterns had developed over large spatial areas, making

was needed at seasonal or annual resolution covering a wide geographical area to provide a framework for testing the part climate forcings had played in past variations, look for cycles in climate, and find if debated climatic events such as the Little Ice Age and Medieval Warm Period were global. Reconstructions were to be made of key climate systems, starting with three climatically sensitive regions:

was not unusual. In March they published an extended paper in Energy & Environment, with additional authors. The Bush administration's Council on Environmental Quality chief of staff Philip Cooney inserted references to the papers in the draft first Environmental Protection Agency Report on the Environment, and removed all references to reconstructions showing world temperatures rising over

was popularized by the climatologist Jerry Mahlman, to describe the pattern shown by the Mann, Bradley & Hughes 1999 (MBH99) reconstruction, envisaging a graph that is relatively flat with a downward trend to 1900 as forming an ice hockey stick's "shaft" followed by a sharp, steady increase corresponding to the "blade" portion. The reconstructions have featured in Intergovernmental Panel on Climate Change (IPCC) reports as evidence of global warming. Arguments over

was submitted to The Holocene on 16 October 1997; their revised manuscript was accepted on 3 February and published in May 1998. As Bradley recalls, Mann's initial view was that there was too little information and too much uncertainty to go back so far, but Bradley said "Why don't we try to use the same approach we used in Nature, and see if we could push it back a bit further?" Within a few weeks, Mann responded that to his surprise, "There

was widely used to publicise the issue of global warming, and it was targeted by those opposing ratification of the Kyoto Protocol on global warming. A literature review by Willie Soon and Sallie Baliunas, published in the relatively obscure journal Climate Research on 31 January 2003, used data from previous papers to argue that the Medieval Warm Period had been warmer than the 20th century, and that recent warming
