Climateprediction.net

Developer(s): Oxford University
Initial release: September 12, 2003
Development status: Active
Operating system: Cross-platform
Platform: BOINC
License: Proprietary [1]
Average performance: 78.8 TFLOPS [2]
Active users: 7,516
Total users: 305,577 [2]
Active hosts: 10,275 [2]
Total hosts: 652,792 [2]
Website: www.cpdn.org

climateprediction.net (CPDN) is a volunteer computing project to investigate and reduce uncertainties in climate modelling. It aims to do this by running hundreds of thousands of different models (a large climate ensemble) using the donated idle time of ordinary personal computers, thereby leading to a better understanding of how models are affected by small changes in the many parameters known to influence the global climate. [3]


The project relies on the BOINC framework: volunteer participants receive modelling tasks from the project's servers and run them on their own personal computers, returning the results when each task completes.

CPDN, which is run primarily by Oxford University in England, has harnessed more computing power and generated more data than any other climate modelling project. [4] It has produced over 100 million model years of data so far. [5] As of June 2016, there are more than 12,000 active participants from 223 countries with a total BOINC credit of more than 27 billion, reporting about 55 teraflops (55 trillion floating-point operations per second) of processing power. [6]

Aims

IPCC graphic of uncertainty ranges with various models over time. climateprediction.net is aiming to reduce the ranges and produce better probability information.

The aim of the climateprediction.net project is to investigate the uncertainties in the many parameterizations that have to be made in state-of-the-art climate models. [7] The model is run thousands of times with slight perturbations to various physics parameters (a 'large ensemble'), and the project examines how the model output changes. These parameters are not known exactly, and the perturbations are kept within what is subjectively considered to be a plausible range. This allows the project to improve understanding of how sensitive the models are to small changes in such parameters, and also to changes in factors such as carbon dioxide levels and the sulphur cycle. In the past, estimates of climate change have had to be made using one or, at best, a very small ensemble (tens rather than thousands) of model runs. By using participants' computers, the project can improve understanding of, and confidence in, climate change predictions more than would ever be possible using the supercomputers currently available to scientists.
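The perturbed-parameter idea can be sketched in a few lines of Python. This is a toy illustration only: the parameter names, ranges, and the stand-in "model" below are invented placeholders, not the project's actual HadSM3 parameters or physics.

```python
import random

# Hypothetical perturbed parameters with assumed plausible-range bounds.
PARAM_RANGES = {
    "entrainment_coef": (0.6, 9.0),
    "ice_fall_speed_m_s": (0.5, 2.0),
    "critical_rel_humidity": (0.6, 0.9),
}

def perturbed_member(rng):
    """Draw one ensemble member: one value for each perturbed parameter."""
    return {name: rng.uniform(lo, hi) for name, (lo, hi) in PARAM_RANGES.items()}

def toy_model(params):
    """Stand-in for a full climate model run; returns a fake 'sensitivity'."""
    return 2.0 + 0.3 * params["entrainment_coef"]

rng = random.Random(42)
ensemble = [perturbed_member(rng) for _ in range(1000)]
sensitivities = [toy_model(p) for p in ensemble]
```

In the real project each "run" is a multi-year climate simulation on a volunteer's machine; the principle of sampling parameters within plausible bounds and examining the spread of outputs is the same.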

The climateprediction.net experiment is intended to help "improve methods to quantify uncertainties of climate projections and scenarios, including long-term ensemble simulations using complex models", identified by the Intergovernmental Panel on Climate Change (IPCC) in 2001 as a high priority. It is hoped that the experiment will give decision makers a better scientific basis for addressing one of the biggest potential global problems of the 21st century.

As shown in the graph above, the various models have a fairly wide distribution of results over time. For each curve, on the far right, there is a bar showing the final temperature range for the corresponding model version. The further into the future the model is extended, the wider the variance between them. Roughly half of the variation depends on the future climate forcing scenario rather than on uncertainties in the model. Any reduction in those variations, whether from better scenarios or from improvements in the models, would be valuable. climateprediction.net is working on the model uncertainties, not the scenarios.

Currently, scientists can run models and see that x% of the models warm y degrees in response to z climate forcings, but are uncertain as to whether x% is a good representation of the probability of that happening in the real world. Some models will be good and some poor at reproducing past climate when given past climate forcings and initial conditions (a hindcast). It makes sense to trust the models that do well at recreating the past more than those that do poorly. Therefore, models that do poorly are given less weight (down-weighted). [3]
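One simple way to turn hindcast skill into weights is to down-weight members by their hindcast error. The Gaussian-style scheme below is an illustrative assumption, not the project's actual weighting method:

```python
import math

def hindcast_weights(rmse_list, scale=1.0):
    """Assign each ensemble member a weight that decays with its hindcast
    RMSE: weight ~ exp(-(RMSE/scale)^2), normalised to sum to 1.
    'scale' sets how harshly poor hindcasts are penalised (assumed value)."""
    raw = [math.exp(-(e / scale) ** 2) for e in rmse_list]
    total = sum(raw)
    return [w / total for w in raw]

# Members with RMSE 0.3, 0.5, and 1.2 against observations:
weights = hindcast_weights([0.3, 0.5, 1.2])
```

The member with the worst hindcast (RMSE 1.2) receives the smallest weight, so it contributes least to any probability estimate built from the ensemble.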

The experiments

climateprediction.net screensaver under BOINC 5.4.9

The different models that climateprediction.net has distributed, and plans to distribute, are detailed below in chronological order. Anyone who has joined recently is therefore likely to be running the transient coupled model.


History

Myles Allen first thought about the need for large climate ensembles in 1997, but was only introduced to the success of SETI@home in 1999. The first funding proposal in April 1999 was rejected as utterly unrealistic.

Following a presentation at the World Climate Conference in Hamburg in September 1999 and a commentary in Nature [15] in October 1999, thousands signed up to this supposedly imminently available program. The dot-com bubble bursting did not help and the project realised they would have to do most of the programming themselves rather than outsourcing.

It was launched September 12, 2003, and on September 13, 2003, the project exceeded the capacity of the Earth Simulator to become the world's largest climate modelling facility.

The 2003 launch offered only a Windows "classic" client. On 26 August 2004 a BOINC client was launched which supported Windows, Linux and Mac OS X. "Classic" was to remain available for a number of years in support of the Open University course. The project stopped distributing classic models under BOINC in favour of sulfur cycle models. A more user-friendly BOINC client and website called GridRepublic, which supports climateprediction.net and other BOINC projects, was released in beta in 2006.

A thermohaline circulation slowdown experiment was launched in May 2004 under the classic framework to coincide with the film The Day After Tomorrow . This program can still be run but is no longer downloadable. The scientific analysis has been written up in Nick Faull's thesis. A paper about the thesis is still to be completed. There is no further planned research with this model.

A sulfur cycle model was launched in August 2005. These models took longer to complete than the original models because they had five phases instead of three, and each timestep was also more complicated.

By November 2005, the number of completed results totalled 45,914 classic models, 3,455 thermohaline models, 85,685 BOINC models and 352 sulfur cycle models. This represented over 6 million model years processed.

In February 2006, the project moved on to more realistic climate models. The BBC Climate Change Experiment [16] was launched, attracting around 23,000 participants on the first day. The transient climate simulation introduced realistic oceans. This allowed the experiment to investigate changes in the climate response as the climate forcings are changed, rather than an equilibrium response to a significant change like doubling the carbon dioxide level. Therefore, the experiment has now moved on to doing a hindcast of 1920 to 2000 as well as a forecast of 2000 to 2080. This model takes much longer.

The BBC gave the project publicity with over 120,000 participating computers in the first three weeks.

In March 2006, a high resolution model was released as another project, the Seasonal Attribution Project.

In April 2006, the coupled models were found to have a data input problem. The work was useful for a different purpose than advertised. New models had to be handed out. [17] [18]

Results to date

The first results of the experiment were published in Nature in January 2005, showing that with only slight changes to the parameters within plausible ranges, the models can show climate sensitivities from less than 2 °C to more than 11 °C. [19] The higher climate sensitivities have been challenged as implausible, for example by Gavin Schmidt, a climate modeller with the NASA Goddard Institute for Space Studies in New York. [20]

Explanation

Climate sensitivity is defined as the equilibrium response of global mean temperature to doubling levels of carbon dioxide. Current levels of carbon dioxide are around 420 ppm and growing at a rate of 1.8 ppm per year compared with preindustrial levels of 280 ppm.[ citation needed ]

Climate sensitivities of greater than 5 °C are widely accepted as being catastrophic. [21] The possibility of such high sensitivities being plausible given observations had been reported prior to the climateprediction.net experiment but "this is the first time GCMs have produced such behaviour". [19]

Even the models with very high climate sensitivity were found to be "as realistic as other state-of-the-art climate models". The test of realism was done with a root mean square error test. This does not check the realism of seasonal changes, and it is possible that more diagnostic measures would place stronger constraints on what is realistic. Improved realism tests are being developed.
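The root mean square error check can be sketched as follows; the pass threshold here is an assumed illustrative value, not the project's actual criterion:

```python
import math

def rmse(model, obs):
    """Root mean square error between model output and observations."""
    assert len(model) == len(obs)
    return math.sqrt(sum((m - o) ** 2 for m, o in zip(model, obs)) / len(model))

def passes_realism_check(model, obs, threshold=1.0):
    """A member 'passes' if its RMSE against the observed climatology is
    below the threshold (threshold value is an assumption)."""
    return rmse(model, obs) < threshold
```

Because RMSE compares time-averaged fields, a model can score well overall while still getting the seasonal cycle wrong, which is exactly the weakness noted above.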

It is important to the experiment and the goal of obtaining a probability distribution function (pdf) of climate outcomes to get a very wide range of behaviours even if only to rule out some behaviours as unrealistic. Larger sets of simulations have more reliable pdfs. Therefore, models with climate sensitivities as high as 11 °C are included despite their limited accuracy. The sulfur cycle experiment is likely to extend the range downwards.

Piani et al. (2005)

Published in Geophysical Research Letters, this paper concludes: [22]

When an internally consistent representation of the origins of model-data discrepancy is used to calculate the probability density function of climate sensitivity, the 5th and 95th percentiles are 2.2 K and 6.8 K respectively. These results are sensitive, particularly the upper bound, to the representation of the origins of model data discrepancy.
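Percentiles like the 5th and 95th quoted above are read off the ensemble's distribution of sensitivity estimates. A minimal interpolating percentile function is shown below; the paper's actual method additionally weights members through an error model, which this sketch omits:

```python
def percentile(samples, q):
    """q-th percentile (0-100) of a list of values, by linear
    interpolation between the sorted sample points."""
    s = sorted(samples)
    if len(s) == 1:
        return s[0]
    pos = (q / 100.0) * (len(s) - 1)
    lo = int(pos)
    hi = min(lo + 1, len(s) - 1)
    frac = pos - lo
    return s[lo] * (1 - frac) + s[hi] * frac

# Given a list of per-member sensitivity estimates, the reported bounds are
# percentile(sensitivities, 5) and percentile(sensitivities, 95).
```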

Use in education

There is an Open University short course [8] and teaching material [23] available for schools to teach subjects relating to climate and climate modelling. There is also teaching material available for use in Key Stage 3/4 Science, A level Physics (Advanced Physics), Key Stage 3/4 Mathematics, Key Stage 3/4 Geography, 21st Century Science, Science for Public Understanding, Use of Mathematics, Primary.

The original model

The original experiment is run with HadSM3, which is the HadAM3 atmosphere from the HadCM3 model but with only a "slab" ocean rather than a full dynamic ocean. This is faster (and requires less memory) than the full model, but lacks dynamical feedbacks from the ocean, which are incorporated into the full coupled-ocean-atmosphere models used to make projections of climate change out to 2100.

Each downloaded model comes with a slight variation in the various model parameters.

In the initial "calibration phase" of 15 model years, the model calculates the "flux correction": extra ocean-atmosphere fluxes that are needed to keep the model ocean in balance (the model ocean does not include currents; these fluxes to some extent replace the heat that would be transported by the missing currents).
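Conceptually, the flux correction is the climatological (multi-year monthly mean) heat-flux imbalance recorded while the ocean temperatures are held fixed. A schematic version for a single location (the real calculation is done per grid cell inside the model):

```python
def flux_correction(monthly_imbalance, months=12):
    """Average the recorded ocean-atmosphere flux imbalance month-by-month
    across the calibration years. Input is a flat list, year after year,
    'months' values per year; output is one correction value per month."""
    correction = []
    for m in range(months):
        vals = monthly_imbalance[m::months]  # this calendar month, all years
        correction.append(sum(vals) / len(vals))
    return correction
```

The resulting 12 monthly values are then applied as a fixed correction in the later phases so that the slab ocean stays in balance without simulated currents.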

In the "control phase" of 15 years, the ocean temperatures are allowed to vary. The flux correction ought to keep the model stable, but feedbacks developed in some of the runs. There is a quality control check, based on the annual mean temperatures, and models which fail this check are discarded.
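A stability check of this kind can be sketched as a trend test on the control-phase annual means. Both the least-squares form and the 0.02 °C/year threshold below are assumptions for illustration, not the project's actual criterion:

```python
def passes_control_check(annual_means, max_drift=0.02):
    """Fit a least-squares straight line to the control-phase annual mean
    temperatures and reject the run if the trend (degC per model year)
    exceeds max_drift (threshold value assumed)."""
    n = len(annual_means)
    t_bar = (n - 1) / 2.0
    x_bar = sum(annual_means) / n
    num = sum((t - t_bar) * (x - x_bar) for t, x in enumerate(annual_means))
    den = sum((t - t_bar) ** 2 for t in range(n))
    return abs(num / den) <= max_drift
```

A run whose control climate stays flat passes; one that steadily warms or cools despite the flux correction is discarded, as described above.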

In the "double CO2 phase", the CO2 content is instantaneously doubled and the model run for a further 15 years, which in some cases is not quite sufficient model time to settle down to a new (warmer) equilibrium. In this phase some models which produced physically unrealistic results were again discarded.

The quality control checks in the control and 2*CO2 phases were quite weak: they suffice to exclude obviously unphysical models but do not include (for example) a test of the simulation of the seasonal cycle; hence some of the models passed may still be unrealistic. Further quality control measures are being developed.

The temperature in the doubled-CO2 phase is exponentially extrapolated to estimate the equilibrium temperature. The difference in temperature between this and the control phase then gives a measure of the climate sensitivity of that particular version of the model.
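The extrapolation step can be illustrated with a toy calculation. If the temperature approaches equilibrium exponentially, the successive annual increments shrink geometrically, and the remaining warming is the sum of that geometric series (this is a simplified stand-in; the project's actual fitting procedure may differ):

```python
def extrapolate_equilibrium(temps):
    """Estimate the equilibrium temperature from a truncated series
    T(t) = T_eq - A * r**t, using the last three annual means.
    The increments d(t) = T(t) - T(t-1) decay by the factor r each year,
    so the warming still to come is d * r / (1 - r)."""
    d1 = temps[-2] - temps[-3]
    d2 = temps[-1] - temps[-2]
    r = d2 / d1                      # per-year decay ratio of the increments
    assert 0 < r < 1, "series must be converging"
    return temps[-1] + d2 * r / (1 - r)
```

For a synthetic series converging to 4 °C of warming, the function recovers 4 °C from only the first ten years, which is the point of the extrapolation: the 15-year doubled-CO2 phase need not reach equilibrium itself.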

Visualisations

CPView

Many volunteer computing projects have screensavers that visually indicate the activity of the application, but they do not usually show results as they are being calculated. By contrast, climateprediction.net not only uses a built-in visualisation to show the climate of the world being modelled, but makes it interactive, allowing different aspects of climate (temperature, rainfall, etc.) to be displayed. In addition, there are other, more advanced visualisation programs that allow the user to see more of what the model is doing (usually by analysing previously generated results) and to compare different runs and models.

The real-time desktop visualisation for the model launched in 2003 was developed [24] by Jeremy Walton at NAG, enabling users to track the progress of their simulation as the cloud cover and temperature changes over the surface of the globe. Other, more advanced visualisation programs in use include CPView and IDL Advanced Visualisation. They have similar functionality. CPView was written by Martin Sykes, a participant in the experiment. The IDL Advanced Visualisation was written by Andy Heaps of the University of Reading (UK), and modified to work with the BOINC version by Tesella Support Services plc.

Only CPView allows the user to look at unusual diagnostics, beyond the usual temperature, pressure, rainfall, snow, and clouds. [25] Up to five sets of data can be displayed on a map. It also has a wider range of functions, such as maximum, minimum, and further memory functions, among other features.

The Advanced Visualisation has functions for graphs of local areas and over 1 day, 2 days, and 7 days, as well as the more usual graphs of season and annual averages (which both packages do). There are also Latitude - Height plots and Time - Height plots.

The download size is much smaller for CPView and CPView works with Windows 98.

As of December 2008 there is no visualisation tool that works with the newer CPDN models. Neither CPView nor Advanced Visualisation has been updated to display data gathered from those models, so users can only visualise the data through the screensaver.

BBC Climate Change Experiment

The BBC Climate Change Experiment was a BOINC project led by Oxford University with several partners, including the UK Met Office, the BBC, the Open University and Reading University. It ran the transient coupled model of the climateprediction.net project.

Many participants joined the project with over 120,000 people signing up in teams. [26]

Results continued to be collected for some time with the follow-up television program being aired in January 2007. On 8 March 2009, climateprediction.net officially declared that BBC Climate Change Experiment was finished, before shutting down the project. [27]

See also

Berkeley Open Infrastructure for Network Computing (BOINC)
Climate ensemble
Climate sensitivity
Cloud feedback
Ensemble forecasting
General circulation model
Grid computing
GridRepublic
HadCM3
World Community Grid

References

  1. "Licence agreement". www.climateprediction.net. Archived from the original on 2020-09-26. Retrieved 2020-08-07.
  2. "Project status". Climateprediction. 29 January 2020. Archived from the original on 27 August 2022. Retrieved 29 January 2020.
  3. "About the project". Climateprediction.net. Archived from the original on 2011-02-23. Retrieved 2011-02-20.
  4. "BBC quote of Nick Faull". Bbc.co.uk. 2007-01-21. Archived from the original on 2009-02-02. Retrieved 2011-02-20.
  5. "climate prediction.net". tessella.com. Archived from the original on 2009-02-27. Retrieved 2010-12-13.
  6. "Detailed user, host, team and country statistics with graphs for BOINC". boincstats.com. Archived from the original on 2008-12-18. Retrieved 2016-06-29.
  7. "Modelling The Climate". Climateprediction.net. Archived from the original on 2009-02-04. Retrieved 2011-02-20.
  8. "Open University short course". climateprediction.net. Archived from the original on 28 April 2007.
  9. "About THC". Climateprediction.net. Archived from the original on 2009-02-27. Retrieved 2011-02-20.
  10. "Sulphur Cycle Experiment". climateprediction.net. Archived from the original on 2009-02-18. Retrieved 2011-02-20.
  11. "Sulfur Cycle". Climateprediction.net. Archived from the original on 2009-02-18. Retrieved 2011-02-20.
  12. "Project Stats". Climateapps2.oucs.ox.ac.uk. Archived from the original on 2011-07-02. Retrieved 2011-02-20.
  13. "Strategy - see experiment 2". Climateprediction.net. Archived from the original on 2009-08-23. Retrieved 2011-02-20.
  14. "SAP - About". Archived from the original on March 6, 2008.
  15. "Do-it-yourself climate prediction" (PDF). 14 October 1999. Archived from the original (PDF) on 2005-04-06. Retrieved 2005-07-31.
  16. "climateprediction.net - BBC Climate Change Experiment". Bbc.cpdn.org. 2007-05-20. Archived from the original on 2011-02-24. Retrieved 2011-02-20.
  17. Allen, Myles (19 April 2006). "Message From Principal Investigator Re: April 2006 Problem". climateprediction.net message board. Archived from the original on 2011-05-25. Retrieved 2011-02-20.
  18. "Potted History" (PDF). Archived from the original (PDF) on 2010-12-01. Retrieved 2011-02-20.
  19. Stainforth, D. A.; Aina, T.; Christensen, C.; Collins, M.; Faull, N.; Frame, D. J.; Kettleborough, J. A.; Knight, S.; Martin, A.; Murphy, J. M.; Piani, C.; Sexton, D.; Smith, L. A.; Spicer, R. A.; Thorpe, A. J.; Allen, M. R. (January 2005). "Uncertainty in predictions of the climate response to rising levels of greenhouse gases". Nature. 433 (7024): 403–406. Bibcode:2005Natur.433..403S. doi:10.1038/nature03301. PMID 15674288. S2CID 2547937.
  20. "Climate Less Sensitive to Greenhouse Gases Than Predicted, Study Says". News.nationalgeographic.com. 2010-10-28. Archived from the original on 2011-06-04. Retrieved 2011-02-20.
  21. Xu, Yangyang; Ramanathan, Veerabhadran (26 September 2017). "Well below 2 °C: Mitigation strategies for avoiding dangerous to catastrophic climate changes". Proceedings of the National Academy of Sciences of the United States of America. 114 (39): 10315–10323. Bibcode:2017PNAS..11410315X. doi: 10.1073/pnas.1618481114 . PMC   5625890 . PMID   28912354.
  22. Piani, C.; Frame, D. J.; Stainforth, D. A.; Allen, M. R. (2005). "Constraints on climate change from a multi-thousand member ensemble of simulations" (PDF). Geophysical Research Letters. 32 (23): L23825. Bibcode:2005GeoRL..3223825P. doi:10.1029/2005GL024452. S2CID   56227360. Archived from the original (PDF) on 2012-02-09.
  23. "Resources for Schools". climateprediction.net. Archived from the original on 2005-10-16. Retrieved 2005-07-21.
  24. J.P.R.B. Walton; D. Frame; D.A. Stainforth. O. Deussen; C. Hansen; D. Keim; D. Saupe (eds.). "Visualization for Public-Resource Climate Modeling" (PDF). Data Visualization 2004: 103–108. Archived from the original (PDF) on 2006-10-18. Retrieved 2006-07-31.
  25. "Data Index". Users.globalnet.co.uk. 2004-08-17. Archived from the original on 2011-06-05. Retrieved 2011-02-20.
  26. "Home | BOINCstats/BAM!". www.boincstats.com. Archived from the original on December 14, 2010.
  27. "BBC Experiment Finished". climateprediction.net Official Website Project News. Archived 2010-01-05 at the Wayback Machine.