Climate Modelling

Climate models are computer programs that perform mathematical simulations of the climate system involving the land, atmosphere, ocean, and ice-covered regions around the globe.

In the simplest terms, modelling involves the following steps:

Source Climate Information

  1. Measure the temperature above the land and ocean across thousands of locations around the planet
  2. Compare these temperatures with what is normal for each location – the differences being called ‘anomalies’. The ‘normal’ level for each location is the long-term average for that area over a base period
  3. Having divided the earth (land and sea) into 3D grid boxes, calculate the average temperature for each box by combining data from all the available measurement sources
  4. Take the average of all the temperature anomalies and compare it with previous years to determine changes in global warming (a small numerical sketch of these steps follows the list)
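As a rough illustration of steps 1–4, the sketch below computes anomalies for a handful of invented station readings, averages them within coarse latitude–longitude boxes, and combines the box averages into a single global figure, weighting each box by the cosine of its latitude as a simple stand-in for the area it represents. The station values, baselines and 5-degree box size are all assumptions for illustration; real products use vastly more data and much more careful processing (homogenisation, infilling of sparse regions, and so on).

```python
# Minimal sketch of the anomaly calculation described above.
# Station locations, baselines and readings are invented for illustration.
import math
from collections import defaultdict

# (lat, lon, observed_temp_C, baseline_temp_C) -- the baseline is the
# long-term average for that location over a base period.
observations = [
    ( 51.5,  -0.1, 12.3, 11.6),   # hypothetical London-area reading
    ( 40.7, -74.0, 14.1, 13.2),   # hypothetical New York-area reading
    (-33.9, 151.2, 18.4, 18.0),   # hypothetical Sydney-area reading
]

GRID_DEG = 5.0  # assumed size of each surface grid box, in degrees

def grid_box(lat, lon, size=GRID_DEG):
    """Map a location to the grid box that contains it."""
    return (math.floor(lat / size), math.floor(lon / size))

# Steps 1-2: anomaly = observation minus the local 'normal'.
# Step 3:    average the anomalies of all observations in the same box.
box_anomalies = defaultdict(list)
for lat, lon, obs, baseline in observations:
    box_anomalies[grid_box(lat, lon)].append((lat, obs - baseline))

# Step 4: combine the box averages into one global figure, weighting each
# box by cos(latitude) so that small high-latitude boxes do not dominate.
weighted_sum = total_weight = 0.0
for box, entries in box_anomalies.items():
    mean_lat = sum(lat for lat, _ in entries) / len(entries)
    box_mean = sum(anom for _, anom in entries) / len(entries)
    weight = math.cos(math.radians(mean_lat))
    weighted_sum += weight * box_mean
    total_weight += weight

global_anomaly = weighted_sum / total_weight
print(f"Global mean anomaly: {global_anomaly:+.2f} degC")
```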

In practice the procedure is anything but simple. The 3D grid cells must be fed with climate-related physical information about each location, and mathematical equations devised to represent the key physical, chemical, geological and biological processes. The equations are then translated into computer code, ensuring that elements of the model can talk to each other across climate components (such as the atmosphere, clouds and ocean) and between grid cells. The model is then run, simulating what happens in each grid cell at one time and then stepwise through time – the shorter each time step, or the smaller the grid cells, the more computing power is needed. To validate the model the outcomes are compared with observational data, and modellers can then adjust (‘tune’) parameters to provide a better match with reality.
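The toy sketch below is not a climate model, but it shows the same loop structure described above: a grid of cells, each updated from its neighbours at one time step, then stepped forward through time. The grid size, number of steps and diffusion constant are arbitrary assumptions; the point is that the work grows with the number of cells multiplied by the number of steps, which is why finer grids and shorter time steps demand more computing power.

```python
# Toy illustration of stepping a gridded model forward in time.
# It simply diffuses temperature between neighbouring cells of a 1D grid,
# but the loop structure (update every cell, then advance one time step)
# mirrors how a gridded simulation is run.
import numpy as np

n_cells = 60          # grid resolution: more cells -> more work per step
n_steps = 500         # shorter time steps -> more steps for the same period
diffusivity = 0.2     # how strongly neighbouring cells exchange heat

# Initial condition: a warm band in the middle of an otherwise cool grid.
temperature = np.full(n_cells, 10.0)
temperature[25:35] = 20.0

for step in range(n_steps):
    # Each cell exchanges heat with its two neighbours; np.roll makes the
    # grid wrap around at the ends, like lines of longitude around the globe.
    left = np.roll(temperature, 1)
    right = np.roll(temperature, -1)
    temperature = temperature + diffusivity * (left + right - 2 * temperature)

print(f"Mean after {n_steps} steps: {temperature.mean():.2f}  "
      f"max: {temperature.max():.2f}")
```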

So there are specific issues that need to be confronted:

  • The smaller the grid size, the greater the computational power needed, so at some stage assumptions have to be made about what happens at sub-grid level such as the influence of clouds on the passage of heat.
  • Grids are not uniform in shape and become elongated and thinner – more like pizzas – the further they are from the earth’s surface. Sub-grid assumptions may need to be made to cater for moist convection (atmospheric convection in which the phase changes of water play an appreciable role) through these thinner grids.
(Figure: ‘Clouds in GCMs – representing sub-grid heterogeneity’. Many observed clouds, and especially the processes within them, are of sub-grid-scale size both horizontally and vertically, while a GCM grid cell is 10–300 km across. For more information on sub-grid cloud cover go here.)
  • In order to start a model run, the simulation at step one needs to be initialised by specifying the condition of the atmosphere and ocean with parameters such as temperature, salinity, sea level and wind speed. But these vary from one time period to another, so outcomes will change depending on the initial values chosen. Initialisation therefore relies more on the gross features of the climate system, using long-term averages of each variable.
  • Once a model configuration is fixed, it has to be ‘tuned’, which consists of choosing parameter values in such a way that a certain measure of the deviation of the model output from selected observations or theory is minimized or reduced to an acceptable range (a small sketch of this step follows the list). The trouble is that by adjusting parameters modellers also compensate, intentionally or not, for some (often unknown) deficiencies in the model formulation itself.
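The sketch below illustrates the tuning idea from the last bullet: try candidate values of a single parameter and keep the one that minimises the deviation (here a root-mean-square error) between the model output and selected observations. The ‘model’, the ‘observations’ and the candidate range are all invented for illustration; real tuning involves many parameters, many diagnostics and expert judgement.

```python
# Sketch of 'tuning': pick the parameter value that minimises the mismatch
# between model output and observations. Everything here is invented.
import numpy as np

years = np.arange(2000, 2021)
# Hypothetical observed anomalies: a warming trend plus some noise.
observed = 0.02 * (years - 2000) + np.random.default_rng(0).normal(0, 0.05, years.size)

def toy_model(sensitivity):
    """A stand-in model: a warming trend scaled by one tunable parameter."""
    return sensitivity * (years - 2000) / 100.0

def rmse(a, b):
    """Root-mean-square deviation between two series."""
    return float(np.sqrt(np.mean((a - b) ** 2)))

candidates = np.linspace(0.5, 4.0, 36)          # candidate parameter values
scores = [rmse(toy_model(s), observed) for s in candidates]
best = candidates[int(np.argmin(scores))]

print(f"Tuned parameter: {best:.2f} (RMSE {min(scores):.3f})")
```

The caveat in the bullet applies to this sketch too: if the toy model’s functional form were wrong, the tuned parameter would silently absorb that structural error rather than reveal it.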

Given that no one model can be ‘correct’, different models from research groups around the world are used to create an ‘ensemble’ under the Coupled Model Intercomparison Project (CMIP), with CMIP6 providing the basis for AR6 – the Sixth Assessment Report of the IPCC – where the results are ‘averaged’. A video providing a short Introduction to Climate Models – CMIP & CMIP6 is available at the World Climate Research Programme (WCRP).
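As a minimal sketch of what an ensemble ‘average’ means, the snippet below takes projections of the same quantity from several models and reports their mean and spread. The model names and warming figures are invented; in practice CMIP ensembles combine dozens of models, grid cell by grid cell and time step by time step, and typically report the spread as an uncertainty range.

```python
# Sketch of a multi-model ensemble mean in the spirit of CMIP.
# Model names and projected warming figures are invented for illustration.
import statistics

# Projected warming (degC) for the same scenario from different models.
projections = {
    "model_A": 2.6,
    "model_B": 3.1,
    "model_C": 2.2,
    "model_D": 3.4,
}

ensemble_mean = statistics.mean(projections.values())
ensemble_spread = statistics.stdev(projections.values())

print(f"Ensemble mean: {ensemble_mean:.2f} degC "
      f"(spread +/- {ensemble_spread:.2f} degC)")
```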

The WCRP is provided with scientific guidance by the Joint Scientific Committee (JSC), comprising scientists drawn from the World Meteorological Organization (WMO), the International Science Council (ISC) and the Intergovernmental Oceanographic Commission (IOC). The WCRP manages two main activities – Core Projects and Lighthouse Activities – and it is within the Core Projects that CMIP sits, under Earth System Modelling and Observations (ESMO).

Much more information about all the main organisational elements of the WCRP is available here, where there are links to the respective websites via the square grids attached to each concept.

Organisation of the Coupled Model Intercomparison Project (CMIP)

Those wishing to be involved in CMIP are provided with a Guide for Modelers, which takes them through the various stages from registration and selection of model intercomparison projects to running the model and storing the output for access by others.

For more information, a concept map depicting all the main project stages of CMIP is available here, where the respective websites can be reached by clicking on the square grids on each concept.
