Nested Grid Model

Article snapshot taken from Wikipedia, under the Creative Commons Attribution-ShareAlike license.

Numerical weather prediction (NWP) uses mathematical models of the atmosphere and oceans to predict the weather based on current weather conditions. Though first attempted in the 1920s, it was not until the advent of computer simulation in the 1950s that numerical weather predictions produced realistic results. A number of global and regional forecast models are run in different countries worldwide, using current weather observations relayed from radiosondes, weather satellites and other observing systems as inputs.

The Nested Grid Model (usually known as NGM for short) was a numerical weather prediction model run by the National Centers for Environmental Prediction, a division of the National Weather Service, in the United States. The NGM was, as its name suggested, derived from two levels of grids: a hemispheric-scale grid and a synoptic-scale grid, the latter of which had a resolution of approximately 90 kilometers. Its most notable feature

A prognostic step that solves the initial value problem. He also identified seven variables that defined the state of the atmosphere at a given point: pressure, temperature, density, humidity, and the three components of the flow velocity vector. Bjerknes pointed out that equations based on mass continuity, conservation of momentum, the first and second laws of thermodynamics, and

A 3D-Var data assimilation scheme in mid-1999. The Canadian Meteorological Centre has been running a global model since 1991. The United States ran the Nested Grid Model (NGM) from 1987 to 2000, with some features lasting as late as 2009. Between 2000 and 2002, the Environmental Modeling Center ran the Aviation (AVN) model for shorter range forecasts and the Medium Range Forecast (MRF) model at longer time ranges. During this time,

A 6-hour forecast for the state of the atmosphere over two points in central Europe, taking at least six weeks to do so. His forecast calculated that the change in surface pressure would be 145 millibars (4.3 inHg), an unrealistic value incorrect by two orders of magnitude. The large error was caused by an imbalance in the pressure and wind velocity fields used as the initial conditions in his analysis. The first successful numerical prediction

A challenge, since statistical methods continue to show higher skill over dynamical guidance. The first ocean wave models were developed in the 1960s and 1970s. These models had the tendency to overestimate the role of wind in wave development and underplayed wave interactions. A lack of knowledge concerning how waves interacted among each other, assumptions regarding a maximum wave height, and deficiencies in computer power limited

A challenge, since statistical methods continue to show higher skill over dynamical guidance. On a molecular scale, there are two main competing reaction processes involved in the degradation of cellulose, or wood fuels, in wildfires. When there is a low amount of moisture in a cellulose fiber, volatilization of the fuel occurs; this process will generate intermediate gaseous products that will ultimately be

A coarse grid that leaves smaller-scale interactions unresolved. The transfer of energy between the wind blowing over the surface of an ocean and the ocean's upper layer is an important element in wave dynamics. The spectral wave transport equation is used to describe the change in wave spectrum over changing topography. It simulates wave generation, wave movement (propagation within a fluid), wave shoaling, refraction, energy transfer between waves, and wave dissipation. Since surface winds are

A few regional models use spectral methods for the horizontal dimensions and finite-difference methods in the vertical. The National Meteorological Center's Global Spectral Model was introduced during August 1980. The European Centre for Medium-Range Weather Forecasts model debuted on May 1, 1985. The United Kingdom Met Office has been running their global model since the late 1980s, adding

A few regional models use spectral methods for the horizontal dimensions and finite-difference methods in the vertical. These equations are initialized from the analysis data and rates of change are determined. These rates of change predict the state of the atmosphere a short time into the future; the time increment for this prediction is called a time step. This future atmospheric state

A fixed receiver, as well as from weather satellites. The World Meteorological Organization acts to standardize the instrumentation, observing practices and timing of these observations worldwide. Stations either report hourly in METAR reports, or every six hours in SYNOP reports. These observations are irregularly spaced, so they are processed by data assimilation and objective analysis methods, which perform quality control and obtain values at locations usable by
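
To make the objective-analysis step concrete, here is a minimal sketch of a single-pass, Cressman-style weighting that maps irregular station reports onto a regular grid. The station coordinates, values, and influence radius are all hypothetical, and operational systems use far more sophisticated variational schemes:

```python
import numpy as np

def cressman_analysis(obs_xy, obs_vals, grid_x, grid_y, radius=500.0):
    """Single-pass Cressman-style objective analysis (illustrative sketch).

    obs_xy   : (N, 2) observation coordinates (km)
    obs_vals : (N,) observed values, e.g. surface pressure (hPa)
    radius   : influence radius (km); more distant observations get zero weight
    """
    analysis = np.full((grid_y.size, grid_x.size), np.nan)
    for j, gy in enumerate(grid_y):
        for i, gx in enumerate(grid_x):
            d2 = (obs_xy[:, 0] - gx) ** 2 + (obs_xy[:, 1] - gy) ** 2
            w = np.maximum((radius**2 - d2) / (radius**2 + d2), 0.0)
            if w.sum() > 0:
                analysis[j, i] = np.sum(w * obs_vals) / w.sum()
    return analysis

# Three hypothetical stations reporting surface pressure (hPa)
stations = np.array([[100.0, 200.0], [400.0, 350.0], [250.0, 80.0]])
pressures = np.array([1012.0, 1008.5, 1015.2])
grid = cressman_analysis(stations, pressures,
                         np.linspace(0, 500, 6), np.linspace(0, 500, 6))
```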

A full three-dimensional treatment of combustion via direct numerical simulation at scales relevant for atmospheric modeling is not currently practical because of the excessive computational cost such a simulation would require. Numerical weather models have limited forecast skill at spatial resolutions under 1 kilometer (0.6 mi), forcing complex wildfire models to parameterize the fire in order to calculate how

A model is either global, covering the entire Earth, or regional, covering only part of the Earth. Regional models (also known as limited-area models, or LAMs) allow for the use of finer grid spacing than global models because the available computational resources are focused on a specific area instead of being spread over the globe. This allows regional models to resolve explicitly smaller-scale meteorological phenomena that cannot be represented on

A regional forecast model for the effects of air pollution and acid rain, was developed by a private company in the US in 1970. Development of this model was taken over by the Environmental Protection Agency and improved in the mid to late 1970s using results from a regional air pollution study. While developed in California, this model was later used in other areas of North America, Europe and Asia during

A relatively constricted area, such as wildfires. Manipulating the vast datasets and performing the complex calculations necessary to modern numerical weather prediction requires some of the most powerful supercomputers in the world. Even with the increasing power of supercomputers, the forecast skill of numerical weather models extends to only about six days. Factors affecting the accuracy of numerical predictions include

A simplified form of atmospheric dynamics based on solving the barotropic vorticity equation over a single layer of the atmosphere, by computing the geopotential height of the atmosphere's 500 millibars (15 inHg) pressure surface. This simplification greatly reduced demands on computer time and memory, so the computations could be performed on the relatively primitive computers of the day. When news of
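
For orientation, the barotropic vorticity equation conserves absolute vorticity following the flow: ∂ζ/∂t = −J(ψ, ζ + f), where ζ = ∇²ψ is relative vorticity, ψ the streamfunction, and f the Coriolis parameter. The sketch below is a toy, doubly periodic discretization (not the 1950 ENIAC configuration), assuming uniform grid spacing and a forward-Euler step:

```python
import numpy as np

def step_barotropic(zeta, f, dx, dt):
    """One forward-Euler step of the barotropic vorticity equation on a
    doubly periodic grid. Axis 0 is y, axis 1 is x. Illustrative only."""
    n = zeta.shape[0]
    k = 2 * np.pi * np.fft.fftfreq(n, d=dx)
    kx, ky = np.meshgrid(k, k)
    k2 = kx**2 + ky**2
    k2[0, 0] = 1.0                               # avoid dividing the mean mode by zero
    psi = np.real(np.fft.ifft2(-np.fft.fft2(zeta) / k2))  # invert zeta = laplacian(psi)
    u = -(np.roll(psi, -1, 0) - np.roll(psi, 1, 0)) / (2 * dx)   # u = -dpsi/dy
    v = (np.roll(psi, -1, 1) - np.roll(psi, 1, 1)) / (2 * dx)    # v =  dpsi/dx
    q = zeta + f                                 # absolute vorticity
    dqdx = (np.roll(q, -1, 1) - np.roll(q, 1, 1)) / (2 * dx)
    dqdy = (np.roll(q, -1, 0) - np.roll(q, 1, 0)) / (2 * dx)
    return zeta - dt * (u * dqdx + v * dqdy)     # advect absolute vorticity
```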

A single forecast run due to inherent uncertainty, and proposed a stochastic dynamic model that produced means and variances for the state of the atmosphere. While these Monte Carlo simulations showed skill, in 1974 Cecil Leith revealed that they produced adequate forecasts only when the ensemble probability distribution was a representative sample of the probability distribution in

A single model-based approach, the ensemble forecast is usually evaluated in terms of an average of the individual forecasts concerning one forecast variable, as well as the degree of agreement between various forecasts within the ensemble system, as represented by their overall spread. Ensemble spread is diagnosed through tools such as spaghetti diagrams, which show the dispersion of one quantity on prognostic charts for specific time steps in
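
The two diagnostics named here, the ensemble mean and the spread, reduce to simple statistics across members. A minimal sketch with synthetic numbers (purely illustrative):

```python
import numpy as np

# Hypothetical 20-member ensemble of 500 hPa geopotential height (m) at one
# grid point and one lead time; real ensembles are full model fields.
rng = np.random.default_rng(0)
members = 5640.0 + rng.normal(scale=25.0, size=20)

ensemble_mean = members.mean()          # the "average of the individual forecasts"
ensemble_spread = members.std(ddof=1)   # standard deviation across members

print(f"mean = {ensemble_mean:.1f} m, spread = {ensemble_spread:.1f} m")
# Small spread: members agree, higher confidence in the mean forecast.
# Large spread: members diverge, as in a fanned-out spaghetti diagram.
```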

Is a stub. You can help Wikipedia by expanding it. Numerical weather prediction: Mathematical models based on the same physical principles can be used to generate either short-term weather forecasts or longer-term climate predictions; the latter are widely applied for understanding and projecting climate change. The improvements made to regional models have allowed significant improvements in tropical cyclone track and air quality forecasts; however, atmospheric models perform poorly at handling processes that occur in

Is a computer program that produces meteorological information for future times at given locations and altitudes. Within any modern model is a set of equations, known as the primitive equations, used to predict the future state of the atmosphere. These equations—along with the ideal gas law—are used to evolve the density, pressure, and potential temperature scalar fields and the flow velocity vector field of

Is a process known as superensemble forecasting. This type of forecast significantly reduces errors in model output. Air quality forecasting attempts to predict when the concentrations of pollutants will attain levels that are hazardous to public health. The concentration of pollutants in the atmosphere is determined by their transport, or mean velocity of movement through the atmosphere, their diffusion, chemical transformation, and ground deposition. In addition to pollutant source and terrain information, these models require data about

Is impossible to solve these equations exactly, and small errors grow with time (doubling about every five days). Present understanding is that this chaotic behavior limits accurate forecasts to about 14 days even with accurate input data and a flawless model. In addition, the partial differential equations used in the model need to be supplemented with parameterizations for solar radiation, moist processes (clouds and precipitation), heat exchange, soil, vegetation, surface water, and

Is known as post-processing. Forecast parameters within MOS include maximum and minimum temperatures, percentage chance of rain within a several-hour period, precipitation amount expected, chance that the precipitation will be frozen in nature, chance for thunderstorms, cloudiness, and surface winds. In 1963, Edward Lorenz discovered the chaotic nature of the fluid dynamics equations involved in weather forecasting. Extremely small errors in temperature, winds, or other initial inputs given to numerical models will amplify and double every five days, making it impossible for long-range forecasts—those made more than two weeks in advance—to predict
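
As a back-of-the-envelope illustration of that doubling rate (a sketch, not a statement about any particular model), an initial error grows by a factor of 2^(t/5) after t days:

```python
# Error growth with a five-day doubling time: e(t) = e0 * 2**(t / 5)
e0 = 0.1   # hypothetical initial temperature error, degrees C
for days in (5, 10, 14, 20):
    print(f"{days:2d} days: {e0 * 2 ** (days / 5):.2f} deg C")
# By ~14 days the error has grown roughly sevenfold, one way of seeing why
# two weeks is commonly quoted as a practical predictability limit.
```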

Is quite extensive and dates back to the 1930s and earlier. One of the early air pollutant plume dispersion equations was derived by Bosanquet and Pearson. Their equation did not assume Gaussian distribution nor did it include the effect of ground reflection of the pollutant plume. Sir Graham Sutton derived an air pollutant plume dispersion equation in 1947 which did include the assumption of Gaussian distribution for

Is small and the forecast solutions are consistent within multiple model runs, forecasters perceive more confidence in the ensemble mean, and the forecast in general. Despite this perception, a spread-skill relationship is often weak or not found, as spread-error correlations are normally less than 0.6, and only under special circumstances range between 0.6–0.7. The relationship between ensemble spread and forecast skill varies substantially depending on such factors as

Is then used as the starting point for another application of the predictive equations to find new rates of change, and these new rates of change predict the atmosphere at a yet further time step into the future. This time stepping is repeated until the solution reaches the desired forecast time. The length of the time step chosen within the model is related to the distance between the points on
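
In skeletal form the time-stepping loop looks like the sketch below. The tendency function here is a made-up stand-in; in a real model it evaluates the discretized primitive equations:

```python
import numpy as np

def tendency(state):
    """Placeholder for model dynamics/physics: returns d(state)/dt.
    Hypothetical linear damping, purely for illustration."""
    return -1.0e-4 * state

state = np.array([1.0, 2.0, 3.0])   # hypothetical initial analysis
dt = 600.0                          # 10-minute time step, in seconds
t, t_end = 0.0, 6 * 3600.0          # integrate out to a 6-hour forecast

while t < t_end:
    state = state + dt * tendency(state)   # forward-Euler update
    t += dt
# 'state' now approximates the forecast 6 hours ahead; dt must stay small
# enough relative to the grid spacing to keep the integration stable.
```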

3294-569: Is used to forecast the track and intensity of tropical cyclones . The model was developed by the National Oceanic and Atmospheric Administration (NOAA), the U.S. Naval Research Laboratory , the University of Rhode Island , and Florida State University . It became operational in 2007. Despite improvements in track forecasting, predictions of the intensity of a tropical cyclone based on numerical weather prediction continue to be

The European Centre for Medium-Range Weather Forecasts (ECMWF) and the National Centers for Environmental Prediction, model ensemble forecasts have been used to help define the forecast uncertainty and to extend the window in which numerical weather forecasting is viable farther into the future than otherwise possible. The ECMWF model, the Ensemble Prediction System, uses singular vectors to simulate

The Global Forecast System (GFS) in 2001. However, though the NGM ceased widespread use in the early 2000s due to the GFS and improvements in the Eta model (later the North American Mesoscale Model), and the NGM's short-range LAMP products were phased out in 2006, NGM MOS products continued to be in significant general use (alongside the Eta/NAM and GFS) until March 3, 2009, when the NGM MOS products were discontinued. This meteorology-related article

The LEO computer developed by J. Lyons and Co. Following these initial experiments, work moved to the Ferranti Mark 1 computer at the Manchester University Department of Electrical Engineering and in 1959 a Ferranti Mercury computer, known as 'Meteor', was installed at the Met Office. In September 1954, Carl-Gustav Rossby assembled an international group of meteorologists in Stockholm and produced

The National Weather Service for their suite of weather forecasting models by 1976. The United States Air Force developed its own set of MOS based upon their dynamical weather model by 1983. As proposed by Edward Lorenz in 1963, it is impossible for long-range forecasts—those made more than two weeks in advance—to predict the state of the atmosphere with any degree of skill, owing to

The National Weather Service for their suite of weather forecasting models in the late 1960s. Model output statistics differ from the perfect prog technique, which assumes that the output of numerical weather prediction guidance is perfect. MOS can correct for local effects that cannot be resolved by the model due to insufficient grid resolution, as well as model biases. Because MOS is run after its respective global or regional model, its production

The Rapid Refresh (which replaced the RUC in 2012) for short-range and high-resolution applications; both the Rapid Refresh and NAM are built on the same framework, the WRF. Météo-France has been running their Aire Limitée Adaptation dynamique Développement InterNational (ALADIN) mesoscale model for France, based upon the ECMWF global model, since 1995. In July 1996, the Bureau of Meteorology implemented

The Weather Research and Forecasting model tend to use normalized pressure coordinates referred to as sigma coordinates. This coordinate system receives its name from the independent variable σ used to scale atmospheric pressures with respect to the pressure at the surface, and in some cases also with the pressure at the top of the domain. Because forecast models based upon
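
A common definition (one variant; terrain-following and hybrid coordinates differ in detail) is σ = (p − p_top) / (p_s − p_top), so that σ = 1 at the surface and σ = 0 at the model top. A small sketch with hypothetical values:

```python
def sigma_to_pressure(sigma, p_surface, p_top=0.0):
    """Pressure (Pa) of a sigma level in one column, assuming
    sigma = (p - p_top) / (p_surface - p_top)."""
    return p_top + sigma * (p_surface - p_top)

levels = [1.0, 0.9, 0.7, 0.5, 0.2]
# Over elevated terrain (surface pressure 95,000 Pa) ...
print([sigma_to_pressure(s, 95_000.0) for s in levels])
# ... and over a sea-level column (101,325 Pa): the same sigma levels
# follow the terrain instead of intersecting it.
print([sigma_to_pressure(s, 101_325.0) for s in levels])
```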

the chaotic nature of the fluid dynamics equations involved. Extremely small errors in temperature, winds, or other initial inputs given to numerical models will amplify and double every five days. Furthermore, existing observation networks have limited spatial and temporal resolution (for example, over large bodies of water such as the Pacific Ocean), which introduces uncertainty into the true initial state of

the ideal gas law could be used to estimate the state of the atmosphere in the future through numerical methods. With the exception of the second law of thermodynamics, these equations form the basis of the primitive equations used in present-day weather models. In 1922, Lewis Fry Richardson published the first attempt at forecasting the weather numerically. Using a hydrostatic variation of Bjerknes's primitive equations, Richardson produced by hand

the partial differential equations used to calculate the forecast—introduce errors which double every five days. The use of model ensemble forecasts since the 1990s helps to define the forecast uncertainty and extend weather forecasting farther into the future than otherwise possible. Until the end of the 19th century, weather prediction was entirely subjective and based on empirical rules, with only limited understanding of

the 1920s through the efforts of Lewis Fry Richardson, who used procedures originally developed by Vilhelm Bjerknes to produce by hand a six-hour forecast for the state of the atmosphere over two points in central Europe, taking at least six weeks to do so. It was not until the advent of the computer and computer simulations that computation time was reduced to less than the forecast period itself. The ENIAC

the 1920s, it was not until the advent of the computer and computer simulation that computation time was reduced to less than the forecast period itself. ENIAC was used to create the first forecasts via computer in 1950, and over the years more powerful computers have been used to increase the size of initial datasets and use more complicated versions of the equations of motion. The development of global forecasting models led to

the 1980s. The Community Multiscale Air Quality model (CMAQ) is an open source air quality model run within the United States in conjunction with the NAM mesoscale model since 2004. The first operational air quality model in Canada, the Canadian Hemispheric and Regional Ozone and NOx System (CHRONOS), began to be run in 2001. It was replaced with the Global Environmental Multiscale model – Modelling Air quality and Chemistry (GEM-MACH) model in November 2009. During 1972,

the 1990s, model ensemble forecasts have been used to help define the forecast uncertainty and to extend the window in which numerical weather forecasting is viable farther into the future than otherwise possible. The atmosphere is a fluid. As such, the idea of numerical weather prediction is to sample the state of the fluid at a given time and use the equations of fluid dynamics and thermodynamics to estimate

the AVN model was extended to the end of the forecast period, eliminating the need for the MRF and thereby replacing it. In late 2002, the AVN model was renamed the Global Forecast System (GFS). The German Weather Service has been running their global hydrostatic model, the GME, using a hexagonal icosahedral grid since 2002. The GFS is slated to eventually be supplanted by the Flow-following, finite-volume Icosahedral Model (FIM), which like

the Center for Ocean-Land Atmosphere Studies (COLA) model showed a warm temperature bias of 2–4 °C (4–7 °F) and a low precipitation bias due to incorrect parameterization of crop and vegetation type across the central United States. Coupled ocean-atmosphere climate models such as the Hadley Centre for Climate Prediction and Research's HadCM3 model are currently being used as inputs for climate change studies. The importance of gravity waves

the ClimX system was presented in a conference paper. The enhanced Fu-Xi system, along with its base version, was documented in both a preprint and a journal article. In the third study, Bach et al. (2024) utilized a hybrid dynamical and data-driven approach to show potential improvements in subseasonal monsoon prediction. Their findings indicate a correlation above 0.5 over a 46-day period in two predictions. The horizontal domain of

the GME is gridded on a truncated icosahedron, in the mid-2010s. In 1956, Norman A. Phillips developed a mathematical model which could realistically depict monthly and seasonal patterns in the troposphere, which became the first successful climate model. Following Phillips's work, several groups began working to create general circulation models. The first general circulation climate model that combined both oceanic and atmospheric processes

the Limited Area Prediction System (LAPS). The Canadian Regional Finite-Elements model (RFE) went into operational use on April 22, 1986. It was followed by the Canadian Global Environmental Multiscale Model (GEM) mesoscale model on February 24, 1997. The German Weather Service developed the High Resolution Regional Model (HRM) in 1999, which is widely run within the operational and research meteorological communities and run with hydrostatic assumptions. The Antarctic Mesoscale Prediction System (AMPS)

the Pacific. An atmospheric model is a computer program that produces meteorological information for future times at given locations and altitudes. Within any modern model is a set of equations, known as the primitive equations, used to predict the future state of the atmosphere. These equations—along with the ideal gas law—are used to evolve the density, pressure, and potential temperature scalar fields and

the Southern Hemisphere were also based on the single-layer barotropic model. Later models used more complete equations for atmospheric dynamics and thermodynamics. In 1959, Karl-Heinz Hinkelmann produced the first reasonable primitive equation forecast, 37 years after Richardson's failed attempt. Hinkelmann did so by removing small oscillations from the numerical model during initialization. In 1966, West Germany and

the UK Unified Model) can be configured for both short-term weather forecasts and longer-term climate predictions. Along with sea ice and land-surface components, AGCMs and oceanic GCMs (OGCM) are key components of global climate models, and are widely applied for understanding the climate and projecting climate change. For aspects of climate change, a range of man-made chemical emission scenarios can be fed into

the United Kingdom, the Meteorological Office's first numerical weather prediction was completed by F. H. Bushby and Mavis Hinds in 1952 under the guidance of John Sawyer. These experimental forecasts were generated using a 12 × 8 grid with a grid spacing of 260 km, a one-hour time step, and required four hours of computing time for a 24-hour forecast on the EDSAC computer at the University of Cambridge and

the United States began in 1955 under the Joint Numerical Weather Prediction Unit (JNWPU), a joint project by the U.S. Air Force, Navy and Weather Bureau. In 1956, Norman Phillips developed a mathematical model which could realistically depict monthly and seasonal patterns in the troposphere; this became the first successful climate model. Following Phillips' work, several groups began working to create general circulation models. The first general circulation climate model that combined both oceanic and atmospheric processes

the United States began producing operational forecasts based on primitive-equation models, followed by the United Kingdom in 1972 and Australia in 1977. Later additions to primitive equation models allowed additional insight into different weather phenomena. In the United States, solar radiation effects were added to the primitive equation model in 1967; moisture effects and latent heat were added in 1968; and feedback effects from rain on convection were incorporated in 1971. Three years later,

the United States, the first operational regional model, the limited-area fine-mesh (LFM) model, was introduced in 1971. Its development was halted, or frozen, in 1986. The NGM debuted in 1987 and was also used to create model output statistics for the United States. Its development was frozen in 1991. The ETA model was implemented for the United States in 1993 and in turn was upgraded to the NAM in 2006. The U.S. also offers

the air velocity (wind) vector field of the atmosphere through time. Additional transport equations for pollutants and other aerosols are included in some primitive-equation high-resolution models as well. The equations used are nonlinear partial differential equations which are impossible to solve exactly through analytical methods, with the exception of a few idealized cases. Therefore, numerical methods obtain approximate solutions. Different models use different solution methods: some global models and almost all regional models use finite difference methods for all three spatial dimensions, while other global models and
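
The finite-difference versus spectral distinction can be made concrete with a small sketch (illustrative only) that differentiates a periodic field both ways and compares against the exact analytic derivative:

```python
import numpy as np

n = 64
x = np.linspace(0, 2 * np.pi, n, endpoint=False)
f = np.sin(3 * x)            # sample field on a periodic grid
exact = 3 * np.cos(3 * x)    # analytic derivative, for comparison
dx = x[1] - x[0]

# Finite differences: local second-order centered stencil
fd = (np.roll(f, -1) - np.roll(f, 1)) / (2 * dx)

# Spectral method: global, multiply by i*k in Fourier space
k = 2 * np.pi * np.fft.fftfreq(n, d=dx)
sp = np.real(np.fft.ifft(1j * k * np.fft.fft(f)))

print("max finite-difference error:", np.abs(fd - exact).max())  # stencil truncation error
print("max spectral error:        ", np.abs(sp - exact).max())   # near machine precision
```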

the air dispersion models developed between the late 1960s and the early 2000s used what are known as "the Briggs equations." G. A. Briggs first published his plume rise observations and comparisons in 1965. In 1968, at a symposium sponsored by Conservation of Clean Air and Water in Europe, he compared many of the plume rise models then available in the literature. In that same year, Briggs also wrote

the atmosphere can have a significant impact on the behavior and growth of a wildfire. Since the wildfire acts as a heat source to the atmospheric flow, the wildfire can modify local advection patterns, introducing a feedback loop between the fire and the atmosphere. A simplified two-dimensional model for the spread of wildfires that used convection to represent the effects of wind and terrain, as well as radiative heat transfer as

the atmosphere in the Northern Hemisphere. In 1956, the JNWPU switched to a two-layer thermotropic model developed by Thompson and Gates. The main assumption made by the thermotropic model is that while the magnitude of the thermal wind may change, its direction does not change with respect to height, and thus the baroclinicity in the atmosphere can be simulated using the 500-and-1,000 mb (15-and-30 inHg) geopotential height surfaces and

the atmosphere. In 1966, West Germany and the United States began producing operational forecasts based on primitive-equation models, followed by the United Kingdom in 1972 and Australia in 1977. The development of limited area (regional) models facilitated advances in forecasting the tracks of tropical cyclones as well as air quality in the 1970s and 1980s. By the early 1980s models began to include

the atmosphere. It was not until 1992 that ensemble forecasts began being prepared by the European Centre for Medium-Range Weather Forecasts, the Canadian Meteorological Centre, and the National Centers for Environmental Prediction. The ECMWF model, the Ensemble Prediction System, uses singular vectors to simulate the initial probability density, while the NCEP ensemble, the Global Ensemble Forecasting System, uses

the atmosphere. While a set of equations, known as the Liouville equations, exists to determine the initial uncertainty in the model initialization, the equations are too complex to run in real-time, even with the use of supercomputers. These uncertainties limit forecast model accuracy to about six days into the future. Edward Epstein recognized in 1969 that the atmosphere could not be completely described with

the average thermal wind between them. However, due to the low skill shown by the thermotropic model, the JNWPU reverted to the single-layer barotropic model in 1958. The Japan Meteorological Agency became the third organization to initiate operational numerical weather prediction in 1959. The first real-time forecasts made by Australia's Bureau of Meteorology in 1969 for portions of

the box might convect and that entrainment and other processes occur. Weather models that have gridboxes with sizes between 5 and 25 kilometers (3 and 16 mi) can explicitly represent convective clouds, although they need to parameterize cloud microphysics, which occurs at a smaller scale. The formation of large-scale (stratus-type) clouds is more physically based; they form when the relative humidity reaches some prescribed value. The cloud fraction can be related to this critical value of relative humidity. The amount of solar radiation reaching
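
One simple diagnostic of that kind (a hypothetical linear scheme, purely illustrative; operational parameterizations differ) maps relative humidity above a critical threshold onto a cloud fraction between 0 and 1:

```python
def cloud_fraction(rh, rh_crit=0.75):
    """Diagnostic stratiform cloud fraction from gridbox relative humidity (0-1).
    Clear below rh_crit; fraction rises linearly to 1 at saturation."""
    if rh <= rh_crit:
        return 0.0
    return min((rh - rh_crit) / (1.0 - rh_crit), 1.0)

print(cloud_fraction(0.70))   # 0.0  (subsaturated gridbox, no stratus)
print(cloud_fraction(0.90))   # 0.6  (partial cover)
print(cloud_fraction(1.00))   # 1.0  (saturated, overcast)
```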

the climate models to see how an enhanced greenhouse effect would modify the Earth's climate. Versions designed for climate applications with time scales of decades to centuries were originally created in 1969 by Syukuro Manabe and Kirk Bryan at the Geophysical Fluid Dynamics Laboratory in Princeton, New Jersey. When run for multiple decades, computational limitations mean that the models must use

the coarser grid of a global model. Regional models use a global model to specify conditions at the edge of their domain (boundary conditions) in order to allow systems from outside the regional model domain to move into its area. Uncertainty and errors within regional models are introduced by the global model used for the boundary conditions of the edge of the regional model, as well as errors attributable to

the coarser grid of a global model. Regional models use a global model for initial conditions of the edge of their domain in order to allow systems from outside the regional model domain to move into its area. Uncertainty and errors within regional models are introduced by the global model used for the boundary conditions of the edge of the regional model, as well as errors attributable to the regional model itself. In

the combustion reaction rates themselves. History of numerical weather prediction: The history of numerical weather prediction considers how the use of current weather conditions as input into mathematical models of the atmosphere and oceans to predict the weather and future sea state (the process of numerical weather prediction) has changed over the years. Though first attempted manually in

the computational grid, and is chosen to maintain numerical stability. Time steps for global models are on the order of tens of minutes, while time steps for regional models are between one and four minutes. The global models are run at varying times into the future. The UKMET Unified Model is run six days into the future, while the European Centre for Medium-Range Weather Forecasts' Integrated Forecast System and Environment Canada's Global Environmental Multiscale Model both run out to ten days into

the density and quality of observations used as input to the forecasts, along with deficiencies in the numerical models themselves. Post-processing techniques such as model output statistics (MOS) have been developed to improve the handling of errors in numerical predictions. A more fundamental problem lies in the chaotic nature of the partial differential equations that describe the atmosphere. It

the development of a third generation of wave models from 1988 onward. Within this third generation of models, the spectral wave transport equation is used to describe the change in wave spectrum over changing topography. It simulates wave generation, wave movement (propagation within a fluid), wave shoaling, refraction, energy transfer between waves, and wave dissipation. Since surface winds are
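
A common schematic form of this transport balance (one standard way of writing it; model details vary) evolves the wave energy spectrum E(f, θ) by advection at the group velocity plus source terms:

```latex
\frac{\partial E(f,\theta)}{\partial t}
  + \nabla \cdot \left[\, \mathbf{c}_g \, E(f,\theta) \,\right]
  = S_{\mathrm{in}} + S_{\mathrm{nl}} + S_{\mathrm{ds}}
```

Here c_g is the group velocity, and the right-hand side collects wind input (S_in), nonlinear wave–wave interactions (S_nl), and dissipation such as whitecapping (S_ds).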

the dominant method of heat transport led to reaction–diffusion systems of partial differential equations. More complex models join numerical weather models or computational fluid dynamics models with a wildfire component which allow the feedback effects between the fire and the atmosphere to be estimated. The additional complexity in the latter class of models translates to a corresponding increase in their computer power requirements. In fact,
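
As a cartoon of the reaction–diffusion idea (a toy one-dimensional sketch with made-up constants, not any published fire model), heat diffuses along a fuel bed while a reaction term releases heat wherever hot fuel remains:

```python
import numpy as np

# Toy 1-D reaction-diffusion spread: dT/dt = D * d2T/dx2 + heating - cooling
n, dx, dt = 200, 1.0, 0.1
D, T_ign, burn_rate = 1.0, 1.0, 0.5
T = np.zeros(n)
T[:5] = 5.0             # ignite the left edge
fuel = np.ones(n)       # fuel fraction remaining

for _ in range(1000):
    lap = (np.roll(T, 1) - 2 * T + np.roll(T, -1)) / dx**2    # heat diffusion
    burn = np.where((T > T_ign) & (fuel > 0), burn_rate * fuel, 0.0)
    T += dt * (D * lap + 10.0 * burn - 0.05 * T)              # heating minus ambient cooling
    fuel -= dt * burn                                         # fuel consumption
# The heated front advances at a roughly constant speed, the hallmark
# behavior of reaction-diffusion fire-spread models.
```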

the earliest models, if a column of air within a model gridbox was conditionally unstable (essentially, the bottom was warmer and moister than the top) and the water vapor content at any point within the column became saturated, then it would be overturned (the warm, moist air would begin rising), and the air in that vertical column mixed. More sophisticated schemes recognize that only some portions of
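
That early 'overturn and mix' rule is essentially a convective adjustment. A minimal dry version (illustrative; real moist convection schemes are far more involved) mixes adjacent layers until potential temperature no longer decreases with height, conserving the column mean:

```python
import numpy as np

def dry_convective_adjustment(theta):
    """Pool-adjacent-violators style mixing: merge and average neighboring
    layers until potential temperature is non-decreasing with height.
    Index 0 is the lowest layer. Illustrative sketch only."""
    pools = [[t, 1] for t in theta]            # [mean value, layer count] per pool
    i = 0
    while i < len(pools) - 1:
        if pools[i + 1][0] < pools[i][0]:      # unstable: layer above is colder
            m0, c0 = pools[i]
            m1, c1 = pools[i + 1]
            pools[i] = [(m0 * c0 + m1 * c1) / (c0 + c1), c0 + c1]
            del pools[i + 1]
            i = max(i - 1, 0)                  # re-check against the pool below
        else:
            i += 1
    return np.array([m for m, c in pools for _ in range(c)])

column = np.array([300.0, 297.0, 298.0, 301.0])   # hypothetical unstable profile (K)
print(dry_convective_adjustment(column))          # approx. [298.33 298.33 298.33 301.]
```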

the effects of terrain. In an effort to quantify the large amount of inherent uncertainty remaining in numerical predictions, ensemble forecasts have been used since the 1990s to help gauge the confidence in the forecast, and to obtain useful results farther into the future than otherwise possible. This approach analyzes multiple forecasts created with an individual forecast model or multiple models. The history of numerical weather prediction began in

the equations are too complex to run in real-time, even with the use of supercomputers. These uncertainties limit forecast model accuracy to about five or six days into the future. Edward Epstein recognized in 1969 that the atmosphere could not be completely described with a single forecast run due to inherent uncertainty, and proposed using an ensemble of stochastic Monte Carlo simulations to produce means and variances for

the equations for atmospheric dynamics do not perfectly determine weather conditions, statistical methods have been developed to attempt to correct the forecasts. Statistical models were created based upon the three-dimensional fields produced by numerical weather models, surface observations and the climatological conditions for specific locations. These statistical models are collectively referred to as model output statistics (MOS), and were developed by

the field of tropical cyclone track forecasting, despite the ever-improving dynamical model guidance which occurred with increased computational power, it was not until the 1980s when numerical weather prediction showed skill, and until the 1990s when it consistently outperformed statistical or simple dynamical models. Predictions of the intensity of a tropical cyclone based on numerical weather prediction continue to be

the first climate models. The development of limited area (regional) models facilitated advances in forecasting the tracks of tropical cyclones as well as air quality in the 1970s and 1980s. Because the output of forecast models based on atmospheric dynamics requires corrections near ground level, model output statistics (MOS) were developed in the 1970s and 1980s for individual forecast points (locations). MOS applies statistical techniques to post-process
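
At its simplest, this post-processing is a regression fitted separately for each station. The numbers below are hypothetical, and operational MOS uses many predictors and long training records; this is just a sketch of the idea:

```python
import numpy as np

# Hypothetical training pairs for one station: raw model 2 m temperature
# (deg C) versus what was actually observed at that point.
model_t = np.array([12.1, 15.3, 9.8, 20.4, 17.6, 7.2])
obs_t = np.array([13.0, 16.5, 10.9, 21.1, 18.9, 8.5])

# Fit observed = a * model + b by least squares: the station's MOS equation.
a, b = np.polyfit(model_t, obs_t, deg=1)

def mos_corrected(raw_forecast):
    """Apply the learned local correction to a new raw model forecast."""
    return a * raw_forecast + b

print(round(mos_corrected(14.0), 1))   # bias-corrected forecast for this station
```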

the first global forecast model was introduced. Sea ice began to be initialized in forecast models in 1971. Efforts to involve sea surface temperature in model initialization began in 1972 due to its role in modulating weather in higher latitudes of the Pacific. A global forecast model is a weather forecasting model which initializes and forecasts the weather throughout the Earth's troposphere. It

the first model to forecast storm surge along the continental shelf was developed, known as the Special Program to List the Amplitude of Surges from Hurricanes (SPLASH). In 1978, the first hurricane-tracking model based on atmospheric dynamics – the movable fine-mesh (MFM) model – began operating. Within the field of tropical cyclone track forecasting, despite the ever-improving dynamical model guidance which occurred with increased computational power, it

the first operational forecast (i.e. routine predictions for practical use) based on the barotropic equation. Operational numerical weather prediction in the United States began in 1955 under the Joint Numerical Weather Prediction Unit (JNWPU), a joint project by the U.S. Air Force, Navy, and Weather Bureau. The JNWPU model was originally a three-layer barotropic model, also developed by Charney. It only modeled

the first weather forecast by ENIAC was received by Richardson in 1950, he remarked that the results were an "enormous scientific advance." The first calculations for a 24-hour forecast took ENIAC nearly 24 hours to produce, but Charney's group noted that most of that time was spent in "manual operations", and expressed hope that forecasts of the weather before it occurs would soon be realized. In

the forecast model and the region for which the forecast is made. In the same way that many forecasts from a single model can be used to form an ensemble, multiple models may also be combined to produce an ensemble forecast. This approach is called multi-model ensemble forecasting, and it has been shown to improve forecasts when compared to a single model-based approach. Models within a multi-model ensemble can be adjusted for their various biases, which

the future, and the Global Forecast System model run by the Environmental Modeling Center is run sixteen days into the future. The visual output produced by a model solution is known as a prognostic chart, or prog. Some meteorological processes are too small-scale or too complex to be explicitly included in numerical weather prediction models. Parameterization is a procedure for representing these processes by relating them to variables on

the future. Another tool where ensemble spread is used is a meteogram, which shows the dispersion in the forecast of one quantity for one specific location. It is common for the ensemble spread to be too small to include the weather that actually occurs, which can lead to forecasters misdiagnosing model uncertainty; this problem becomes particularly severe for forecasts of the weather about ten days in advance. When ensemble spread

the governing equations of fluid flow in the atmosphere; they are based on the same principles as other limited-area numerical weather prediction models but may include special computational techniques such as refined spatial domains that move along with the cyclone. Models that use elements of both approaches are called statistical-dynamical models. In 1978, the first hurricane-tracking model based on atmospheric dynamics—the movable fine-mesh (MFM) model—began operating. Within

the ground, as well as the formation of cloud droplets occur on the molecular scale, and so they must be parameterized before they can be included in the model. Atmospheric drag produced by mountains must also be parameterized, as the limitations in the resolution of elevation contours produce significant underestimates of the drag. This method of parameterization is also done for the surface flux of energy between

the initial probability density, while the NCEP ensemble, the Global Ensemble Forecasting System, uses a technique known as vector breeding. The UK Met Office runs global and regional ensemble forecasts where perturbations to initial conditions are used by 24 ensemble members in the Met Office Global and Regional Ensemble Prediction System (MOGREPS) to produce 24 different forecasts. In

the interactions of soil and vegetation with the atmosphere, which led to more realistic forecasts. The output of forecast models based on atmospheric dynamics is unable to resolve some details of the weather near the Earth's surface. As such, a statistical relationship between the output of a numerical weather model and the ensuing conditions at the ground was developed in the 1970s and 1980s, known as model output statistics (MOS). Starting in

the model's mathematical algorithms. The data are then used in the model as the starting point for a forecast. A variety of methods are used to gather observational data for use in numerical models. Sites launch radiosondes in weather balloons which rise through the troposphere and well into the stratosphere. Information from weather satellites is used where traditional data sources are not available. Commerce provides pilot reports along aircraft routes and ship reports along shipping routes. Research projects use reconnaissance aircraft to fly in and around weather systems of interest, such as tropical cyclones. Reconnaissance aircraft are also flown over

the ocean and the atmosphere, in order to determine realistic sea surface temperatures and type of sea ice found near the ocean's surface. Sun angle as well as the impact of multiple cloud layers is taken into account. Soil type, vegetation type, and soil moisture all determine how much radiation goes into warming and how much moisture is drawn up into the adjacent atmosphere, and thus it is important to parameterize their contribution to these processes. Within air quality models, parameterizations take into account atmospheric emissions from multiple relatively tiny sources (e.g. roads, fields, factories) within specific grid boxes. The horizontal domain of

the open oceans during the cold season into systems which cause significant uncertainty in forecast guidance, or are expected to be of high impact from three to seven days into the future over the downstream continent. Sea ice began to be initialized in forecast models in 1971. Efforts to involve sea surface temperature in model initialization began in 1972 due to its role in modulating weather in higher latitudes of

the output of dynamical models with the most recent surface observations and the forecast point's climatology. This technique can correct for model resolution as well as model biases. Even with the increasing power of supercomputers, the forecast skill of numerical weather models only extends to about two weeks into the future, since the density and quality of observations—together with the chaotic nature of

the performance of the models. After experiments were performed in 1968, 1969, and 1973, wind input from the Earth's atmosphere was weighted more accurately in the predictions. A second generation of models was developed in the 1980s, but they could not realistically model swell nor depict wind-driven waves (also known as wind waves) caused by rapidly changing wind fields, such as those within tropical cyclones. This caused

the physical mechanisms behind weather processes. In 1901 Cleveland Abbe, founder of the United States Weather Bureau, proposed that the atmosphere is governed by the same principles of thermodynamics and hydrodynamics that were studied in the previous century. In 1904, Vilhelm Bjerknes derived a two-step procedure for model-based weather forecasting. First, a diagnostic step is used to process data to generate initial conditions, which are then advanced in time by

the primary forcing mechanism in the spectral wave transport equation, ocean wave models use information produced by numerical weather prediction models as inputs to determine how much energy is transferred from the atmosphere into the layer at the surface of the ocean. Along with dissipation of energy through whitecaps and resonance between waves, surface winds from numerical weather models allow for more accurate predictions of

the primitive equations. This correlation between coordinate systems can be made since pressure decreases with height through the Earth's atmosphere. The first model used for operational forecasts, the single-layer barotropic model, used a single pressure coordinate at the 500-millibar (about 5,500 m (18,000 ft)) level, and thus was essentially two-dimensional. High-resolution models—also called mesoscale models—such as

the quality of numerical weather guidance is the main uncertainty in air quality forecasts. A General Circulation Model (GCM) is a mathematical model that can be used in computer simulations of the global circulation of a planetary atmosphere or ocean. An atmospheric general circulation model (AGCM) is essentially the same as a global numerical weather prediction model, and some (such as the one used in

the regional model itself. The vertical coordinate is handled in various ways. Lewis Fry Richardson's 1922 model used geometric height (z) as the vertical coordinate. Later models substituted the geometric z coordinate with a pressure coordinate system, in which the geopotential heights of constant-pressure surfaces become dependent variables, greatly simplifying

the review by Shen et al.). As detailed in Table 4 of Shen et al., these AI-driven models were trained with ERA5 reanalysis data and CMIP6 datasets and evaluated using a variety of metrics such as root mean square errors (RMSE), anomaly correlation coefficients (ACC), Continuous Ranked Probability Score (CRPS), Temporal Anomaly Correlation Coefficient (TCC), Ranked Probability Skill Score (RPSS), Brier Skill Score (BSS), and bivariate correlation (COR). By utilizing deep convolutional neural networks (CNNs), Weyn et al. achieved lead times of 14 days. Notably, recent advancements in AI, especially transformer models (e.g., Vaswani et al.) and their derivatives, such as
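
The two most commonly quoted of these metrics, RMSE and ACC, are straightforward to compute. A minimal sketch on synthetic fields (the arrays here are invented for illustration):

```python
import numpy as np

def rmse(forecast, verification):
    """Root mean square error between a forecast and the verifying analysis."""
    return np.sqrt(np.mean((forecast - verification) ** 2))

def acc(forecast, verification, climatology):
    """Anomaly correlation coefficient: correlation of departures from climatology."""
    fa = forecast - climatology
    va = verification - climatology
    return np.sum(fa * va) / np.sqrt(np.sum(fa**2) * np.sum(va**2))

# Synthetic flattened fields of 500 hPa height (m)
clim = np.full(100, 5500.0)
truth = clim + 50.0 * np.sin(np.linspace(0, 6, 100))
fcst = truth + np.random.default_rng(1).normal(scale=10.0, size=100)

print(f"RMSE = {rmse(fcst, truth):.1f} m, ACC = {acc(fcst, truth, clim):.3f}")
```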

the scales that the model resolves. For example, the gridboxes in weather and climate models have sides that are between 5 kilometers (3 mi) and 300 kilometers (200 mi) in length. A typical cumulus cloud has a scale of less than 1 kilometer (0.6 mi), and would require a grid even finer than this to be represented physically by the equations of fluid motion. Therefore, the processes that such clouds represent are parameterized, by processes of various sophistication. In

the section of the publication edited by Slade dealing with the comparative analyses of plume rise models. That was followed in 1969 by his classical critical review of the entire plume rise literature, in which he proposed a set of plume rise equations which have become widely known as "the Briggs equations". Subsequently, Briggs modified his 1969 plume rise equations in 1971 and in 1972. The Urban Airshed Model,

the source of combustion. When moisture is present, or when enough heat is being carried away from the fiber, charring occurs. The chemical kinetics of both reactions indicate that there is a point at which the level of moisture is low enough—and/or heating rates high enough—for combustion processes to become self-sufficient. Consequently, changes in wind speed, direction, moisture, temperature, or lapse rate at different levels of

the state of the fluid flow in the atmosphere to determine its transport and diffusion. Meteorological conditions such as thermal inversions can prevent surface air from rising, trapping pollutants near the surface, which makes accurate forecasts of such events crucial for air quality modeling. Urban air quality models require a very fine computational mesh, requiring the use of high-resolution mesoscale weather models; in spite of this,

the state of the atmosphere with any degree of forecast skill. Furthermore, existing observation networks have poor coverage in some regions (for example, over large bodies of water such as the Pacific Ocean), which introduces uncertainty into the true initial state of the atmosphere. While a set of equations, known as the Liouville equations, exists to determine the initial uncertainty in the model initialization,

the state of the atmosphere. Although this early example of an ensemble showed skill, in 1974 Cecil Leith showed that they produced adequate forecasts only when the ensemble probability distribution was a representative sample of the probability distribution in the atmosphere. Since the 1990s, ensemble forecasts have been used operationally (as routine forecasts) to account for the stochastic nature of weather processes – that is, to resolve their inherent uncertainty. This method involves analyzing multiple forecasts created with an individual forecast model by using different physical parametrizations or varying initial conditions. Starting in 1992 with ensemble forecasts prepared by

the state of the fluid at some time in the future. The process of entering observation data into the model to generate initial conditions is called initialization. On land, terrain maps available at resolutions down to 1 kilometer (0.6 mi) globally are used to help model atmospheric circulations within regions of rugged topography, in order to better depict features such as downslope winds, mountain waves and related cloudiness that affects incoming solar radiation. The main inputs from country-based weather services are observations from devices (called radiosondes) in weather balloons that measure various atmospheric parameters and transmit them to

the state of the sea surface. Because forecast models based upon the equations for atmospheric dynamics do not perfectly determine weather conditions near the ground, statistical corrections were developed to attempt to resolve this problem. Statistical models were created based upon the three-dimensional fields produced by numerical weather models, surface observations, and the climatological conditions for specific locations. These statistical models are collectively referred to as model output statistics (MOS), and were developed by

the state of the sea surface. Tropical cyclone forecasting also relies on data provided by numerical weather models. Three main classes of tropical cyclone guidance models exist: Statistical models are based on an analysis of storm behavior using climatology, and correlate a storm's position and date to produce a forecast that is not based on the physics of the atmosphere at the time. Dynamical models are numerical models that solve

the vertical and crosswind dispersion of the plume and also included the effect of ground reflection of the plume. Under the stimulus provided by the advent of stringent environmental control regulations, there was an immense growth in the use of air pollutant plume dispersion calculations between the late 1960s and today. A great many computer programs for calculating the dispersion of air pollutant emissions were developed during that period of time and they were called "air dispersion models". The basis for most of those models

the winds will be modified locally by the wildfire, and to use those modified winds to determine the rate at which the fire will spread locally. Although models such as Los Alamos' FIRETEC solve for the concentrations of fuel and oxygen, the computational grid cannot be fine enough to resolve the combustion reaction, so approximations must be made for the temperature distribution within each grid cell, as well as for

the "vision transformer" (Dosovitskiy et al. 2020), have created substantial opportunities to lower the cost of weather forecasting and revisit the predictability limits. Among the AI-powered models mentioned, all provided forecasts that were comparable to or slightly better than those from PDE-physics-based systems for short-term forecasts (3–14 days). Three studies have attempted to conduct simulations at subseasonal or larger scales. Of these,

was developed for the southernmost continent in 2000 by the United States Antarctic Program. The German non-hydrostatic Lokal-Modell for Europe (LME) has been run since 2002, and an increase in areal domain became operational on September 28, 2005. The Japan Meteorological Agency has run a high-resolution, non-hydrostatic mesoscale model since September 2004. The technical literature on air pollution dispersion

was developed in the late 1960s at the NOAA Geophysical Fluid Dynamics Laboratory. As computers have become more powerful, the size of the initial data sets has increased and newer atmospheric models have been developed to take advantage of the added available computing power. These newer models include more physical processes in the simplifications of the equations of motion in numerical simulations of

was developed in the late 1960s at the NOAA Geophysical Fluid Dynamics Laboratory. By the early 1980s, the United States' National Center for Atmospheric Research had developed the Community Atmosphere Model; this model has been continuously refined into the 2000s. In 1986, efforts began to initialize and model soil and vegetation types, which led to more realistic forecasts. For example,

was neglected within these models until the mid-1980s. Now, gravity waves are required within global climate models in order to properly simulate regional and global scale circulations, though their broad spectrum makes their incorporation complicated. The Climate System Model (CSM) was developed at the National Center for Atmospheric Research in January 1994. In comparison to traditional physics-based methods, machine learning (ML) or, more broadly, artificial intelligence (AI) approaches have demonstrated potential in enhancing weather forecasts (refer to

was not until the 1980s when numerical weather prediction showed skill, and until the 1990s when it consistently outperformed statistical or simple dynamical models. In the early 1980s, the assimilation of satellite-derived winds from water vapor, infrared, and visible satellite imagery was found to improve tropical cyclone track forecasting. The Geophysical Fluid Dynamics Laboratory (GFDL) hurricane model

was performed using the ENIAC digital computer in 1950 by a team led by American meteorologist Jule Charney. The team included Philip Thompson, Larry Gates, Norwegian meteorologist Ragnar Fjørtoft, applied mathematician John von Neumann, computer programmer Klara Dan von Neumann, M. H. Frankel, Jerome Namias, John C. Freeman Jr., Francis Reichelderfer, George Platzman, and Joseph Smagorinsky. They used

was that it assumed the hydrostatic equation. The NGM debuted in 1987, directly replacing the limited-area fine mesh (LFM) model, which was immediately halted upon the NGM's debut. The NGM was also used to create model output statistics. Development of the model stopped in 1993. By 2000, the model was seen as obsolete, particularly for mesoscale features that were not hydrostatic, and was scheduled to be superseded by

was the Complete Equation For Gaussian Dispersion Modeling Of Continuous, Buoyant Air Pollution Plumes. The Gaussian air pollutant dispersion equation requires the input of H, which is the pollutant plume's centerline height above ground level; H is the sum of Hs (the actual physical height of the pollutant plume's emission source point) plus ΔH (the plume rise due to the plume's buoyancy). To determine ΔH, many if not most of
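
A compact sketch of that dispersion equation (with invented inputs; in practice σ_y and σ_z come from stability-class curves such as the Pasquill-Gifford scheme, and ΔH from the Briggs equations discussed here):

```python
import numpy as np

def gaussian_plume(Q, u, y, z, H, sigma_y, sigma_z):
    """Steady-state Gaussian plume concentration at crosswind distance y and
    height z, including the ground-reflection (image source) term.

    Q : emission rate (g/s)          u : wind speed at plume height (m/s)
    H : effective centerline height, H = Hs + dH (m)
    sigma_y, sigma_z : dispersion coefficients at this downwind distance (m)
    Returns concentration in g/m^3."""
    crosswind = np.exp(-y**2 / (2 * sigma_y**2))
    vertical = (np.exp(-(z - H)**2 / (2 * sigma_z**2))
                + np.exp(-(z + H)**2 / (2 * sigma_z**2)))  # reflection off the ground
    return Q / (2 * np.pi * u * sigma_y * sigma_z) * crosswind * vertical

# Hypothetical stack: 50 m physical height plus 20 m of buoyant plume rise
H = 50.0 + 20.0
print(gaussian_plume(Q=100.0, u=5.0, y=0.0, z=0.0, H=H, sigma_y=30.0, sigma_z=15.0))
```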

was used for research purposes between 1973 and the mid-1980s. Once it was determined that it could show skill in hurricane prediction, a multi-year transition transformed the research model into an operational model which could be used by the National Weather Service in 1995. The Hurricane Weather Research and Forecasting (HWRF) model is a specialized version of the Weather Research and Forecasting (WRF) model and

was used to create the first weather forecasts via computer in 1950, based on a highly simplified approximation to the atmospheric governing equations. In 1954, Carl-Gustav Rossby's group at the Swedish Meteorological and Hydrological Institute used the same model to produce the first operational forecast (i.e., a routine prediction for practical use). Operational numerical weather prediction in
