Data assimilation in a modified shallow water model
A modified shallow water model is used as a test model for convective-scale data assimilation. The model description is published here; a short introduction is also given below. The impact of different observation strategies (spatial and temporal resolution) on analysis and forecast quality has been investigated. It has been found that high-resolution observations tend to introduce too much noise into the model. Although these analyses are good, the forecasts that evolve from them show very large error growth and soon become even worse than forecasts that evolve from assimilations with lower-resolution observations.
Modified shallow water model
The modified shallow water model is a simplified model for cumulus convection, with the aim of providing a computationally inexpensive, but physically plausible, environment for developing methods for convective-scale data assimilation. It is the intermediate step in a hierarchy of models: a stochastic toy model (described below) already exists and is the first in the series, and an idealised convection-resolving model (COSMO KENDA) is the last step of the hierarchy.
Key processes, including gravity waves, conditional instability and precipitation formation, are represented, and parameter values are chosen to reproduce the most important space and time scales of cumulus clouds. The model is able to reproduce the classic life cycle of an isolated convective storm. When provided with a low amplitude noise source to trigger convection, the model produces a statistically steady state with cloud size and cloud spacing distributions similar to those found in radiative-convective equilibrium simulations using a cloud resolving model.
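The equations of the published model are not reproduced here, but the flavour of such a system can be sketched with a linearized 1D shallow water step plus a crude rain proxy. All parameter values and the rain formulation below are illustrative assumptions, not those of the actual model:

```python
import numpy as np

def shallow_water_step(u, h, r, dx, dt, g=9.81, H=90.0,
                       h_rain=90.02, alpha=1e-3, beta=0.1):
    """One forward-backward step of a linearized 1D shallow water model
    on a periodic domain, with a crude rain proxy. Illustrative sketch
    only; thresholds and rates are assumed values, not the published ones."""
    # velocity update from the height gradient (centred differences)
    dhdx = (np.roll(h, -1) - np.roll(h, 1)) / (2 * dx)
    u_new = u - dt * g * dhdx
    # height update from the divergence of the new velocity
    dudx = (np.roll(u_new, -1) - np.roll(u_new, 1)) / (2 * dx)
    h_new = h - dt * H * dudx
    # rain forms where the fluid depth exceeds a threshold, then decays
    r_new = r + dt * (alpha * np.maximum(h_new - h_rain, 0.0) - beta * r)
    return u_new, h_new, r_new
```

With centred differences on a periodic domain the total fluid depth is conserved exactly, which makes a convenient sanity check when experimenting with such a toy integrator.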
The model features prognostic variables for wind and rain that can be used to compute synthetic observations for data assimilation experiments. These observations can mimic radar measurements such as reflectivity and radial wind. An LETKF (Local Ensemble Transform Kalman Filter) is used for the data assimilation experiments with this model.
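As an illustration of the analysis step, a minimal global ETKF update (the building block of the LETKF, without localization) can be sketched as follows. The observation operator, ensemble size, and error covariance here are placeholders, not the configuration used in the experiments:

```python
import numpy as np

def etkf_update(X, y, H, R):
    """One global ETKF analysis step (no localization, no inflation).
    Illustrative sketch only.
    X : (n, m) ensemble of model states (m members)
    y : (p,) observation vector
    H : (p, n) linear observation operator
    R : (p, p) observation error covariance
    """
    n, m = X.shape
    xbar = X.mean(axis=1, keepdims=True)
    Xp = X - xbar                          # state perturbations
    Y = H @ Xp                             # perturbations in obs space
    Rinv = np.linalg.inv(R)
    # analysis error covariance in ensemble space
    Pa = np.linalg.inv((m - 1) * np.eye(m) + Y.T @ Rinv @ Y)
    Pa = 0.5 * (Pa + Pa.T)                 # enforce symmetry
    wbar = Pa @ Y.T @ Rinv @ (y - (H @ xbar).ravel())
    # symmetric square root gives the analysis perturbation weights
    evals, evecs = np.linalg.eigh((m - 1) * Pa)
    W = evecs @ np.diag(np.sqrt(np.maximum(evals, 0.0))) @ evecs.T
    return xbar + Xp @ (wbar[:, None] + W)
```

The symmetric square root keeps the analysis perturbations centred on the analysis mean; the LETKF applies this same update independently in local regions around each grid point.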
The impact of localization and observation averaging for convective-scale data assimilation in a simple stochastic model
A simple stochastic model is used as a test-bed for convective-scale data assimilation methods. The simple model mimics the extreme nonlinearity and non-Gaussianity associated with rapidly developing and intermittent convective storms. In this framework, I evaluate the ETKF (Ensemble Transform Kalman Filter) and SIR (Sequential Importance Resampling) filters, and assess the impact of two strategies to improve their performance and efficiency: localization and observation averaging.
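A minimal sketch of one SIR step with systematic resampling, together with a simple spatial observation-averaging ("superobbing") helper, might look as follows. The Gaussian likelihood, noise level, and averaging window are assumptions for illustration, not the settings of the published experiments:

```python
import numpy as np

def sir_step(particles, y, obs_operator, obs_var, rng):
    """One SIR (sequential importance resampling) step with a Gaussian
    likelihood and systematic resampling. Illustrative sketch only.
    particles : (N, n) array of particle states
    y         : (p,) observation vector
    """
    N = particles.shape[0]
    Hx = np.array([np.atleast_1d(obs_operator(x)) for x in particles])
    logw = -0.5 * np.sum((y - Hx) ** 2, axis=1) / obs_var
    w = np.exp(logw - logw.max())          # stabilised weights
    w /= w.sum()
    # systematic resampling: one random offset, N evenly spaced positions
    positions = (rng.random() + np.arange(N)) / N
    cs = np.cumsum(w)
    cs[-1] = 1.0                           # guard against round-off
    return particles[np.searchsorted(cs, positions)]

def average_observations(obs, window):
    """Spatial observation averaging ('superobbing'): replace blocks of
    raw observations by their block mean, reducing noise and dimension."""
    p = (len(obs) // window) * window
    return obs[:p].reshape(-1, window).mean(axis=1)
```

Averaging reduces both the observation noise and the effective number of observations, which is one way to mitigate weight collapse in particle filters; localization restricts each update to nearby observations instead.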
Publication of this work in the QJRMS: http://onlinelibrary.wiley.com/doi/10.1002/qj.1980/abstract
Presentation at the EMS Annual Meeting, 12-16 September 2011, in Berlin.
Michael Würsch 15.4.2014