

Table of contents

Master thesis topics

External projects

Form

Topics for master theses in the theory group

Sudden stratospheric warmings and stratosphere-troposphere coupling

The stratospheric circulation in winter is dominated by a strong cyclonic vortex over the pole. This polar vortex forms through radiative cooling during the polar night. However, upward-propagating planetary waves frequently perturb the vortex in both position and strength. For sufficiently strong planetary wave forcing, these disturbances can grow enough to destroy the polar vortex as a well-organized entity. Such abrupt transitions of the hemispheric-scale stratospheric circulation are usually accompanied by a strong warming of the polar stratosphere and are therefore called sudden stratospheric warmings (SSWs). Furthermore, SSWs tend to produce circulation anomalies at the surface.

In this Master project we will explore the dynamical evolution leading up to and following SSW events based on meteorological reanalysis data and climate model output.
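As a first orientation, SSW events are commonly identified from a reversal of the zonal-mean zonal wind at 10 hPa and 60°N from westerly to easterly during winter. The following Python sketch applies this simplified detection criterion to a synthetic wind time series; the function name, the minimum event separation and the toy data are illustrative assumptions, and a real analysis would apply the criterion (together with additional conditions, e.g. to exclude final warmings) to reanalysis winds.

```python
# Minimal sketch: flag sudden stratospheric warming (SSW) central dates as the days
# on which the zonal-mean zonal wind at 10 hPa and 60N reverses from westerly to
# easterly during winter (a common, simplified criterion). The input series below
# is synthetic; in the thesis it would come from reanalysis data.
import numpy as np

def ssw_central_dates(u_10hpa_60n, winter_mask, min_separation=20):
    """Return indices of days where u switches from westerly (>0) to easterly (<0).

    u_10hpa_60n    : 1-D array of daily zonal-mean zonal wind [m/s]
    winter_mask    : boolean array, True for days in the winter season
    min_separation : skip reversals closer than this many days to the previous event
    """
    central_dates = []
    last_event = -min_separation
    for day in range(1, len(u_10hpa_60n)):
        if not winter_mask[day]:
            continue
        reversal = u_10hpa_60n[day - 1] > 0.0 and u_10hpa_60n[day] < 0.0
        if reversal and day - last_event >= min_separation:
            central_dates.append(day)
            last_event = day
    return central_dates

# Toy usage with a synthetic, slowly weakening and oscillating vortex wind
days = 150
u = 30.0 + 10.0 * np.sin(np.linspace(0, 6 * np.pi, days)) - 0.3 * np.arange(days)
winter = np.ones(days, dtype=bool)
print(ssw_central_dates(u, winter))
```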

For more detailed information please contact Thomas Birner.

Assimilation of cloud information

The assimilation of cloud-related observations is challenging because traditional error metrics (e.g. RMS error) are not well suited to the evaluation of intermittent fields such as clouds or precipitation (the so-called double penalty problem). To overcome this deficiency, various feature-based scores have been developed for the verification of forecast precipitation fields, but these scores have not yet been applied in the context of data assimilation.
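The double penalty problem can be illustrated with a small toy example: a forecast that places a rain object with the correct shape and intensity but displaced by a few grid points receives a worse pointwise score than a forecast of no rain at all. The sketch below uses synthetic fields and plain RMSE; it only illustrates the effect and does not represent any particular verification setup.

```python
# Toy illustration of the double penalty problem: a rain object forecast with the
# right shape and intensity but displaced by a few grid points is penalised twice
# by a pointwise score (once as a miss, once as a false alarm) and ends up with a
# worse RMSE than a forecast of no rain at all. The fields are synthetic.
import numpy as np

obs = np.zeros((100, 100))
obs[40:50, 40:50] = 5.0                       # observed rain object [mm/h]

fcst_shifted = np.roll(obs, 12, axis=1)       # same object, displaced by 12 points
fcst_norain  = np.zeros_like(obs)             # forecast of no rain anywhere

def rmse(a, b):
    return np.sqrt(np.mean((a - b) ** 2))

print("RMSE displaced object:", rmse(fcst_shifted, obs))   # ~0.71, penalised twice
print("RMSE no rain at all  :", rmse(fcst_norain, obs))    # ~0.50, "better" but useless
```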

The goal of this thesis is to test the use of feature-based metrics for the assimilation of cloud-affected satellite observations in the ensemble data assimilation system KENDA for the regional weather forecast model COSMO-DE. Different approaches shall be tested in an idealized setup of KENDA that is used by several people in the HErZ data assimilation group.
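For readers unfamiliar with ensemble data assimilation, the sketch below shows a single, generic stochastic ensemble Kalman filter analysis step with toy dimensions. It is a textbook-style illustration of how an ensemble is pulled towards observations, not the LETKF scheme used in KENDA; all dimensions and values are placeholders.

```python
# Generic stochastic ensemble Kalman filter analysis step (toy dimensions), only to
# illustrate the idea of an ensemble update; this is NOT the KENDA/LETKF algorithm.
import numpy as np

rng = np.random.default_rng(0)
n_state, n_obs, n_ens = 50, 5, 20

X = rng.normal(0.0, 1.0, size=(n_state, n_ens))   # forecast ensemble (columns = members)
obs_idx = np.arange(0, n_state, 10)               # observe every 10th state variable
H = np.zeros((n_obs, n_state))
H[np.arange(n_obs), obs_idx] = 1.0                # linear observation operator
R = 0.5 * np.eye(n_obs)                           # observation error covariance
y = rng.normal(0.0, 1.0, size=n_obs)              # synthetic observations

# Sample forecast error covariance from the ensemble
x_mean = X.mean(axis=1, keepdims=True)
Xp = X - x_mean
Pf = Xp @ Xp.T / (n_ens - 1)

# Kalman gain and perturbed-observation update of each member
K = Pf @ H.T @ np.linalg.inv(H @ Pf @ H.T + R)
for m in range(n_ens):
    y_pert = y + rng.multivariate_normal(np.zeros(n_obs), R)
    X[:, m] = X[:, m] + K @ (y_pert - H @ X[:, m])

print("analysis ensemble mean at observed points:", H @ X.mean(axis=1))
```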

Contact: Martin Weissmann, Leonhard Scheck

Tropospheric moisture variability and the development of tropical convection

Observations and high-resolution numerical simulations both show that tropospheric humidity affects the development of deep moist convection, with a drier troposphere typically limiting or delaying the appearance of the deepest clouds. This phenomenon is generally explained by the fact that a dry environment can quickly erode the cloud core through lateral mixing (a process called entrainment). Although convective cloud parameterizations usually account for entrainment, the influence of environmental humidity is generally not well captured, in part because the host model (typically a weather prediction or climate model) cannot resolve all the small spatial humidity fluctuations.
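The dilution of a rising cloudy parcel by entrainment can be caricatured with a simple bulk-plume model, dq_p/dz = -eps (q_p - q_env): the drier the environment, the faster the in-plume humidity decreases with height. The sketch below uses made-up humidity profiles and an assumed constant entrainment rate purely for illustration; none of the numbers come from the CRM.

```python
# Toy bulk-plume illustration of entrainment dilution: d(q_p)/dz = -eps * (q_p - q_env).
# With a drier environment the in-plume humidity drops faster with height, which is
# the basic mechanism by which a dry free troposphere limits deep convection.
import numpy as np

dz = 100.0                               # vertical grid spacing [m]
z = np.arange(1000.0, 10000.0, dz)       # heights above cloud base [m]
eps = 1.0e-3                             # assumed bulk entrainment rate [1/m]

def plume_humidity(q_env, q0=16.0):
    """Integrate dq_p/dz = -eps (q_p - q_env) upward with simple Euler steps."""
    q_p = np.empty_like(q_env)
    q_p[0] = q0                          # in-plume humidity at cloud base [g/kg]
    for k in range(1, len(q_env)):
        q_p[k] = q_p[k - 1] - eps * (q_p[k - 1] - q_env[k - 1]) * dz
    return q_p

q_env_moist = 14.0 * np.exp(-z / 3000.0)   # moister environment [g/kg]
q_env_dry   = 0.5 * q_env_moist            # much drier environment

print(f"plume humidity at {z[-1]:.0f} m, moist env:", plume_humidity(q_env_moist)[-1])
print(f"plume humidity at {z[-1]:.0f} m, dry env  :", plume_humidity(q_env_dry)[-1])
```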

In this project, a Cloud Resolving Model (CRM) operating at about 100 m resolution will be used to examine the connections between the development of deep clouds and small-scale moisture fluctuations in the free troposphere. The analysis of high-resolution model data should constitute the first step towards a parameterization of convection that explicitly accounts for unresolved moisture variability.
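One conceivable first analysis step, sketched below under simple assumptions, is to coarse-grain a high-resolution humidity field onto a hypothetical host-model grid and quantify the sub-grid moisture variance that a conventional convection parameterization would not see. The field here is synthetic; in the project it would be read from the CRM output.

```python
# Sketch: coarse-grain a high-resolution humidity field onto a coarse "host model"
# grid and compute the sub-grid moisture variance unresolved by that grid.
# The humidity field below is synthetic, not actual CRM output.
import numpy as np

n_fine, block = 512, 64                  # fine grid points and coarse-graining factor
rng = np.random.default_rng(1)
q = 10.0 + rng.normal(0.0, 1.5, size=(n_fine, n_fine))   # toy humidity field [g/kg]

# Reshape into (coarse_y, block, coarse_x, block) so each coarse cell is one block
blocks = q.reshape(n_fine // block, block, n_fine // block, block)
q_coarse_mean = blocks.mean(axis=(1, 3))          # what the coarse model resolves
q_subgrid_var = blocks.var(axis=(1, 3))           # unresolved moisture variability

print("coarse grid shape       :", q_coarse_mean.shape)
print("mean sub-grid std [g/kg]:", np.sqrt(q_subgrid_var).mean())
```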

Contact: Julien Savre

Predictability of convection in a very big ensemble

Convection-permitting ensemble prediction systems (EPS) are nowadays run operationally at various meteorological services to represent the forecast uncertainty of local weather. However, the ensemble size of a typical EPS ranges from 12 to 20 members due to computational constraints. In cooperation with the RIKEN Center for Computational Science in Japan, 1000-member ensemble forecasts at kilometre scale have been performed for several high-impact weather events across Germany.

The aims of this Master project are to investigate the ensemble size necessary to capture forecast uncertainty realistically and to examine saturation limits of predictability using traditional and spatial methods. Possible evaluation metrics include the gridpoint-based RMSE and its upscaled variant, as well as the believable scale of the Fraction Skill Score (FSS). The Gaussianity of variables will be assessed using histograms.
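As an illustration of the spatial methods mentioned above, the sketch below computes the FSS of a displaced synthetic rain object for increasing neighbourhood sizes; the believable scale can then be read off as the smallest neighbourhood at which the FSS exceeds a useful-skill threshold (one common choice is 0.5 + f/2, with f the observed rain fraction). The fields, thresholds and window sizes are toy values, not the actual evaluation setup of the 1000-member ensemble.

```python
# Fraction Skill Score (FSS) as a function of neighbourhood size for a displaced
# synthetic rain object; the smallest window at which the FSS exceeds the
# useful-skill threshold serves as a simple estimate of the believable scale.
import numpy as np
from scipy.ndimage import uniform_filter

def fss(fcst, obs, threshold, window):
    pf = uniform_filter((fcst > threshold).astype(float), size=window)
    po = uniform_filter((obs  > threshold).astype(float), size=window)
    mse = np.mean((pf - po) ** 2)
    ref = np.mean(pf ** 2) + np.mean(po ** 2)
    return 1.0 - mse / ref if ref > 0 else 1.0

obs = np.zeros((200, 200)); obs[80:120, 80:120] = 4.0   # observed rain object
fcst = np.roll(obs, 50, axis=0)                         # forecast object, displaced

f_obs = np.mean(obs > 1.0)
target = 0.5 + f_obs / 2.0                              # useful-skill threshold

for window in (1, 11, 21, 41, 81, 161):
    score = fss(fcst, obs, threshold=1.0, window=window)
    marker = "<- believable" if score >= target else ""
    print(f"window {window:3d}: FSS = {score:.2f} {marker}")
```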

A 1000-member ensemble forecast tailored to a weather event in Germany has not been performed so far and offers great new research opportunities. A short-term research visit to Kobe (Japan) may be possible during the course of this Master project.

Contact: George Craig, Christian Keil

Applicability of lossy compression methods to meteorological applications

The storage space requirements for the output of numerical weather and climate prediction are growing faster than the cost of storage is decreasing. Model resolution is continuously increased to overcome issues with parameterized physical processes such as convection. At the same time, the ensemble size (i.e., the number of model runs performed to create one forecast) is also increased to improve the assessment of uncertainty within the forecast. Both factors in combination result in very large data sets that are not only difficult to process but also very expensive to store.
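A back-of-envelope calculation makes the scale of the problem concrete. All numbers below are invented but of a plausible order of magnitude; they do not describe any specific model configuration.

```python
# Rough estimate of the raw output volume of one ensemble forecast.
# All numbers are illustrative assumptions, not an actual model configuration.
nx, ny, nz    = 1200, 1100, 65     # horizontal and vertical grid points
n_vars        = 10                 # 3-D output variables per time step
n_steps       = 49                 # hourly output of a 48 h forecast
n_members     = 40                 # ensemble size
bytes_per_val = 4                  # 32-bit floats, uncompressed

total_bytes = nx * ny * nz * n_vars * n_steps * n_members * bytes_per_val
print(f"uncompressed output: {total_bytes / 1e12:.1f} TB per forecast")
```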

One way to reduce the amount of output is to rely more on online diagnostics and not save large fractions of the output. While this approach appears promising in the operational setups of weather services, it is only a partial solution for research: visualization of arbitrary aspects of a model run would no longer be possible, and experiments would have to be repeated whenever the online diagnostics are changed.

Another way is to store model output with reduced precision. File formats currently in use support only lossless compression, or, where lossy compression is available, exploit only the spatial correlation between neighboring data points (e.g., JPEG compression); the temporal correlation is ignored. In contrast, video compression algorithms are essentially based on the temporal correlation between successive time steps. Without reducing quality, this yields compression ratios about one order of magnitude higher than those achieved for individual images. Central ideas of video compression should be directly transferable to the compression of model output, for example differential coding (only the differences between time steps are stored) and motion compensation (for unchanged but moved parts of an image, only a displacement vector is stored). On the other hand, assumptions about perception by the human eye are not applicable (e.g., that changes in brightness and color are not equally important).
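The differential-coding idea can be tried out in a few lines: quantize the field (the lossy step) and then compress the first time step plus the increments between consecutive steps, which consist mostly of small numbers and therefore tend to compress better. The sketch below uses zlib as a stand-in for a real codec and a smoothly evolving synthetic field, so the resulting sizes are only indicative.

```python
# Differential coding applied to a synthetic, smoothly evolving field:
# compare compressing every quantized time step independently with compressing
# the first step plus the increments between consecutive steps.
import zlib
import numpy as np

nt, n = 24, 256
x = np.linspace(0, 4 * np.pi, n)
field = np.stack([np.sin(x[None, :] + 0.05 * t) * np.cos(x[:, None]) for t in range(nt)])

quantum = 1e-3                                    # lossy step: ~0.001 absolute precision
q = np.round(field / quantum).astype(np.int16)    # quantized time steps

def csize(arr):
    return len(zlib.compress(arr.tobytes(), level=9))

size_independent  = sum(csize(q[t]) for t in range(nt))          # each step on its own
size_differential = csize(q[0]) + csize(np.diff(q, axis=0))      # first step + increments

print("independent  :", size_independent, "bytes")
print("differential :", size_differential, "bytes")

# Reconstruction check: the quantized fields are recovered exactly from the increments
reconstructed = np.cumsum(np.concatenate([q[:1], np.diff(q, axis=0)]), axis=0)
assert np.array_equal(reconstructed, q)
```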

The following questions should be addressed in this thesis:
Which compression algorithms are best suited for meteorological model output? Candidates are video compression and general-purpose algorithms. The plan is not to develop new algorithms, but to assess existing ones.
What are the characteristics of errors produced by the analyzed algorithms?
Which degree of compression is acceptable for a set of different meteorological applications?

Contact: Robert Redl

Further topics are possible; please talk to Prof. G. Craig, Dr. C. Keil or Dr. M. Weissmann.