Datasets for inverse modelling and ESM evaluation at high latitudes. Observation and model data from different sources, including the three existing NCoEs (CRAICC, SVALI and DEFROST), are being integrated into databases at NILU and MetNo.
Two complementary inversion frameworks, FLEXINVERT and CarbonTracker Europe (CTE), are being developed in eSTICC for the estimation of surface fluxes of climate-relevant species, in particular methane.
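At its core, an inversion of this kind updates a prior flux estimate with atmospheric observations through a transport operator. The following is a minimal sketch of the underlying Bayesian synthesis step, with toy matrices standing in for the real transport and covariance inputs; it illustrates the principle only and is not FLEXINVERT or CTE code.

```python
import numpy as np

def invert_fluxes(H, y, x_prior, B, R):
    """Posterior flux estimate x_post = x_prior + K (y - H x_prior),
    with Kalman gain K = B H^T (H B H^T + R)^{-1}.

    H: transport operator mapping fluxes to observations (toy stand-in)
    y: observation vector; x_prior: prior flux vector
    B, R: prior and observation error covariance matrices
    """
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
    return x_prior + K @ (y - H @ x_prior)
```

With a very loose prior (large B relative to R) the posterior follows the observations; with a tight prior it stays close to x_prior.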
eSTICC is developing state-of-the-art process parameterisations for ESMs. These are based on the understanding of greenhouse gas fluxes, aerosol dynamics, cryosphere-atmosphere interactions and boundary layer dynamics gained in the other NCoEs (CRAICC, DEFROST and SVALI). Descriptions of and links to the new parameterisation routines are coming soon.
Integration of Nordic Earth System Models
eSTICC is working towards an integration of the Nordic ESMs and related research activities. Currently, the following ESMs are used in the Nordic countries: EC-Earth, NorESM, and MPI-ESM. eSTICC will help position the Nordic countries better for the next IPCC assessment by:
- implementing and testing process parameterisations in the eSTICC ESMs.
- designing model experiments relevant for the Arctic that can be run by more than one “Nordic” ESM.
- explicitly qualifying at least one model for access to European elite HPC resources.
Using EC-Earth, NorESM and MPI-ESM, all the ESM teams will pursue experiments to find a quasi-optimal balance between process sophistication on the one hand and grid resolution on the other. To diagnose the transport processes specifically in the different model versions, the tracing of oxygen isotopes in some or all of the models will be investigated.
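For such isotope diagnostics, modelled isotope ratios are conventionally reported in delta notation relative to a reference standard. A small illustrative helper, assuming the commonly cited VSMOW 18O/16O ratio; the function name and structure are this sketch's own, not an eSTICC routine:

```python
# Approximate 18O/16O ratio of the VSMOW reference standard
VSMOW_R18 = 2005.2e-6

def delta18o(ratio_sample, ratio_standard=VSMOW_R18):
    """Return delta-18O in per mil relative to the standard:
    delta = (R_sample / R_standard - 1) * 1000."""
    return (ratio_sample / ratio_standard - 1.0) * 1000.0
```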
UHEL, FMI and CSC are working towards approval of MPI-ESM for PRACE (Partnership for Advanced Computing in Europe) Preparatory Access to Tier-0 computers. Preparatory Access to Tier-0 computers means that the ESM must have passed the most stringent PRACE evaluation criteria in code scalability testing and porting. This activity could open up European elite HPC resources for the Nordic groups using MPI-ESM. The experience gained from this work will also be of interest to the groups using EC-Earth and NorESM, since they may wish to follow the same process at a later stage.
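Scalability evaluation of this kind typically rests on strong-scaling measurements: how speedup and parallel efficiency develop as the core count grows for a fixed problem size. A hedged sketch of that bookkeeping; the timing figures in the test are invented, and this is not the PRACE evaluation procedure itself:

```python
def strong_scaling(times):
    """Compute strong-scaling metrics from wall-clock timings.

    times: dict mapping core count -> runtime in seconds.
    Returns dict mapping core count -> (speedup, parallel efficiency),
    both relative to the smallest core count measured.
    """
    base_cores = min(times)
    base_time = times[base_cores]
    metrics = {}
    for cores, t in sorted(times.items()):
        speedup = base_time / t
        # Ideal speedup equals the relative increase in cores
        metrics[cores] = (speedup, speedup / (cores / base_cores))
    return metrics
```

Efficiency close to 1.0 indicates near-ideal scaling; the core count at which it drops off bounds the useful partition size.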
High Performance Computing (HPC) covers whole workflows, including data transfer and input/output, as well as algorithms suited for accelerated systems (GPGPU, MIC). In eSTICC, HPC mainly focuses on:
- Porting, performance evaluation and analysis: Starting from existing scientific problems (often manifested in existing non-optimised code) and the available computing platforms, this topic gives scientists the opportunity to evaluate the portability of their code and provides information and support on how to port it. The rise of new energy-efficient technologies deployed in state-of-the-art HPC platforms, such as General Purpose Graphics Processing Units (GPGPU) and Many Integrated Cores (MIC), entails new programming paradigms.
- HPC workflows: Earth System Models (ESMs) both consume and produce vast amounts of data. Besides bottlenecks within the codes themselves (see the previous point), limitations imposed by model input and output (I/O), as well as by couplers, increasingly determine the performance of computational studies. This topic therefore applies existing workflow solutions (e.g., those known from Grid computing) and investigates new concepts to boost overall performance, from pre-processing, through I/O during computations, to post-processing (visualisation, data filtering).
- Porting Elmer/Ice to the Nordic High Performance Computing (NHPC) facility at the University of Iceland: The installation will give Icelandic glaciologists from the existing NCoE SVALI, at the Icelandic Meteorological Office (IMO) and the University of Iceland, easy access to HPC simulations.
- Porting the GPU version of NEMO to CSC's energy-efficient Bull B715 platform.
- Visualising results obtained with NEMO using the HPC visualisation tool ParaView.
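The porting and performance-evaluation topic above ultimately rests on before/after timing comparisons of the same computation. A minimal, illustrative harness; the two kernels are stand-ins chosen only so the comparison is self-contained, not eSTICC code:

```python
import timeit

def baseline(n):
    """Naive loop: sum of i * 0.5 for i in 0..n-1 (the 'unported' kernel)."""
    total = 0.0
    for i in range(n):
        total += i * 0.5
    return total

def optimized(n):
    """Restructured kernel: closed form of the same sum."""
    return 0.5 * (n - 1) * n / 2

def speedup(n=100000, repeats=5):
    """Best-of-repeats timing ratio baseline/optimized (> 1 means faster)."""
    t_base = min(timeit.repeat(lambda: baseline(n), number=1, repeat=repeats))
    t_opt = min(timeit.repeat(lambda: optimized(n), number=1, repeat=repeats))
    return t_base / t_opt
```

Taking the minimum over repeats reduces the influence of system noise; correctness of the port is checked separately by comparing the two kernels' results.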
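The workflow topic above treats pre-processing, computation with I/O, and post-processing as separate stages, so each can be measured or replaced independently. A toy sketch of such a staged pipeline; the stage contents are placeholders, not an actual ESM workflow:

```python
def preprocess(raw):
    """Pre-processing stage: parse raw input records into numbers."""
    return [float(v) for v in raw]

def compute(fields):
    """Computation stage: stand-in for the model step."""
    return [v * 2.0 for v in fields]

def postprocess(results):
    """Post-processing stage: reduce results to summary diagnostics."""
    return {"mean": sum(results) / len(results), "n": len(results)}

def run_workflow(raw, stages=(preprocess, compute, postprocess)):
    """Chain the stages; any stage can be swapped or instrumented."""
    data = raw
    for stage in stages:
        data = stage(data)
    return data
```

Because the stages share only their data interface, an I/O-heavy stage can be profiled or moved to another system without touching the others, which is the point of workflow-level optimisation.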
Contact person for 2015: Helmut Neukirchen (UICE)