Work Package 2: Implementation

  • Deliverable 2.1 - Initial Report on Exascale Technology State-of-the-Art

  • Deliverable 2.2 - Initial Report on the ExaFLOW Algorithms, Energy Efficiency and IO Strategies


Work Package 3: Validation and Use Cases

  • Deliverable 3.1 - Detailed Description of Use Cases and their Requirements


Work Package 4: Dissemination and Exploitation


Work Package 5: Management

Below are listed the publications that members of the ExaFLOW project have authored during the first year of the project:

  • Jacobs, C. T., Jammy, S. P., Sandham, N. D. (2017). OpenSBLI: A framework for the automated derivation and parallel execution of finite difference solvers on a range of computer architectures. Journal of Computational Science, 18:12-23, DOI: http://doi.org/10.1016/j.jocs.2016.11.001

  • Jammy, S. P., Jacobs, C. T., Sandham, N. D. (In Press). Performance evaluation of explicit finite difference algorithms with varying amounts of computational and memory intensity. Journal of Computational Science. DOI: http://doi.org/10.1016/j.jocs.2016.10.015

  • Michael Bareford, Nick Johnson, and Michèle Weiland. 2016. On the trade-offs between energy to solution and runtime for real-world CFD test-cases. In Proceedings of the Exascale Applications and Software Conference 2016 (EASC '16). ACM, New York, NY, USA, Article 6, 8 pages. DOI: http://dx.doi.org/10.1145/2938615.2938619

  • D. Moxey, C. D. Cantwell, G. Mengaldo, D. Serson, D. Ekelschot, J. Peiró, S. J. Sherwin and R. M. Kirby, Towards p-adaptive spectral/hp element methods for modelling industrial flows, International Conference on Spectral and High-Order Methods, 2016.

  • Sandham, N. D., Johnstone, R., Jacobs, C. T. (Accepted). Surface-sampled simulations of turbulent flow at high Reynolds number. International Journal for Numerical Methods in Fluids. DOI: http://dx.doi.org/10.1002/fld.4395

  • A. S. Nielsen and J. S. Hesthaven, 2016. Fault Tolerance in the Parareal Method. Proceedings of the Fault Tolerance for HPC at eXtreme Scale Workshop at the 25th ACM Symposium on High-Performance Parallel and Distributed Computing. DOI: 10.1145/2909428.2909431

  • Offermans, N., Marin, O., Schanen, M., Gong, J., Fischer, P., Schlatter, P. On the Strong Scaling of the Spectral Element Solver Nek5000 on Petascale Systems. In Proceedings of the Exascale Applications and Software Conference 2016. DOI: https://doi.org/10.1145/2938615.2938617


Nek5000: The code is available for download and installation from either a Subversion (SVN) repository or a Git repository. Links to both repositories are given at: http://nek5000.mcs.anl.gov/install/

The Git repository mirrors the SVN repository. There are no official releases, since the Nek community of users and developers prefers immediate access to contributions. However, because the software is updated on a constant basis, tags for stable releases as well as for the latest releases are provided, so far only in the Git mirror of the code.

The reason for this is that SVN is maintained mainly for senior users who already have their own coding practices; it will continue to be maintained at Argonne National Laboratory (using a corresponding account at ANL), while the Git repository is maintained on GitHub. A similar procedure is followed for the documentation, to which developers and users are free to contribute by editing and adding descriptions of features; these changes are pushed back to the repository by issuing pull requests, which allow the Nek team to assess whether publication is in order. All information about these procedures is documented on the homepage http://nek5000.mcs.anl.gov/. KTH maintains a close collaboration with the Nek team at ANL.

The code is run daily through a series of regression tests via Buildbot (to be transferred to Jenkins). The checks range from functional testing and compiler-suite testing to unit testing. Not all solvers benefit from unit testing yet, but work is ongoing in this direction. Successful Buildbot runs determine whether a version of the code is deemed stable.
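To make the flavour of these checks concrete, below is a minimal sketch of a functional regression test in Python. Everything in it (the run_case stand-in, the reference value, the tolerance) is hypothetical and is not part of the actual Nek5000 test harness.

```python
# Minimal sketch of a solver regression test, of the kind a Buildbot/Jenkins
# job might run. All names and values here are hypothetical, not taken from
# the actual Nek5000 test suite.
import unittest

def run_case():
    # Stand-in for launching a small solver case and extracting a monitored
    # quantity from its output (e.g. a final residual or an integral value).
    return 0.4999998

class SolverRegressionTest(unittest.TestCase):
    def test_against_reference(self):
        # Compare to a stored reference within a tolerance, so that harmless
        # round-off differences between compilers still pass the check.
        self.assertAlmostEqual(run_case(), 0.5, places=5)

if __name__ == "__main__":
    unittest.main()
```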

A suite of examples is distributed with the source code, illustrating modifications of geometry and solvers as well as implementations of various routines. Users are encouraged to submit their own example cases for inclusion in the distribution.

The use cases within ExaFLOW which involve Nek5000 will be packaged as examples and included in the repository for future reference.

 

Nektar++: The code is a tensor-product-based finite element package designed to allow one to construct efficient classical low-polynomial-order h-type solvers (where h is the size of the finite element) as well as higher-order piecewise-polynomial p-type solvers. The framework currently has the following capabilities:

  • Representation of one, two and three-dimensional fields as a collection of piecewise continuous or discontinuous polynomial domains.
  • Segment, plane and volume domains are permissible, as well as domains representing curves and surfaces (dimensionally-embedded domains).
  • Hybrid-shaped elements, i.e. triangles and quadrilaterals (2D) or tetrahedra, prisms and hexahedra (3D).
  • Both hierarchical and nodal expansion bases.
  • Continuous or discontinuous Galerkin operators.
  • Cross platform support for Linux, Mac OS X and Windows.

Nektar++ comes with a number of solvers and also allows one to construct a variety of new solvers. In this project we will primarily be using the incompressible Navier-Stokes solver.
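To make the notion of a nodal expansion basis concrete, here is a minimal Python sketch that computes the Gauss-Lobatto-Legendre (GLL) points and quadrature weights underlying such bases. It illustrates the mathematics only and does not use the Nektar++ API.

```python
import numpy as np
from numpy.polynomial import legendre as leg

def gll_points_weights(p):
    """GLL points and weights for polynomial order p (p + 1 nodes).

    Interior nodes are the roots of P'_p; the endpoints are -1 and +1.
    Weights: w_i = 2 / (p (p + 1) [P_p(x_i)]^2).
    """
    c = np.zeros(p + 1)
    c[p] = 1.0                      # coefficient vector of P_p
    x = np.concatenate(([-1.0], leg.legroots(leg.legder(c)), [1.0]))
    w = 2.0 / (p * (p + 1) * leg.legval(x, c) ** 2)
    return x, w

# With a Lagrange basis through the GLL nodes, GLL quadrature makes the
# elemental mass matrix diagonal: M_ij ≈ w_i δ_ij on the reference element.
x, w = gll_points_weights(7)
print(x)          # 8 nodes, clustered toward the element boundaries
print(w.sum())    # ≈ 2.0, the length of the reference element [-1, 1]
```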

 

SBLI: The SBLI code solves the governing equations of motion for a compressible Newtonian fluid using a high-order discretisation with shock capturing. An entropy-splitting approach is used for the Euler terms, and all spatial discretisations are carried out using a fourth-order central-difference scheme. Time integration is performed using compact-storage Runge-Kutta methods with third- and fourth-order options. Stable high-order boundary schemes are used, along with a Laplacian formulation of the viscous and heat-conduction terms to prevent any odd-even decoupling associated with central schemes.
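For illustration, the sketch below combines two of the named building blocks, a fourth-order central difference and a compact-storage (low-storage) Runge-Kutta scheme, to advect a pulse on a periodic grid in Python. It is a toy model, not the SBLI implementation; in particular it omits entropy splitting, shock capturing and the boundary schemes.

```python
import numpy as np

def ddx4(f, h):
    # Fourth-order central difference on a periodic grid:
    # f'_i ≈ (f_{i-2} - 8 f_{i-1} + 8 f_{i+1} - f_{i+2}) / (12 h)
    return (np.roll(f, 2) - 8*np.roll(f, 1)
            + 8*np.roll(f, -1) - np.roll(f, -2)) / (12.0 * h)

def rk3_step(u, rhs, dt):
    # Williamson low-storage third-order Runge-Kutta (2N storage), in the
    # same compact-storage spirit as the schemes mentioned above.
    k = np.zeros_like(u)
    for A, B in zip((0.0, -5.0/9.0, -153.0/128.0),
                    (1.0/3.0, 15.0/16.0, 8.0/15.0)):
        k = A * k + dt * rhs(u)
        u = u + B * k
    return u

# Advect a Gaussian pulse: u_t + c u_x = 0 with wave speed c = 1.
n, c = 256, 1.0
x = np.linspace(0.0, 1.0, n, endpoint=False)
h = x[1] - x[0]
u = np.exp(-200.0 * (x - 0.5)**2)
for _ in range(400):
    u = rk3_step(u, lambda v: -c * ddx4(v, h), dt=0.5 * h)
```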


 

NS3D: The DNS code ns3d is based on the complete Navier-Stokes equations for compressible fluids, with the assumptions of an ideal gas and the Sutherland law for air. The differential equations are discretized in the streamwise and wall-normal directions with 6th-order compact or 8th-order explicit finite differences. Time integration is performed with a four-step, 4th-order Runge-Kutta scheme. Implicit and explicit filtering in space and time is possible if resolution or convergence problems occur. The code has been continuously optimized for vector and massively parallel computer systems, up to the current Cray XC40 system. Boundary conditions for sub- and supersonic flows can be appropriately specified at the boundaries of the integration domain. Grid transformation is used to cluster grid points in regions of interest, e.g. near a wall or a corner. For parallelization, the domain is split into several subdomains, as illustrated in the figure.
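As an illustration of the compact-difference idea, the following Python sketch evaluates a classical sixth-order compact (Lele-type) first derivative on a periodic grid. It is a minimal demonstration only, not the optimized ns3d implementation.

```python
import numpy as np

def compact6_ddx(f, h):
    """Sixth-order compact first derivative on a periodic grid:
       (1/3) f'_{i-1} + f'_i + (1/3) f'_{i+1}
         = (14/9)(f_{i+1} - f_{i-1})/(2h) + (1/9)(f_{i+2} - f_{i-2})/(4h)
    Each derivative couples its neighbours, so a (cyclic) tridiagonal
    system must be solved; a dense solve keeps the sketch short."""
    n = f.size
    alpha = 1.0 / 3.0
    A = np.eye(n) + alpha * (np.eye(n, k=1) + np.eye(n, k=-1))
    A[0, -1] = A[-1, 0] = alpha          # periodic wrap-around entries
    rhs = ((14.0/9.0) * (np.roll(f, -1) - np.roll(f, 1)) / (2.0*h)
           + (1.0/9.0) * (np.roll(f, -2) - np.roll(f, 2)) / (4.0*h))
    return np.linalg.solve(A, rhs)

# Check against an analytic derivative: d/dx sin(2πx) = 2π cos(2πx)
x = np.linspace(0.0, 1.0, 64, endpoint=False)
err = np.max(np.abs(compact6_ddx(np.sin(2*np.pi*x), x[1] - x[0])
                    - 2*np.pi*np.cos(2*np.pi*x)))
print(err)   # very small, reflecting the sixth-order accuracy
```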


Illustration of grid lines (black) and subdomains (red). A small step is located at Re_x = 3.3×10^5.

 

 

Download material on "ExaFLOW use case for SBLI: numerical simulation of the compressible flow over a NACA-4412 airfoil at incidence"

 

Wing profile NACA4412 in incompressible flow: The flow over aircraft wings is, for obvious reasons, a very important use case for any type of scale-resolved numerical simulation. An improved understanding of the complex physical behaviour of the turbulence will pave the way to improved engineering design, including:

i) reduced air resistance and thus lower fuel consumption,

ii) increased robustness of wings with respect to external influences such as atmospheric turbulence, high angles of attack, icing,

iii) improved wing characteristics such as lower landing speeds,

iv) better flow quality along the wing and thus more effective control surfaces.

The flow along an airfoil is subject to a number of highly relevant and only partially understood physical phenomena: transition to turbulence close to the leading edge, turbulent boundary layers under adverse and favourable pressure gradients, potential flow separation close to the trailing edge and, finally, free turbulence in the wake. All of these aspects have so far been studied only in isolated test cases, and shall now be combined into one case in which their interaction is investigated. The high Reynolds numbers characteristic of airfoil flows pose another difficulty, in particular because there are clear differences between the flow at low Reynolds numbers and at higher ones; any simple upscaling method is therefore bound to fail.

For the present use case, we will consider an idealised NACA4412 wing profile, a generic geometry frequently used as an industrial and academic test case to validate simulation and experimental methods. The spanwise direction will initially be treated with periodic boundary conditions, but at a later stage will be supplemented with wing tips, which will allow us to study end effects (note that this region is crucial when it comes to drag reduction of wings). Looking toward exascale computing (and beyond), the challenge is to run this type of test case at Reynolds numbers of O(10^6-10^7), with different approaches including DNS, LES with resolved near-wall layers, and LES with wall models (including heterogeneous models as discussed in the proposal).

To study fundamental aspects of fluid turbulence at low speeds, in particular in complex geometries, the incompressible approximation has a number of both numerical and physical advantages. For the present case, we will use Nek5000 to simulate the airflow along a wing section; such a resolved DNS will require about 10-100 billion grid points. The adaptive mesh strategy will thus be of particular importance in the turbulent wake of the airfoil, where little information about the flow is known in advance.
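To give a feeling for these numbers, the following back-of-the-envelope Python sketch scales grid-point counts with Reynolds number, assuming the often-quoted estimate of Choi & Moin (2012) that wall-resolved DNS requires N ∝ Re^(37/14) points. The calibration point (10 billion points at Re = 10^6) is an illustrative assumption based on the figures quoted above, not a project result.

```python
# Hypothetical scaling sketch: grid points needed for wall-resolved DNS,
# assuming N ∝ Re^(37/14) (Choi & Moin, 2012). The reference point
# (1e10 points at Re = 1e6) is an illustrative calibration, not project data.
def dns_grid_points(re, re_ref=1.0e6, n_ref=1.0e10):
    return n_ref * (re / re_ref) ** (37.0 / 14.0)

for re in (1.0e6, 3.0e6, 1.0e7):
    print(f"Re = {re:.0e}: ~{dns_grid_points(re):.2e} grid points")
```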

The proposed simulations will be compared with planned experiments in the KTH wind tunnel at comparable Reynolds numbers.

 

 Figure 1: Turbulent structures extracted from the incompressible NACA4412 wing case

 

NACA4412 in compressible flow: In this use case we adopt the NACA4412 airfoil in compressible subsonic flow. The compressible flow case exhibits aspects of hyperbolic behaviour and sustains sound waves that are not resolved in incompressible flow. This change leads to very different numerical algorithms, since the divergence-free constraint on the velocity is removed. It is important to assess the scaling to exascale for such cases in order to compare the requirements with those of incompressible flow. The compressible flow solution includes the acoustic field, making predictions of the aeroacoustic properties of airfoils possible. The use case considers a representative airfoil exhibiting laminar separation, vortex shedding and sound generation from the trailing edge. The objective of the numerical approach is to efficiently resolve both the hydrodynamic near field and the acoustic far field, which have different grid requirements. The use case contains an example of initially two-dimensional flow with a variable spanwise extent that can be adjusted to study strong and weak scaling.
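The removal of the divergence-free constraint can be seen directly from the continuity equation in the two regimes:

```latex
% Mass conservation: compressible vs. incompressible flow
\frac{\partial \rho}{\partial t} + \nabla\cdot(\rho\,\mathbf{u}) = 0
\quad \text{(compressible: density evolves, sound waves are supported)}
\qquad\text{vs.}\qquad
\nabla\cdot\mathbf{u} = 0
\quad \text{(incompressible: algebraic constraint on the velocity)}
```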


 

Figure 2: Instantaneous contours of spanwise vorticity from a simulation of air flow past a NACA-4412 airfoil

 

 CODES USED: Nek5000, SBLI

 

 

The main goal of ExaFLOW is to address key algorithmic challenges in CFD (Computational Fluid Dynamics) to enable simulation at exascale, guided by a number of use cases of industrial relevance, and to provide open-source pilot implementations.

The project comprises the following four use cases: