
Multi-scale Modelling of Urban Air Pollution with Coupled Weather Forecast and Traffic Simulation on HPC Architecture

The International Conference on High Performance Computing in Asia-Pacific Region Companion, pages 9–10. New York, NY, USA, Association for Computing Machinery (Jan 20, 2021)
DOI: 10.1145/3440722.3440917

Abstract

Urban air pollution is one of the global challenges to which over 3 million deaths are attributable yearly. Traffic emits over 40% of several contaminants, such as NO2 [10]. Directive 2008/50/EC of the European Commission prescribes the assessment of air quality by accumulating exceedances of contaminant concentration limits over a one-year period using measurement stations, which may be supplemented by modelling techniques to provide adequate information on the spatial distribution. Computational models predict small-scale spatial fluctuations at the street level: local air flow phenomena can accumulate pollutants or carry them far away from the location of emission [2]. The spread of the SARS-CoV-2 virus also interacts with urban air quality. Regions in lockdown show strongly reduced air pollution due to the drop in traffic [4]. Moreover, a correlation between the fatality rate of a previous respiratory disease, SARS (2002), and the Air Pollution Index suggests that bad air quality may double the fatality rate [6].

Since street-level pollution dispersion depends strongly on the daily weather, a one-year simulation with a low-time-scale model is needed. Additionally, resolving street-level phenomena requires cell sizes of 1 to 4 meters in these regions, which leads to CFD simulation domains of 1 to 100 million cells. The memory and computational requirements for these tasks are enormous, so HPC architectures are needed to obtain reasonable results within a manageable time frame. To tackle this challenge, the Urban Air Pollution (UAP) workflow is developed as a pilot of the HiDALGO project [7], which is funded by the H2020 framework of the European Union. The pilot is designed in a modular way with the mindset of being developed into a digital twin model later. Its standardized interfaces enable multiple software packages to be used within a specific module. At its core, a traffic simulation implemented in SUMO is coupled with a CFD simulation.
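To illustrate why street-level resolution pushes the domain into the 1-100 million cell range, a back-of-envelope estimate for a uniform mesh can be sketched as below. The domain dimensions are hypothetical examples, not the pilot's actual setups:

```python
# Back-of-envelope cell-count estimate for a uniform Cartesian street-level
# mesh. The box dimensions used in the example are hypothetical; they only
# illustrate how 1-4 m cells lead to meshes of millions of cells.

def cell_count(length_m, width_m, height_m, cell_size_m):
    """Number of cells in a uniform Cartesian mesh over the given box."""
    return (
        (length_m // cell_size_m)
        * (width_m // cell_size_m)
        * (height_m // cell_size_m)
    )

# A hypothetical 1 km x 1 km urban area with a 100 m air layer at 2 m cells:
n = cell_count(1000, 1000, 100, 2)   # 500 * 500 * 50 = 12,500,000 cells
```

In practice graded octree meshes keep the fine cells near the streets only, so real counts are lower than this uniform estimate, but the order of magnitude is the same.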
Currently OpenFOAM (v1906, v1912 and v2006) and Ansys Fluent (v19.2) are supported. This presentation focuses on the OpenFOAM implementation, as it proved more feasible and scalable on most HPC architectures. The incompressible unsteady Reynolds-averaged Navier–Stokes equations are solved with the PIMPLE method, Courant-number-based adaptive time stepping, and transient atmospheric boundary conditions. The single-component NOx-type pollution is calculated independently as a scalar with a transport equation along the flow field. Pollution emission is treated as a per-cell volumetric source that changes in time. The initial condition is obtained from a steady-state solution at the initial time with the SIMPLE method, using identical but stationary boundary conditions and source fields. Custom modules are developed for proper boundary condition and source term handling.

The UAP workflow supports automatic generation of the 3D air flow geometry and the traffic network from OpenStreetMap data. Ground and building information are used for the geometry, the road network for traffic, and further assets for visualization. The 3D CFD mesh generation is done by either an in-house octree mesh generator or the snappyHexMesh utility from OpenFOAM. Meteorological data for the boundary conditions are acquired from ECMWF via the Polytope REST API [5] automatically for the user-specified day and location. The values at the closest grid point are selected, transformed into the Euclidean coordinate system, and converted to an OpenFOAM-readable file format. A custom OpenFOAM module ensures the proper handling of the altitude- and time-dependent boundary field. Background air quality data are acquired from the Copernicus Atmosphere Monitoring Service (CAMS) [3]. Results are validated against a local air quality sensor network, which is under expansion for more accuracy. Traffic simulation results or data can also be obtained from external sources. For the pilot, a traffic sensor network of camera and loop detectors is installed in the Hungarian city of Győr.
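The transformation of meteorological input into the Euclidean coordinate system can be sketched as follows. This is an illustrative helper, not the pilot's actual code; it assumes the wind arrives as speed and direction in the usual meteorological convention (direction the wind blows from, degrees clockwise from north), although ECMWF products may also deliver the u/v components directly:

```python
import math

def wind_to_uv(speed, direction_deg):
    """Convert meteorological wind (speed; direction the wind blows FROM,
    in degrees clockwise from north) to Cartesian components:
    u points east, v points north. Hypothetical helper for illustration."""
    theta = math.radians(direction_deg)
    u = -speed * math.sin(theta)
    v = -speed * math.cos(theta)
    return u, v

# A 10 m/s wind from the west (270 deg) blows toward the east: u ~ +10, v ~ 0.
u, v = wind_to_uv(10.0, 270.0)
```

The resulting (u, v) vectors, together with the vertical profile over altitude, are what a boundary-condition file for the CFD solver would contain.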
Sensor data is transmitted in real time and is to be coupled into the simulation directly. Random traffic generation is also supported. Emission is computed by an in-house tool from the traffic simulation results in SUMO data file format by applying the COPERT model, interpolated to the CFD mesh, and stored in an OpenFOAM-readable file format. A custom OpenFOAM module is responsible for the timely reading of this source term data and the proper adjustment of the equations.

Calculation of the wind flow and the pollution dispersion is the most computationally heavy part of the workflow: input file generation and traffic simulation top out at 60 and 20 minutes on one node, respectively, for one day, compared to a minimum of 2 hours for the CFD model with the smallest cell count. For benchmarking purposes, only the runtime of a small portion of the scalable part of the OpenFOAM simulation is measured: 15-60 minutes for the transient simulation and 600 iterations for the steady-state simulation. Primary benchmarks are run on the local cluster PLEXI (18 nodes, 2x6-core Intel Xeon X5650, 48 GB RAM, 40 Gb InfiniBand), with additional investigations on EAGLE (PSNC, Poznan; 1119 nodes, 2x14-core Intel Xeon E5-2697 v3, 64 GB RAM, 56 Gb InfiniBand) and the HAWK Test System (HLRS, Stuttgart; 5632 nodes, 2x64-core AMD EPYC 7742, 256 GB RAM, 200 Gb InfiniBand HDR200). Tuning OpenFOAM settings with optimized IO, multilevel decomposition and cell index renumbering improved the speedup on PLEXI at 216 cores from 18 to 102 for the 1M-cell model and from 49 to 77 for the 9M-cell model. On the HAWK Test System, speedups top out at 133 for 1M and 401 for 9M cells, both at 2048 cores. On EAGLE, the speedup of the 1M-cell model tops out at 104 at 448 cores. The saturating effect at single-node core counts suggests memory-bandwidth-limited calculations. Full-day simulation runs were also done for areas within 5 cities (Győr, Madrid, Stuttgart, Herrenberg and Graz) with random traffic and different mesh sizes of ca. 0.8M and ca. 3M cells.
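The speedup figures quoted above follow the usual definition relative to a reference runtime; a minimal sketch of that bookkeeping (the function names and sample numbers are illustrative, not measurements from the paper):

```python
def speedup(t_ref, t_parallel):
    """Speedup of a parallel run relative to a reference runtime
    (e.g. a serial or single-node baseline)."""
    return t_ref / t_parallel

def parallel_efficiency(t_ref, t_parallel, cores_ref, cores):
    """Parallel efficiency = speedup divided by the core-count ratio.
    Values well below 1.0 at low core counts hint at a shared-resource
    bottleneck such as memory bandwidth."""
    return speedup(t_ref, t_parallel) / (cores / cores_ref)

# Illustrative numbers: a run that takes 100 min on 1 core and 25 min on
# 8 cores has speedup 4.0 and efficiency 0.5.
s = speedup(100.0, 25.0)
e = parallel_efficiency(100.0, 25.0, 1, 8)
```

A speedup that saturates already at the core count of a single node, as observed on the AMD-based system, is the classic signature of such a memory-bandwidth limit.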
The runtime of the full CFD module on PLEXI at 48 cores is 2.7 and 20 hours on average for the smaller and larger cell counts, respectively. This puts the one-year simulation within reach for coarse meshes on PLEXI and for finer meshes on more powerful HPC architectures. Due to the high core count to memory channel ratio of AMD processors, poor single-node parallel efficiency is expected for memory-bandwidth-limited applications, which supports our present findings and is comparable with speedup results of other CFD software on the same hardware [9]. Node-based speedup of certain OpenFOAM simulations, however, may show superlinear behavior [1].

In conclusion, the UAP workflow and the OpenFOAM implementation of the CFD module are on a good track towards the goal of simulating one year within a manageable time frame. The simulation time of a few hours for one day's pollution also makes the current version feasible for forecasting. Future work includes using proper orthogonal decomposition (POD) [8], a model order reduction method, to eventually improve calculation time drastically while sacrificing limited accuracy. We also plan to test and benchmark GPGPU-based solvers, implement reactions for pollutants, and extend validation using new air quality measuring stations.
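The claim that a one-year simulation is within reach follows from simple arithmetic on the per-day runtime. A back-of-envelope estimate, assuming the 365 day-long runs are independent and can be distributed over several concurrent jobs (an assumption for illustration; the abstract does not state the scheduling strategy):

```python
import math

def year_runtime_hours(hours_per_day, concurrent_jobs=1, days=365):
    """Back-of-envelope wall-clock estimate for simulating one year as
    'days' independent day-long runs spread over concurrent_jobs jobs."""
    return math.ceil(days / concurrent_jobs) * hours_per_day

# With the quoted 2.7 h per day on the coarse mesh, one sequential pass over
# a year costs 365 * 2.7 ~ 985.5 h (~41 days); ten concurrent jobs bring the
# wall-clock time down to roughly 100 h.
sequential = year_runtime_hours(2.7, concurrent_jobs=1)
distributed = year_runtime_hours(2.7, concurrent_jobs=10)
```

This is the sense in which coarse meshes make a full year "within reach" on a small cluster, while finer meshes call for the larger HPC systems.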
