Example


A simple example demonstrates the idea of adaptive mesh refinement:
The six pictures below are taken from a 2-D compressible gas dynamics simulation in which heavier portions of a fluid (brown-red) initially rest on top of lighter portions (blue). Apart from the density perturbation, the fluid is initially in force balance between gravitation and the thermal pressure gradient. The density change across the transition layer is 20%, the perturbation another few percent.
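To make the setup concrete, here is a minimal sketch (in Python/NumPy) of such an initial state: a heavy layer resting on a lighter one with a 20% density contrast, a density perturbation of a few percent around the interface, and a pressure field in hydrostatic balance with gravity. The domain size, gravity, perturbation amplitude and layer width are assumed values for illustration only, not the parameters of the run shown below.

    import numpy as np

    # Base grid of 160 x 256 cells (the resolution quoted further below);
    # the domain extents are assumed values.
    nx, ny = 160, 256
    Lx, Ly = 1.0, 1.6
    x = (np.arange(nx) + 0.5) * Lx / nx
    y = (np.arange(ny) + 0.5) * Ly / ny
    X, Y = np.meshgrid(x, y, indexing="ij")

    g = 1.0                    # gravitational acceleration (assumed)
    rho_bot, rho_top = 1.0, 1.2   # 20% density contrast, heavy fluid on top
    y_if, width = 0.8, 0.02       # interface height and layer width (assumed)

    # Smooth transition layer between the two densities.
    rho0 = rho_bot + 0.5 * (rho_top - rho_bot) * (1.0 + np.tanh((Y - y_if) / width))

    # Hydrostatic pressure for the unperturbed stratification, dp/dy = -rho0*g,
    # integrated downwards from an assumed pressure at the top: apart from the
    # perturbation added next, gravitation and the pressure gradient balance.
    p = np.empty_like(rho0)
    p[:, -1] = 2.5
    dy = Ly / ny
    for j in range(ny - 2, -1, -1):
        p[:, j] = p[:, j + 1] + 0.5 * (rho0[:, j] + rho0[:, j + 1]) * g * dy

    # Density perturbation of a few percent, localised around the interface.
    rho = rho0 * (1.0 + 0.03 * np.cos(2.0 * np.pi * X / Lx)
                        * np.exp(-((Y - y_if) / 0.1) ** 2))

    # The fluid starts at rest.
    vx, vy = np.zeros_like(rho), np.zeros_like(rho)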

A time-dependent computation based on a Runge-Kutta integrator with a 2-dimensional version of the Kurganov-Levy CWENO scheme simulates the temporal development of this system. Boundary conditions are periodic in the horizontal direction and no-flow at the bottom and top.
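The CWENO reconstruction itself is not reproduced here, but the way the time integration and the boundary conditions fit together can be sketched roughly as follows. spatial_rhs is only a placeholder for the Kurganov-Levy flux computation, the ghost-cell width and the ordering of the state variables are assumptions, and the second-order Runge-Kutta stage is just one possible choice of integrator.

    import numpy as np

    NG = 2  # ghost-cell width on every side (assumed)

    def fill_ghost_cells(U):
        """Periodic in x, reflecting (no-flow) walls at bottom and top.

        U has shape (nvar, nx + 2*NG, ny + 2*NG); the y-momentum is assumed
        to be component 2 of the state vector."""
        # periodic left/right
        U[:, :NG, :]  = U[:, -2 * NG:-NG, :]
        U[:, -NG:, :] = U[:, NG:2 * NG, :]
        # reflecting bottom/top: mirror the interior cells ...
        U[:, :, :NG]  = U[:, :, 2 * NG - 1:NG - 1:-1]
        U[:, :, -NG:] = U[:, :, -NG - 1:-2 * NG - 1:-1]
        # ... and flip the sign of the wall-normal momentum
        U[2, :, :NG]  *= -1.0
        U[2, :, -NG:] *= -1.0
        return U

    def step(U, dt, spatial_rhs):
        """One Heun / SSP-RK2 step; spatial_rhs(U) -> dU/dt stands in for
        the CWENO reconstruction and flux computation."""
        U1 = U + dt * spatial_rhs(fill_ghost_cells(U.copy()))
        return 0.5 * (U + U1 + dt * spatial_rhs(fill_ghost_cells(U1)))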
[Six snapshots of the mass density at t = 0.0, 0.3, 0.55, 0.775, 0.925 and 1.0625]

As can be seen from the snapshots, the heavier fluid portions sink downwards and get compressed to even higher densities. Note that the colour scaling changes between the pictures: at the base of the downward stream in the very last picture, the density reaches values of 2.75, compared to peak values of 1.17 in the initial configuration.

Obviously, the downward-streaming fluid creates a turbulent wake consisting of eddy-like structures that are generated by Kelvin-Helmholtz modes in the shear flow. To resolve these features efficiently, the critical regions of the flow are covered by successively refined meshes, while the less critical regions use coarser, and thereby computationally cheaper, grids.

Following are the stages of the third, fifth and sixth picture from above, magnified to a 1:1 aspect ratio and with the actual grid coverage superimposed. Each of the subgrids consists of 10 x 16 cells, irrespective of its size. In this example, we used a refinement criterion based on the gradient of the mass density; a combination with a flow-based criterion, e.g. the vorticity, might be an interesting alternative.
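Such a gradient-based criterion might be sketched as follows; only the idea of flagging cells with a steep density gradient is taken from the run above, while the normalisation and the threshold value are assumptions.

    import numpy as np

    def flag_for_refinement(rho, dx, dy, threshold=0.05):
        """Flag cells whose relative density jump across one cell exceeds a
        threshold; the threshold value itself is an assumption."""
        drho_dx = np.gradient(rho, dx, axis=0)   # centred differences
        drho_dy = np.gradient(rho, dy, axis=1)
        rel_jump = np.hypot(drho_dx * dx, drho_dy * dy) / rho
        return rel_jump > threshold

A vorticity-based flag would work the same way, only with a finite-difference estimate of dvy/dx - dvx/dy in place of the density gradient.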

From the upper picture, it becomes clear that only a small fraction of the domain is represented at high resolution. By far the largest part is resolved by what amounts to only 160 x 256 grid cells globally, saving a great deal of computing time. The smallest subgrids, however, correspond to a global resolution of 10240 x 16384 cells.
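These numbers are consistent with a refinement ratio of 2 between successive levels, counting the 160 x 256 base grid as level 1 (an assumption that also matches the level numbering further below):

    # Effective global resolution per refinement level, assuming a refinement
    # ratio of 2 and counting the 160 x 256 base grid as level 1.
    base_nx, base_ny, ratio = 160, 256, 2
    for level in range(1, 8):
        print(f"level {level}: {base_nx * ratio**(level - 1)} x {base_ny * ratio**(level - 1)}")
    # -> level 7: 10240 x 16384, the finest resolution quoted above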

In the second picture, the adaptive aspect of the mesh refinement can be seen very clearly: with the growth and downward motion of the vortex structures and the corresponding formation of density gradients, high-resolution meshes are created where necessary (and removed when no longer needed). This cluster of refined grids follows the overall evolution.
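In a sketch, this create-and-remove cycle could look as follows: cells flagged by the refinement criterion above are clustered into fixed-size tiles of the current level, tiles containing flags get (or keep) a refined subgrid, and tiles without flags lose theirs. The tile shape is chosen so that, with a refinement ratio of 2, each tile becomes one 10 x 16 subgrid on the next level; the data structures are of course only assumptions of the sketch.

    def regrid(flags, tile_nx=5, tile_ny=8):
        """Return the tile indices of the current level that need a refined
        subgrid. With a refinement ratio of 2, each 5 x 8 tile of parent
        cells corresponds to one 10 x 16 subgrid on the next level."""
        needed = set()
        nx, ny = flags.shape
        for i in range(0, nx, tile_nx):
            for j in range(0, ny, tile_ny):
                if flags[i:i + tile_nx, j:j + tile_ny].any():
                    needed.add((i // tile_nx, j // tile_ny))
        return needed

    # Subgrids whose tile is no longer in 'needed' are removed; newly needed
    # tiles get a freshly created subgrid, filled by interpolation from the
    # parent level -- this is what lets the refinement follow the sinking
    # vortex structures.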

Even in the last picture, where the turbulent region has spread considerably, only one quarter of the entire domain is covered by grids of the highest resolution. As by far the most computation time is needed for the CWENO reconstruction and the flux computation, this is still a good trade-off compared to a simulation with one homogeneous grid - not to mention the computing time saved on the way to this stage.
At this stage, the domain is covered by 5011 separate grids, 4080 of which are the smallest "Level 7" grids (you might want to check that I've counted them correctly).
Also note that, because of the periodic boundary conditions, the structure re-enters the domain from the left.
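Taking the figures quoted above at face value, the saving can be estimated by simply counting the cells on which the reconstruction and flux computation have to be carried out (ignoring ghost cells and any difference in time-step size between the levels):

    # Cell count of the adaptive hierarchy vs. one homogeneous grid at the
    # finest resolution, using only the figures quoted above.
    cells_per_grid = 10 * 16             # every subgrid holds 10 x 16 cells
    amr_cells = 5011 * cells_per_grid    # all grids of the hierarchy together
    uniform_cells = 10240 * 16384        # single grid at level-7 resolution
    print(f"AMR hierarchy: {amr_cells:,} cells")
    print(f"uniform grid:  {uniform_cells:,} cells")
    print(f"roughly a factor {uniform_cells // amr_cells} fewer cells to reconstruct")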

And here's a zoom-in of the red rectangle region above (with re-scaled colours) - still resolved with approx. 130 x 100 cells and the 3rd-order CWENO scheme.


The MPEG movies below, taken from a run similar to the one described above, give a good impression of the temporal behaviour:
The temporal evolution of the mass density (colour scale blue to red, file size ca. 1 MByte)
As above, but with the grid coverage of the domain shown. Each grid consists of 16 x 28 cells (file size ca. 1 MByte).