PinCFlow.jl: An idealized-atmospheric-flow solver coupled to the 3D transient gravity-wave model MS-GWaM
Introduction
PinCFlow.jl (Pseudo-inCompressible Flow solver) is an atmospheric-flow solver that was designed for conducting idealized simulations. It integrates the Boussinesq, pseudo-incompressible and compressible equations in a conservative flux form (Klein, 2009; Rieper et al., 2013), using a semi-implicit method that combines explicit and implicit time-stepping schemes (Benacchio & Klein, 2019; Schmid et al., 2021; Chew et al., 2022). Spatially, the equations are discretized with a finite-volume method, such that all quantities are represented by averages over grid cells and fluxes are computed on the respective cell interfaces. The grid is staggered (Arakawa & Lamb, 1977) so that the velocity components are defined at the same points as the corresponding fluxes of scalar quantities. PinCFlow.jl operates in a vertically stretched terrain-following coordinate system based on Gal-Chen and Somerville (1975a), Gal-Chen and Somerville (1975b) and Clark (1977).
The Lagrangian gravity-wave parameterization MS-GWaM (Multi-Scale Gravity-Wave Model) is interactively coupled to the dynamical core of PinCFlow.jl, so that unresolved gravity waves may be parameterized in a manner that accounts for transient wave-mean-flow interaction and horizontal wave propagation. The resolved fields are updated with tendencies computed by MS-GWaM at the beginning of every time step. A description of the theory behind MS-GWaM can be found in Achatz et al. (2017) and Achatz et al. (2023). For a numerical perspective and more information on the development, see Muraschko et al. (2014), Boeloeni et al. (2016), Wilhelm et al. (2018), Wei et al. (2019) and Jochum et al. (2025).
User guide
Installation
To install PinCFlow.jl, first make sure you have installed Julia. You can then run
```shell
julia --project -e 'using Pkg; Pkg.add("PinCFlow")'
```
to add PinCFlow.jl to your current project environment.
Running the model
As a minimal example, the script
```julia
using PinCFlow
integrate(Namelists())
```
runs PinCFlow.jl in its default configuration, if executed with
```shell
julia --project script.jl
```
in your project's directory. This simulation finishes quickly and does not produce particularly interesting results, since PinCFlow.jl simply initializes a $1 \times 1 \times 1 \ \mathrm{km^3}$ isothermal atmosphere at rest with a single grid cell and integrates the pseudo-incompressible equations over one hour. A more complex configuration can be set up by providing namelists with changed parameters, as illustrated in PinCFlow.jl's example scripts. To run them, we recommend setting up an examples project by executing
```shell
julia --project=examples -e 'using Pkg; Pkg.add(["CairoMakie", "HDF5", "HDF5_jll", "MPI", "MPICH_jll", "MPIPreferences", "PinCFlow", "Revise"])'
```
Having done this, you can run any of the example scripts without worrying about missing packages. For instance, executing the script
```julia
# examples/scripts/periodic_hill.jl
using Pkg
Pkg.activate("examples")

using MPI
using HDF5
using CairoMakie
using Revise
using PinCFlow

# Number of MPI processes in x and z (optional command-line arguments).
npx = length(ARGS) >= 1 ? parse(Int, ARGS[1]) : 1
npz = length(ARGS) >= 2 ? parse(Int, ARGS[2]) : 1

# Hill height, hill length scale, domain height and sponge-layer base.
h0 = 500.0
l0 = 10000.0
lz = 20000.0
zr = 10000.0

# Boussinesq dynamics in a stably stratified atmosphere with a uniform
# initial zonal wind of 10 m/s and no rotation.
atmosphere = AtmosphereNamelist(;
    model = Boussinesq(),
    background = StableStratification(),
    coriolis_frequency = 0.0,
    initial_u = (x, y, z) -> 10.0,
)
domain = DomainNamelist(; x_size = 40, z_size = 40, lx = 20000.0, lz, npx, npz)

# Cosine-shaped periodic hill as resolved topography.
grid = GridNamelist(;
    resolved_topography = (x, y) -> h0 / 2 * (1 + cos(pi / l0 * x)),
)
output =
    OutputNamelist(; output_variables = (:w,), output_file = "periodic_hill.h5")

# Sponge layer damping the flow above z = zr.
sponge = SpongeNamelist(;
    rhs_sponge = (x, y, z, t, dt) ->
        z >= zr ? sin(pi / 2 * (z - zr) / (lz - zr))^2 / dt : 0.0,
)

integrate(Namelists(; atmosphere, domain, grid, output, sponge))

# Plot the vertical wind on the root process only.
if MPI.Comm_rank(MPI.COMM_WORLD) == 0
    h5open("periodic_hill.h5") do data
        plot_output(
            "examples/results/periodic_hill.svg",
            data,
            ("w", 1, 1, 1, 2),
        )
        return
    end
end
```
runs a 2D simulation with an initial wind of $10 \ \mathrm{m \ s^{- 1}}$ that generates a mountain wave above a periodic hill and visualizes the results.
PinCFlow.jl uses parallel HDF5 to write simulation data. By default, the path to the output file is pincflow_output.h5. This may be changed by setting the parameter output_file of the namelist output accordingly (as illustrated above). The dimensions of most output fields are (in order) $\widehat{x}$ (zonal axis), $\widehat{y}$ (meridional axis), $\widehat{z}$ (axis orthogonal to the vertical coordinate surfaces) and $t$ (time). Ray-volume-property fields differ slightly in that they have an additional (spectral) dimension in front and a vertical dimension that includes the first ghost layer below the surface. To specify which fields are to be written, set the parameters output_variables, save_ray_volumes and prepare_restart of the namelist output accordingly (more details are given in the "Reference" section of the documentation).
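As a sketch, the output written by the periodic-hill example above can be inspected with HDF5.jl. This assumes the simulation has already been run and that the dataset name matches the entry in output_variables; both the file and dataset names are taken from the example script.

```julia
using HDF5

# Sketch: inspect the vertical-wind output of the periodic-hill example.
# Assumes periodic_hill.h5 exists and contains a dataset named "w" whose
# dimensions are ordered (x, y, z, t), as described above.
h5open("periodic_hill.h5") do data
    w = read(data, "w")
    println(size(w))  # e.g. (40, 1, 40, nt) for the 40 x 1 x 40 domain above
end
```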
For the visualization of simulation results, we recommend using Makie.jl with the CairoMakie backend. PinCFlow.jl has an extension which exports a few convenience functions if CairoMakie is loaded. This is utilized in the above script, yielding a plot of the vertical wind at the end of the simulation. You can find more examples on the "Examples" page of the documentation. A description of all namelists and their parameters is provided in the "Reference" section.
If you want to run PinCFlow.jl in parallel, make sure you are using the correct backends for MPI.jl and HDF5.jl. By default, the two packages use JLL backends that have been automatically installed. If you want to keep this setting, you only need to make sure to use the correct MPI binary (specifically not that of a default MPI installation on your system). For example, with
```shell
mpiexec=$(julia --project=examples -e 'using MPICH_jll; println(MPICH_jll.mpiexec_path)')
${mpiexec} -n 9 julia examples/scripts/periodic_hill.jl 3 3
```
you can run the above simulation with 9 MPI processes. Note that by passing extra arguments to the script, you set the parameters npx and npz of the namelist domain, which represent the numbers of MPI processes in $\widehat{x}$ and $\widehat{z}$. Their product must equal the total number of processes; otherwise, PinCFlow.jl will throw an error.
However, if you plan to run PinCFlow.jl on a cluster, you may want to consider using a provided MPI installation as backend. In that case, the MPI preferences need to be updated accordingly and the HDF5 backend has to be set to a library that has been installed with parallel support, using the chosen MPI installation. This can be done by running
```shell
julia --project=examples -e 'using MPIPreferences; MPIPreferences.use_system_binary(; library_names = ["/path/to/mpi/library/"])'
julia --project=examples -e 'using HDF5; HDF5.API.set_libraries!("/path/to/libhdf5.so", "/path/to/libhdf5_hl.so")'
```
with the paths set appropriately (more details can be found in the documentation of MPI.jl and HDF5.jl). Note that this configuration will be saved in examples/LocalPreferences.toml, so that the new backends will be used by all future scripts run in the examples project. By running
```shell
julia --project=examples -e 'using MPIPreferences; MPIPreferences.use_jll_binary()'
julia --project=examples -e 'using HDF5; HDF5.API.set_libraries!()'
```
you can restore the default backends. Having configured MPI.jl and HDF5.jl to use installations on your system, you can run
```shell
mpiexec -n 16 julia examples/scripts/periodic_hill.jl 4 4
```
with mpiexec being your chosen system binary. For users who would like to run PinCFlow.jl on Goethe or Levante, shell-script examples are provided in the folder examples/scripts of the repository.
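On such clusters, the model is typically launched through a batch script. The following is a hypothetical configuration sketch assuming a SLURM scheduler; the job name, task count, and time limit are placeholders to adapt, and it presumes the MPI and HDF5 backends have been configured as described above (the provided scripts in examples/scripts remain the authoritative reference).

```shell
#!/bin/bash
#SBATCH --job-name=periodic_hill
#SBATCH --ntasks=16
#SBATCH --time=00:30:00
# Hypothetical SLURM sketch; adjust partition, account, and module loads
# for your cluster before use.

# Launch 16 MPI processes, decomposed as 4 x 4 (npx = npz = 4).
srun julia examples/scripts/periodic_hill.jl 4 4
```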
List of publications
- Initial flow solver: Rieper et al. (2013)
- Initial gravity-wave scheme: Muraschko et al. (2014)
- Gravity-wave breaking scheme: Boeloeni et al. (2016)
- Gravity-wave theory: Achatz et al. (2017)
- Coupling of the flow solver and gravity-wave scheme: Wilhelm et al. (2018)
- Horizontal propagation and direct approach in the gravity-wave scheme: Wei et al. (2019)
- Semi-implicit time scheme: Schmid et al. (2021)
- Extended gravity-wave theory: Achatz et al. (2023)
- Terrain-following coordinates & orographic source: Jochum et al. (2025)