Special Considerations for FV Physics Grids
This page details exceptions to the step-by-step grid setup instructions when using the finite volume (FV) physics grid (a.k.a. "physgrid", pg2, pg3). When generating a new model grid the workflow presented in the step-by-step guide on the parent page is still relevant, but there are several deviations from those steps that are documented here. If the physgrid becomes the default configuration for E3SM then we will likely just merge these notes into the parent page.
Physgrid Description
The FV physics grid (physgrid) is constructed by dividing each quadrilateral element of the cubed sphere into equal subdivisions, typically 2x2 (pg2) or 3x3 (pg3). This differs from the "spectral element" method of the GLL grid, in which points within the cubed-sphere elements are nodes of continuous polynomial basis functions. When running the model with the physgrid, the dynamics calculations (e.g. advection) are still solved on the GLL grid, but the state is mapped to the physgrid for the physics calculations (e.g. clouds and radiation), and the physics tendencies are then mapped back to the GLL grid. Model output is mainly on the physgrid, but certain quantities can also be output on the GLL grid for specialized diagnostics.
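As a concrete example of the column counts involved, an ne30pg2 grid has 6 x 30^2 = 5,400 elements, each divided into 2x2 cells, for a total of 6 x 30^2 x 2^2 = 21,600 physics columns, while the corresponding ne30np4 GLL grid has 6 x 30^2 x (4-1)^2 + 2 = 48,602 unique node points.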
Generating New Grid Files
TempestRemap is still used to generate the new grid "exodus" file, but an extra step is needed for the physgrid. For a typical GLL grid the exodus mesh file defines only the vertices of the elements, without any information about the internal node structure defined by the basis functions. For example, if we generate an exodus file for the "ne30" grid we get a cubed-sphere grid in which each cube face has been divided into 30x30 elements. If we wish to use this file for mapping between a GLL grid and some other grid with TempestRemap, we need to indicate the number of nodes that define the basis functions within each element (typically 4x4, or "np4").
To adapt this to the physgrid we need an additional command to subdivide these elements into the desired number of FV cells. Similarly, when mapping between grids we need to indicate to TempestRemap that these "elements" are actually finite volume cells by using a special flag (see the mapping section below).
So if we want an ne30pg2 file, and we initially created an ne30 exodus file with this command:
${tempest_root}/bin/GenerateCSMesh --alt --res 30 --file ${output_root}/ne30.g
Then we would simply use the following command to evenly subdivide each element into 2x2=4 cells:
${tempest_root}/bin/GenerateVolumetricMesh --in ${output_root}/ne30.g --out ${output_root}/ne30pg2.g --np 2 --uniform
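If a pg3 grid were wanted instead, the same command with --np 3 should produce the 3x3 subdivision, along these lines:
${tempest_root}/bin/GenerateVolumetricMesh --in ${output_root}/ne30.g --out ${output_root}/ne30pg3.g --np 3 --uniform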
Mapping Files
The "ncremap -P mwf" procedure encapsulates several commands to generate all the required map files for a run without the physgrid, but since these commands are specific to using the spectral element grid, we need different commands when using the physgrid. We can implement a comparable procedure in ncremap if we want by submitting a PR to the NCO repository.
The example below shows the commands needed to generate all mapping files for a tri-grid configuration with the atmosphere on the ne30pg2 grid. Note that this example, which wraps TempestRemap with ncremap, assumes the default TempestRemap mapping algorithm for all of the maps. For more control over this, and to use the latest recommended maps between each pair of components, see Recommended Mapping Procedures for E3SM Atmosphere Grids:
atm_grid_file=ne30pg2.g
ocn_grid_file=ocean.oEC60to30v3.scrip.181106.nc
lnd_grid_file=SCRIPgrid_0.5x0.5_nomask_c110308.nc
atm_name=ne30pg2
ocn_name=oEC60to30v3
lnd_name=r05
alg_name=mono
map_opts='--in_type fv --in_np 1 --out_type fv --out_np 1 --out_format Classic'
date=200110
ncremap -a tempest --src_grd=$ocn_grid_file --dst_grd=$atm_grid_file -m map_${ocn_name}_to_${atm_name}_${alg_name}.${date}.nc -W "$map_opts"
ncremap -a tempest --src_grd=$atm_grid_file --dst_grd=$ocn_grid_file -m map_${atm_name}_to_${ocn_name}_${alg_name}.${date}.nc -W "$map_opts" --a2o
ncremap -a tempest --src_grd=$lnd_grid_file --dst_grd=$atm_grid_file -m map_${lnd_name}_to_${atm_name}_${alg_name}.${date}.nc -W "$map_opts"
ncremap -a tempest --src_grd=$atm_grid_file --dst_grd=$lnd_grid_file -m map_${atm_name}_to_${lnd_name}_${alg_name}.${date}.nc -W "$map_opts"
ncremap -a tempest --src_grd=$lnd_grid_file --dst_grd=$ocn_grid_file -m map_${lnd_name}_to_${ocn_name}_${alg_name}.${date}.nc -W "$map_opts" --a2o
ncremap -a tempest --src_grd=$ocn_grid_file --dst_grd=$lnd_grid_file -m map_${ocn_name}_to_${lnd_name}_${alg_name}.${date}.nc -W "$map_opts"
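To spot-check any of the resulting maps, they can simply be applied to a data file with ncremap; for example (the input and output file names here are just placeholders):
ncremap -m map_${atm_name}_to_${ocn_name}_${alg_name}.${date}.nc input_on_ne30pg2.nc output_on_oEC60to30v3.nc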
The paths to these files then need to be added to the appropriate section of cime/config/e3sm/config_grids.xml.
Domain Files
The procedure for generating domain files is unchanged, except that the physgrid grid and mapping files described above must be used. This is because the physics grid is what the atmosphere uses to communicate with the coupler, which is primarily where the domain files are needed.
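As a rough sketch, assuming the standard gen_domain tool from CIME and the ocean-to-atmosphere map generated above (the map file name is illustrative), the command would look something like:
cd ${e3sm_root}/cime/tools/mapping/gen_domain_files
./gen_domain -m map_oEC60to30v3_to_ne30pg2_mono.200110.nc -o oEC60to30v3 -l ne30pg2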
Generating a Scrip File
For mapping GLL grids without TempestRemap (or for plotting GLL grid data) we need to "re-interpret" the data on the GLL grid as a finite volume grid. Step #2 in the step-by-step guide discusses how to do this with the "dual grid" approach. It's worth noting that this is not a visually accurate representation of the data, but the areas of the FV cells produced by the dual grid are consistent with the GLL weights, so spatial sums and averages can be computed in an intuitive way. Another caveat of the dual grid method is that generating the scrip grid description file requires an iterative process that can take a very long time for large grids.
An advantage of the physgrid is that we can avoid this complication entirely, because the data are naturally represented as finite volumes. TempestRemap provides a command that quickly converts an exodus file into a scrip file:
${tempest_root}/bin/ConvertExodusToSCRIP --in ne30pg2.g --out ne30pg2_scrip.nc
The resulting scrip file can be used for mapping with ESMF or plotting physgrid data on the native grid.
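For example, a conservative map from ne30pg2 to another grid described by a scrip file could be generated with ncremap's ESMF-based algorithms along these lines (the destination grid file and map name are placeholders):
ncremap -a conserve --src_grd=ne30pg2_scrip.nc --dst_grd=${dst_scrip_file} -m map_ne30pg2_to_dst_aave.nc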
Topography
Several modifications to the topography generation are needed to support pg2 grids. Complete details are given in the V2 (and later) instructions under Atmospheric Topography Generation.
Regional Refinement
Regionally refined grids require no special steps when using the physgrid, because the regional refinement happens at the element level and the physgrid only changes the arrangement of physics columns within each element. Note, however, that for RRM grids the input namelist for homme_tool will need to specify ne = 0 and provide a mesh_file.
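As a minimal sketch, assuming the standard ctl_nl namelist group used by homme_tool (the mesh file path is a placeholder), the relevant settings would look something like:
&ctl_nl
  ne = 0
  mesh_file = '/path/to/rrm_exodus_file.g'
/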
Steps to Interpolate Between Grids
There's nothing special about these steps; this is just a basic example in case someone needs to convert data between grids for comparison purposes. In this particular example we interpolate from ne30np4 to ne30pg3.
${tempest_root}/bin/GenerateOverlapMesh --a ${grid_data_root}/ne30.g --b ${grid_data_root}/ne30pg3.g --out ${grid_data_root}/tmp_overlap_mesh.nc
${tempest_root}/bin/GenerateOfflineMap --in_mesh ${grid_data_root}/ne30.g --out_mesh ${grid_data_root}/ne30pg3.g --ov_mesh ${grid_data_root}/tmp_overlap_mesh.nc --in_type cgll --in_np 4 --out_type fv --out_np 1 --out_double --mono --out_map ${grid_data_root}/tmp_mapping_weights.nc
ncremap -4 -m ${grid_data_root}/tmp_mapping_weights.nc ${input_file_name} ${output_file_name}
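A couple of notes on these commands: the --in_type cgll --in_np 4 options tell TempestRemap that the source data live on the continuous GLL (np4) grid, while --out_type fv --out_np 1 describes the physgrid cells, and the -4 option to ncremap requests netCDF4 output. The tmp_ overlap mesh and weight files can be removed once the remapped output has been produced.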