...

In order to adapt this to the physgrid we need to use an additional command to subdivide these elements into the number of FV cells we want. One difference from the example given above is that when mapping between grids we need to indicate to TempestRemap that these "elements" are actually finite-volume cells with a special flag (see the mapping section below).

So if we want an ne30pg2 file, and we initially created an ne30 exodus file with this command:

${tempest_root}/bin/GenerateCSMesh --alt --res 30 --file ${output_root}/ne30.g

...

${tempest_root}/bin/GenerateVolumetricMesh --in ${output_root}/ne30.g --out ${output_root}/ne30pg2.g --np 2 --uniform
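As a sanity check on what this produces: each of the 6·ne² spectral elements is subdivided into pg² FV cells, so the total number of physics columns can be computed directly (a quick shell sketch; ne and the pg level are the only inputs):

```shell
# Number of physics columns on an ne30pg2 grid:
# 6 cube faces x ne^2 elements per face x pg^2 FV cells per element.
ne=30
pg=2
ncol=$((6 * ne * ne * pg * pg))
echo "$ncol"   # 21600 for ne30pg2
```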

Mapping Files

The "ncremap -P mwf" procedure encapsulates several commands to generate all the required map files for a run without the physgrid, but since those commands are specific to the spectral element grid, different commands are needed when using the physgrid. A comparable physgrid procedure could be added to ncremap by submitting a PR to the NCO repository.

The example below shows the commands needed to generate all mapping files for a tri-grid configuration with the atmosphere on the ne30pg2 grid. Note that this example of wrapping TempestRemap with ncremap assumes the default TempestRemap mapping algorithm. For more control, and for the latest recommended maps between each pair of components, see Recommended Mapping Procedures for E3SM Atmosphere Grids:

atm_grid_file=ne30pg2.g
ocn_grid_file=ocean.oEC60to30v3.scrip.181106.nc
lnd_grid_file=SCRIPgrid_0.5x0.5_nomask_c110308.nc
atm_name=ne30pg2
ocn_name=oEC60to30v3
lnd_name=r05
alg_name=mono

map_opts='--in_type fv --in_np 1 --out_type fv --out_np 1 --out_format Classic'
date=200110
ncremap -a tempest --src_grd=$ocn_grid_file --dst_grd=$atm_grid_file -m map_${ocn_name}_to_${atm_name}_${alg_name}.${date}.nc -W "$map_opts"
ncremap -a tempest --src_grd=$atm_grid_file --dst_grd=$ocn_grid_file -m map_${atm_name}_to_${ocn_name}_${alg_name}.${date}.nc -W "$map_opts" --a2o
ncremap -a tempest --src_grd=$lnd_grid_file --dst_grd=$atm_grid_file -m map_${lnd_name}_to_${atm_name}_${alg_name}.${date}.nc -W "$map_opts"
ncremap -a tempest --src_grd=$atm_grid_file --dst_grd=$lnd_grid_file -m map_${atm_name}_to_${lnd_name}_${alg_name}.${date}.nc -W "$map_opts"
ncremap -a tempest --src_grd=$lnd_grid_file --dst_grd=$ocn_grid_file -m map_${lnd_name}_to_${ocn_name}_${alg_name}.${date}.nc -W "$map_opts" --a2o
ncremap -a tempest --src_grd=$ocn_grid_file --dst_grd=$lnd_grid_file -m map_${ocn_name}_to_${lnd_name}_${alg_name}.${date}.nc -W "$map_opts"
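The six calls above follow a fixed naming pattern, so for larger multi-grid setups it can be less error-prone to generate the map-file names from a list of source:destination pairs. A sketch using the grid names above (this only prints the file names; it does not run ncremap):

```shell
date=200110
alg_name=mono
# source:destination pairs for the tri-grid configuration above
pairs="oEC60to30v3:ne30pg2 ne30pg2:oEC60to30v3 r05:ne30pg2 ne30pg2:r05 r05:oEC60to30v3 oEC60to30v3:r05"
for p in $pairs; do
  src=${p%%:*}
  dst=${p##*:}
  echo "map_${src}_to_${dst}_${alg_name}.${date}.nc"
done
```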


The path to these files needs to be added to the appropriate section of cime/config/e3sm/config_grids.xml.
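For reference, a gridmap entry in config_grids.xml looks roughly like the following. This is a sketch only: the element and map-name keys follow the existing entries in that file, but the grid alias and paths shown here are illustrative and should point at wherever the new map files are installed in the inputdata tree:

```xml
<gridmap atm_grid="ne30np4.pg2" ocn_grid="oEC60to30v3">
  <map name="ATM2OCN_FMAPNAME">cpl/gridmaps/ne30pg2/map_ne30pg2_to_oEC60to30v3_mono.200110.nc</map>
  <map name="OCN2ATM_FMAPNAME">cpl/gridmaps/oEC60to30v3/map_oEC60to30v3_to_ne30pg2_mono.200110.nc</map>
</gridmap>
```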

Domain Files

The procedure for generating domain files is unchanged, except that the physgrid grid and mapping files described above must be used. The physics grid is what communicates with the coupler, which is primarily where the domain files are needed.
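Domain files are generated with the gen_domain tool from CIME (built under cime/tools/mapping/gen_domain_files). As a sketch, with an illustrative map-file name (it must be the conservative ocean-to-atmosphere map generated above), the command is assembled and printed here rather than executed:

```shell
# Build the gen_domain invocation for the pg2 case (dry run: echo only).
map_file=map_oEC60to30v3_to_ne30pg2_mono.200110.nc  # illustrative name
ocn_name=oEC60to30v3
atm_name=ne30pg2
cmd="gen_domain -m ${map_file} -o ${ocn_name} -l ${atm_name}"
echo "$cmd"
```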

Generating a Scrip File

For mapping GLL grids without TempestRemap (or plotting GLL grid data) we need to "re-interpret" the representation of data on the GLL grid as a finite-volume grid. Step #2 in the step-by-step guide discusses how to do this with the "dual grid" approach. It's worth noting that this is not a visually accurate representation of the data, but the areas of the FV cells produced by the dual grid are consistent with the GLL weights, so spatial sums and averages can be computed in an intuitive way. Another caveat of the dual-grid method is that generating the scrip grid description file requires an iterative process that can take a very long time for large grids.

...

${tempest_root}/bin/ConvertExodusToSCRIP --in ne30pg2.g --out ne30pg2_scrip.nc

The resulting scrip file can be used for mapping with ESMF or plotting physgrid data on the native grid. 

Mapping Files

First, generate the overlap mesh between the source grid and the physgrid mesh:

${tempest_root}/bin/GenerateOverlapMesh --a ${source_mesh_file} --b ${output_root}/ne30pg3.g --out overlap_mesh.nc

Then generate the mapping file from the overlap mesh, indicating that both the input and output grids should be treated as finite-volume grids (np = 1):

${tempest_root}/bin/GenerateOfflineMap --in_mesh ${output_root}/ne30pg3.g --out_mesh ${source_mesh_file}  --ov_mesh overlap_mesh.nc --out_map ${output_map_file} --in_type fv --in_np 1 --out_type fv --out_np 1 --out_double --mono --volumetric

???

Domain Files

The procedure for generating domain files is unchanged, except that the physgrid grid and mapping files described above must be used. The physics grid is what communicates with the coupler, which is primarily where the domain files are needed.

Topography

Several modifications to the topography generation are needed to support PG2 grids.  Complete details are given in the V2 (and later) instructions under

Atmospheric Topography Generation

Regional Refinement

Regionally refined grids require no special steps when using the physgrid, because regional refinement happens at the element level and the physgrid only changes the physics column arrangement within each element. Note, however, that the homme_tool input namelist will need to specify ne = 0 and a mesh_file for RRM grids.
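For example, the relevant part of a homme_tool namelist for an RRM grid might look like the following (a sketch; the ctl_nl group and variable names follow HOMME conventions, and the mesh-file path is illustrative):

```
&ctl_nl
  ne = 0                                ! 0 signals an unstructured (RRM) mesh
  mesh_file = '/path/to/my_rrm_grid.g'  ! exodus mesh file for the RRM grid
/
```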

Steps to Interpolate

...

Between Grids

There's nothing special about these steps; this is just a basic example in case someone needs to convert between grids for comparison purposes.

In this particular example I'm interpolating from ne30np4 to ne30pg3 using the default initial condition file for an aquaplanet case.

${tempest_root}/bin/GenerateOverlapMesh --a ${grid_data_root}/ne30.g --b ${grid_data_root}/ne30pg3.g --out ${grid_output_root}/tmp_overlap_mesh.nc

...

ncremap -4 -m ${grid_data_root}/tmp_mapping_weights.nc ${input_file_name} ${output_file_name}