The purpose of this page is to document the procedure for adding support for new atmosphere grids. The process should be the same for new uniform resolutions as well as for new regionally-refined meshes, although some settings will need to be changed for new regionally-refined mesh configurations. This page is a work in progress, and will be updated as this process is refined and (eventually) made more automated. This documentation is an update of a document written by Mark Taylor and Colin Zarzycki, available as a Google Doc here.

...

Mapping files used by the coupler for mapping fluxes between SE "np4" and FV grids should be generated with TempestRemap and only need the exodus grid description (which provides the locations of the corners of the quadrilaterals that form the elements of the cube-sphere mesh) generated in Step 1 above. However, a handful of pre- and post-processing tools require a finite-volume equivalent of the spectral element grid (these tools include the surface roughness calculation in the land tool cube_to_target, the ESMF mapping files used for interpolating land surface data to the target grid, and the post-processing regional and subgrid remapping tools). We refer to this finite-volume description of the SE grid as the "dual grid" to the SE grid (see the page describing atmosphere grids in more detail here).
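
As an illustration (a sketch only, not the exact commands used in this workflow), TempestRemap's standard command-line utilities can build such a map directly from the exodus file. The grid file names below are placeholders, and the FV target grid description is assumed to already exist.

Code Block
# Sketch only: GenerateOverlapMesh and GenerateOfflineMap are TempestRemap utilities
# assumed to be on PATH.  ne30.g is the exodus file from Step 1; fv_target.g is a
# placeholder FV grid description.
GenerateOverlapMesh --a ne30.g --b fv_target.g --out overlap_ne30_fv.g
GenerateOfflineMap --in_mesh ne30.g --out_mesh fv_target.g --ov_mesh overlap_ne30_fv.g \
  --in_type cgll --in_np 4 --out_type fv --out_map map_ne30np4_to_fv.nc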

The dual grid is generated using separate tools that draw polygons around the GLL nodes and optionally optimize the area contained within each polygon to match the GLL weights. The resulting grid is saved in netCDF files that contain the grid point locations, area weights, and areas associated with each nodal value. In the SE dycore, this data depends on the reference element map (how the finite elements are mapped into the unit square reference element). The SE dycore supports two such reference element maps: the older "gnomonic equal angle" map and an "element-local" map. RRM grids are required to use the "element-local" map. Cubed-sphere grids can use either map (chosen via the cubed_sphere_map namelist option), with the existing compsets in E3SM v1 all using the gnomonic equal angle map. For E3SM v2, all new grids (cubed-sphere and RRM) should use the element-local map.

There are three separate codes that can create a dual grid from our SE grids:

  1. Matlab code. This code produces the best SCRIP files, with nice (non-star-shaped) control volumes around each GLL node and exact areas (spherical area = GLL weight). But the serial code is painfully slow and thus is only practical for grids with up to 100,000 elements. This code only supports the element-local map. Noel Keen profiled the Matlab script and found that, at least for the ne30 case, ~90% of the time is spent computing the area of a polygon, which is called millions of times; there may be simple optimizations to try short of a full rewrite.

  2. Fortran code. This utility is included with standalone HOMME. It runs quickly, in parallel, and produces exact GLL areas for cubed-sphere meshes. It supports both element maps via namelist options. But for RRM meshes it produces suboptimal control volumes (including star-shaped control volumes) with only approximate areas (spherical area ≠ GLL weight). The resulting RRM SCRIP file is thus not acceptable for generating mapping files for coupled simulations, but it should be fine for generating surface roughness fields.

  3. NCAR utility.  Will update when we find out more about this.  

For the majority of use cases, optimizing the area and shape of each polygon in the dual grid is probably not necessary, and a simple (and fast) approach can be taken using the Fortran code (plus some NCL scripts), which is packaged with HOMME. For conservative remapping, optimal areas that exactly match the GLL weights probably are required, and the slower Matlab code will most likely need to be used. (NOTE: we plan to rewrite the Matlab code in a compiled language so that it can handle higher-resolution grids, but for the time being we will work with the less accurate Fortran/NCL codes for the few use cases that still need these SCRIP files.)

The _latlon file is a separate grid descriptor file that contains just a simple list of coordinates of all GLL nodes and is easy to produce. Either the Fortran+NCL or the Matlab utilities can produce this file.
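
For orientation, both descriptor files are plain netCDF and can be inspected with ncdump; the SCRIP file follows the standard SCRIP conventions. A quick sanity check (the file name below is a placeholder) is:

Code Block
# Sketch: dump the header of a SCRIP grid descriptor (placeholder file name).
# A standard SCRIP file has grid_size/grid_corners/grid_rank dimensions and
# grid_center_lat, grid_center_lon, grid_corner_lat, grid_corner_lon,
# grid_imask, and (optionally) grid_area variables.
ncdump -h ne30np4_scrip.nc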

The Fortran Code 

To run the Fortran code on an E3SM-supported platform (updated 2020/5/25 with the updated tool from https://github.com/E3SM-Project/E3SM/pull/3593):

  1. Be sure your environment matches the software environment loaded by E3SM by executing the output of this command:   e3sm/cime/scripts/Tools/get_case_env

  2. Use cmake to configure and compile standalone HOMME. On a supported platform with the CIME environment, this should work out-of-the-box. See e3sm/components/homme/README.cmake

  3. Compile the HOMME tool utility:

    1. cd /path/to/workingdir 

    2. make -j4 homme_tool

    3. executable:   /path/to/workingdir/src/tool/homme_tool

  4. Edit e3sm/components/homme/test/tool/namelist/template.nl and specify the grid resolution or RRM file

    1. For ne512, this would be ne = 512. For RRM grids, leave ne = 0, but you will need to edit the namelist entry that points to the exodus grid file.

    2. For non-RRM grids using the older E3SM v1 dycore, add cubed_sphere_map=0 to template.nl

  5. See e3sm/components/homme/test/tool/test.job for examples of how to run the script and then use the NCL utilities to process the tool output into SCRIP and latlon formats. The steps are condensed into a sketch below.
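
The steps above, condensed into a single sketch (all paths are placeholders, the machine file name is an assumption, and the exact homme_tool invocation should be taken from test.job):

Code Block
# 1. Load the E3SM software environment by executing the output of get_case_env
eval $(/path/to/e3sm/cime/scripts/Tools/get_case_env)

# 2. Configure standalone HOMME in a working (build) directory
#    (see e3sm/components/homme/README.cmake for the machine file to use)
cd /path/to/workingdir
cmake -C /path/to/e3sm/components/homme/cmake/machineFiles/<machine>.cmake \
      /path/to/e3sm/components/homme

# 3. Build the tool; the executable ends up in src/tool/homme_tool
make -j4 homme_tool

# 4. Copy and edit the namelist (set ne, or point to an RRM exodus file and leave ne = 0)
cp /path/to/e3sm/components/homme/test/tool/namelist/template.nl .

# 5. Run the tool and post-process with the NCL utilities
#    (invocation form is an assumption; follow test.job for your machine)
srun -n 8 ./src/tool/homme_tool < template.nl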

Specific details for running at NERSC on Cori (KNL):

  1. Create a batch script and change "account" in the sbatch directives at the top of the script; for example, set #SBATCH --account=e3sm. (A sketch of such a batch script follows this list.)

  2. cmake -C /path/to/e3sm/components/homme/cmake/machineFiles/cori-knl.cmake  -DPREQX_NP=4 /path/to/workingdir

  3. Make sure a working NCL is in your PATH. On Cori, add the following to the script: module load ncl.
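
For illustration, a minimal Cori (KNL) batch script that runs homme_tool directly might look like the following (a sketch only: the node count, time limit, and invocation are assumptions; see test.job for the exact commands, and note that the makegrid.job workflow described next drives the build and run itself):

Code Block
#!/bin/bash
#SBATCH --account=e3sm
#SBATCH --constraint=knl
#SBATCH --nodes=1
#SBATCH --time=00:30:00

# Load the E3SM environment and NCL (needed for the SCRIP/latlon post-processing)
eval $(/path/to/e3sm/cime/scripts/Tools/get_case_env)
module load ncl

# Run the tool from the working (build) directory; invocation is an assumption,
# see e3sm/components/homme/test/tool/test.job for the exact form
cd /path/to/workingdir
srun -n 8 ./src/tool/homme_tool < template.nl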

With these changes, run makegrid.job TWICE to build and then run the code. The first pass builds HOMME, and the second pass runs HOMME and the NCL utilities. In an HPC environment, the second pass needs to be submitted to the batch queue via sbatch -C knl makegrid.job. This will create the grid descriptor files in ${HOME}/scratch1/preqx/template.

The Matlab Code

The tool chain described below only supports the element-local map. As noted above, this method produces the highest-quality output but is extremely slow (days to weeks) to run. Our preferred mapping algorithms (Transition to TempestRemap for Atmosphere grids) also only support the element-local map.   

Creating the grid descriptor files for RRM grids and cubed-sphere grids using the element-local map is accomplished with a Matlab code written for this purpose, available in the PreAndPostProcessingScripts repo. First, check out the code:

Code Block
# Checkout code
git clone git@github.com:E3SM-Project/PreAndPostProcessingScripts.git

# Set variable so we can refer to this directly below
PreAndPostProcessingScripts=${PWD}/PreAndPostProcessingScripts

Now we can run the code by executing the run_dualgrid.sh script from the ${PreAndPostProcessingScripts}/regridding/spectral_elements_grid_utilities/compute_dualgrid directory. Note that a working Matlab environment is required to run the code. On NERSC, this can be accomplished by simply executing module load matlab before running the run_dualgrid.sh script.

Code Block
# Inputs: the grid name and the exodus mesh file from Step 1
# (${output_root} is assumed to point at the directory holding your grid files)
grid_name=ne4
mesh_file=${output_root}/ne4.g

# A working Matlab is required; on NERSC it is available as a module
module load matlab
cd ${PreAndPostProcessingScripts}/regridding/spectral_elements_grid_utilities/compute_dualgrid
./run_dualgrid.sh ${grid_name} ${mesh_file}

This will create two files in the compute_dualgrid directory: a "latlon" and a "scrip" grid descriptor file. Move these to wherever you want to keep your grid files:

...

Generate SCRIP “dual grid” with homme_tool:

Follow the steps described in "The Fortran Code" section above: build homme_tool, edit template.nl for your target grid, and use test.job to run the tool and the NCL post-processing that produces the SCRIP and latlon files.

3. Generate mapping files

...

...