
Warning

  NERSC Directory Change Notice  

Due to the project's name change at NERSC from 'ACME' to 'E3SM' and NERSC's file system update, the directory '/project/projectdirs/acme/' is now '/cfs/cdirs/e3sm'.

...

TempestRemap needs to be built from source from the GitHub repository. This is straightforward on Cori and Edison. Note that a parallel version of netCDF should be used. This can be accomplished on Cori/Edison by executing module swap cray-netcdf cray-netcdf-hdf5parallel, or by sourcing an env_mach_specific.sh from a working case on the target machine before building.
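Then, as a rough sketch of the fetch-and-build sequence (the clone URL points at the upstream TempestRemap repository; the exact make invocation may differ by TempestRemap version, so consult its README):

Code Block
# swap in parallel netCDF first (Cori/Edison), as noted above
module swap cray-netcdf cray-netcdf-hdf5parallel
# clone and build TempestRemap (build details may differ by version; see its README)
git clone https://github.com/ClimateGlobalChange/tempestremap.git
cd tempestremap
make -j4                     # executables (e.g. GenerateCSMesh, ConvertExodusToSCRIP) land in ./bin
export tempest_root=$PWD     # referenced as ${tempest_root} in the commands below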

...

The Exodus file contains only information about the position of the spectral elements on the sphere. SE-aware utilities such as TempestRemap can use the polynomial order and the reference element map to fill in the necessary data, such as the locations of the nodal GLL points. Non-SE-aware utilities need additional metadata, described in the next section.
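For example, you can confirm this by dumping the header of an exodus mesh file (here the ne4.g file used later on this page; ncdump ships with netCDF): it lists only element-corner coordinates and connectivity, with no GLL node locations.

Code Block
# print only the header (dimensions and variables) of the exodus mesh file
ncdump -h ne4.g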

...

2A. Generate "dual grid" mesh files

...

for E3SM v2 "pg2" grids 

Requirements:

  • exodus mesh file
  • TempestRemap

In E3SM v2, we will be switching to running physics on an FV pg2 grid, and mapping files used by the coupler for mapping fluxes between components will be generated with TempestRemap; these only need the exodus grid description (which provides the locations of the corners of the quadrilaterals that form the elements of the cube-sphere mesh) generated in Step 1 above. However, a handful of pre- and post-processing tools require a finite-volume equivalent of the spectral element grid (these tools include the surface roughness calculation in the land tool cube_to_target, ESMF mapping files used for interpolating land surface data to the target grid, and post-processing regional and subgrid remapping tools). We refer to this finite-volume description of the SE grid as the "dual grid" to the SE grid (see the page describing atmosphere grids in more detail here).


In E3SM v2, mapping files between the atmosphere and other components will be FV-to-FV type maps using the pg2 grids. These can be generated by TempestRemap directly from the exodus mesh file:

Code Block
# create the FV pg2 mesh (np=2 control volumes per element) from the exodus mesh
${tempest_root}/bin/GenerateVolumetricMesh --in ne4.g --out ne4pg2.g --np 2 --uniform
# convert the pg2 exodus mesh to SCRIP format for use by other tools
${tempest_root}/bin/ConvertExodusToSCRIP --in ne4pg2.g --out ne4pg2.scrip.nc


2B. Generate "dual grid" mesh files (SCRIP and lat/lon format) for E3SM v1 "np4" GLL grids

Requirements:

  • exodus mesh file
  • Matlab or Fortran utility.

Note: in E3SM v2, we will be switching to running physics on an FV pg2 grid, and mapping files between the atmosphere and other components will be FV-to-FV type maps using the pg2 grids. The spectral element "np4" grid is still used internally by the dynamics and for initial conditions, so the metadata described in this section is still needed for some analysis and for the preparation of initial conditions.

Mapping files used by the coupler for mapping fluxes between SE "np4" and FV grids should be generated with TempestRemap and only need the exodus grid description (which provides the locations of the corners of the quadrilaterals that form the elements of the cube-sphere mesh) generated in Step 1 above. However, a handful of pre- and post-processing tools require a finite-volume equivalent of the spectral element grid (these tools include the surface roughness calculation in the land tool cube_to_target, ESMF mapping files used for interpolating land surface data to the target grid, and post-processing regional and subgrid remapping tools). We refer to this finite-volume description of the SE grid as the "dual grid" to the SE grid (see the page describing atmosphere grids in more detail here).

The dual grid is generated using separate tools that draw polygons around the GLL nodes, and optionally optimize the area contained within each polygon to match the GLL weights. The resulting grid is saved in netCDF files that contain the grid point locations, area weights, and areas associated with each nodal value. In the SE dycore, this data depends on the reference element map (how the finite elements are mapped into the unit square reference element). The SE dycore supports two such reference element maps: the older "gnomonic equal-angle" map and an "element-local" map. RRM grids are required to use the "element-local" map. Cubed-sphere grids can use either map (chosen via a namelist option), with the existing compsets in E3SM v1 all using the gnomonic equal-angle map. For E3SM v2, all new grids (cubed-sphere and RRM) should use the element-local map.

There are three separate codes that can create a dual grid from our SE grids:

  1. Matlab code.  This code produces the best SCRIP files, with nice (non-starlike) control volumes around each GLL node and exact areas (spherical area = GLL weight).  But the serial code is painfully slow and thus is only practical for grids with up to 100,000 elements.  This code only supports the element-local map.  Noel Keen profiled the Matlab script and found that, at least for the ne30 case, ~90% of the time is spent computing the area of a polygon, which is called millions of times; there may be some simple improvements to try short of a rewrite.
  2. Fortran code.  This utility is included with standalone HOMME. It runs quickly, in parallel, and produces exact GLL areas for cubed-sphere meshes.  It supports both element maps via namelist options.  But for RRM meshes it produces suboptimal control volumes (including star-shaped control volumes) with only approximate areas (spherical area ≠ GLL weight). The resulting RRM SCRIP file is thus not acceptable for generating mapping files for coupled simulations, but it should be fine for generating surface roughness fields.
  3. NCAR utility.  Will update when we find out more about this.  

For the majority of use cases, optimizing the area and shape of each polygon in the dual grid is probably not necessary, and a simple (and fast) approach can be taken using the Fortran code (plus some NCL scripts), which is packaged with HOMME. For conservative remapping, optimal areas that exactly match the GLL weights are probably required, and the slower Matlab code will most likely need to be used. (NOTE: we plan to rewrite the Matlab code in a compiled language to work with higher-resolution grids, but for the time being we will work with the less accurate Fortran/NCL codes for the few use cases that still need the SCRIP files.)

The _latlon file is a separate grid descriptor file that contains just a simple list of coordinates of all GLL nodes and is easy to produce. Either the Fortran+NCL or the Matlab utilities can produce this file.

The Fortran Code (older makegrid.job workflow; Noel Keen has been working on making this step smoother and more automated, and the updated tool is described below)

To run the fortran code on an E3SM-supported platform:

  1. Be sure your environment matches the software environment loaded by E3SM. This can be accomplished by sourcing either the .env_mach_specific.sh or .env_mach_specific.csh (depending on your shell) script from a working E3SM case directory, or by running cime/tools/configure --macros-format=Makefile and sourcing the .env_mach_specific.sh or .env_mach_specific.csh files that are written.
  2. Edit E3SM/components/homme/test/template/makegrid-cori.job
    1. grid resolution, supported machine file, element local or gnomonic map, number of nodes to use, queue, etc.
  3. Run the script (see the sketch after this list).  The first time it is run, it will configure (via cmake) and build the code.  The second run will run the fortran tool and then some NCL utilities to convert the output into the correct format (producing a latlon and a SCRIP file).
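For reference, the whole sequence might look roughly like this (the case path is only an example, and the script is assumed to be submitted as a batch job with sbatch; adjust for your machine and shell):

Code Block
# 1. match the E3SM software environment (source from a working case of your own)
source /path/to/working-case/.env_mach_specific.sh
# 2. edit resolution, machine, element map, node count, queue, etc.
cd E3SM/components/homme/test/template
vi makegrid-cori.job
# 3. run it twice: the first submission configures (cmake) and builds,
#    the second runs the fortran tool and the NCL conversion to SCRIP/latlon
sbatch makegrid-cori.job
sbatch makegrid-cri.job   # wait for the first job to finish before resubmitting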

Specific changes to makegrid.job needed for running at NERSC on Cori(knl):

  1. Change "account" in the sbatch directives at the top of the script. For example, set #SBATCH --account=acme.
  2. Source a working .env_mach_specific.csh at the top of the script. I.e., need to add a line like `source /global/cscratch1/sd/bhillma/e3sm/cases/update-makegrid-tool.FC5AV1C-L.ne4_ne4.cori-knl/.env_mach_specific.csh` after the batch directives.
  3. Change "ne" to target resolution. For example for ne4 change set ne = 0 ; to set ne = 4 ;. For .  


The Fortran Code (updated homme_tool workflow)

To run the fortran code on an E3SM-supported platform (updated 2020/5/25 with the updated tool from https://github.com/E3SM-Project/E3SM/pull/3593):

  1. Be sure your environment matches the software environment loaded by E3SM by executing the output of this command:   e3sm/cime/scripts/Tools/get_case_env
  2. Use cmake to configure and compile standalone HOMME.  On a supported platform with the CIME environment, this should work out of the box.  See e3sm/components/homme/README.cmake
  3. Compile the HOMME tool utility:
    1. cd /path/to/workingdir 
    2. make -j4 homme_tool
    3. executable:   /path/to/workingdir/src/tool/homme_tool
  4. Edit e3sm/components/homme/test/tool/namelist/template.nl and specify the grid resolution or RRM file
    1. For ne512, this would be set ne = 512. For RRM grids, leave ne = 0, but you will need to edit where the exodus grid file comes from (TODO: add documentation on this).
    2. For non-RRM grids using the older E3SM v1 dycore, add cubed_sphere_map=0 to template.nl
  5. See e3sm/components/homme/test/tool/test.job for examples of how to run the script and then use NCL utilities to process the tool output into SCRIP and latlon formats; a condensed sequence is sketched below.
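Putting those steps together, a condensed sketch (paths, the machine file, and the ne value are placeholders; README.cmake and test.job remain the authoritative references):

Code Block
# 1. load the software environment E3SM itself uses (one way to execute the command's output)
eval $(/path/to/e3sm/cime/scripts/Tools/get_case_env)
# 2./3. configure standalone HOMME and build the tool in a working directory
cd /path/to/workingdir
cmake -C /path/to/e3sm/components/homme/cmake/machineFiles/cori-knl.cmake -DPREQX_NP=4 /path/to/workingdir
make -j4 homme_tool
ls src/tool/homme_tool      # the resulting executable
# 4. edit e3sm/components/homme/test/tool/namelist/template.nl (set ne, or point to an RRM exodus file)
# 5. run homme_tool and the NCL conversion to SCRIP/latlon following e3sm/components/homme/test/tool/test.job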

Specific details for running at NERSC on Cori(knl):

  1. Create a batch script and change "account" in the sbatch directives at the top of the script. For example, set #SBATCH --account=e3sm
  2. cmake -C /path/to/e3sm/components/homme/cmake/machineFiles/cori-knl.cmake -DPREQX_NP=4 /path/to/workingdir
  3. Make sure a working NCL is in your PATH. On Cori, add the following to the script: module load ncl (see the sketch below).
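The three Cori-specific items above might appear in the batch script roughly as follows (all directives other than the account line are placeholders):

Code Block
#!/bin/bash
#SBATCH --account=e3sm      # NERSC charge account
# ... other sbatch directives (queue, nodes, walltime) for Cori KNL as needed ...
module load ncl             # a working NCL must be in PATH for the SCRIP/latlon conversion
# ... then the cmake / make homme_tool / run steps sketched above ...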

...

In order to pass data between different components at runtime, a set of mapping files between each component is generated offline. These mapping files will also be used in Step 4 below (generating domain files).

For "pg2" grids used in E3SM v2, the recommended mapping files are slightly different and still a work in progress.  See Transition to TempestRemap for Atmosphere grids

TempestRemap and ESMF are the backends that generate the mapping weights, but this is all nicely encapsulated using ncremap. TempestRemap is the preferred method for creating mapping files. ncremap, calling TempestRemap or ESMF, will decide which of these two tools to use based on the atmospheric input file: if the *.scrip file was used, ESMF will be called; if the *.g file was used, TempestRemap will be called. The ESMF tools are adequate for making atmosphere-only-type component sets for E3SM, but ESMF is less conservative than TempestRemap. If you are making grids for a coupled run, TempestRemap should be used.
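As an illustration of that file-type dispatch (the grid file names here are placeholders, not the exact production commands; -s, -g, and -m name the source grid, destination grid, and output map file in ncremap):

Code Block
# SCRIP-format atmosphere grid  -> ncremap calls the ESMF weight generator
ncremap -s ne30np4_scrip.nc -g ocean.scrip.nc -m map_atm_to_ocn_esmf.nc
# exodus (*.g) atmosphere grid -> ncremap calls TempestRemap (preferred for coupled runs)
ncremap -s ne30.g -g ocean.scrip.nc -m map_atm_to_ocn_tempest.nc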

...

...