Creating mapping and domain files

For a detailed review of the steps needed to create new grids in ACME, see CAM-SE Variable Resolution Grid Generation, the CIME documentation, the CESM 1.2 manual, and (most current) Running E3SM on New Atmosphere Grids .

This is a summary of the steps required to create mapping/weighting/domain files for the model if one starts with an atmosphere (HOMME) mesh. We follow the instructions in the ACME and HOMME files, with some modifications and clarifications.

These notes contain instructions for a particular situation: modifying input files for compset FC5AV1C-04 with resolution conusx4v1_conusx4v1 . Since a new algorithm for weights is being introduced in HOMME (for locally refined meshes, weights are now consistent with element areas), mapping files and domain files for refined meshes need to be regenerated. It turns out that only one HOMME locally refined mesh is used in config_grids.xml, mesh conusx4v1.g . The process of creating the corresponding new mapping files and domain files is described below.

There are a few types of grids/maps/files that are discussed below:

  • HOMME mesh: A HOMME mesh is either built internally in HOMME, based on the number of elements (NE) per cube edge (so there is no mesh file for HOMME to read), or read from an Exodus *.g mesh file that lists conforming quadrilaterals. See Atmosphere Grids and the very first plot there for an example of a uniform HOMME mesh (quadrilaterals are blue; quadrature points, or degrees of freedom, are green). 
  • SCRIP file: A mesh file required to couple HOMME to other parts of the model. It is the dual grid of the HOMME grid: it contains spherical polygons centered (in a general sense) at the HOMME grid (quadrature) points. Its name usually contains the suffix _scrip. See Atmosphere Grids for more information, including different options for dual grids, such as dual grids constructed with chevrons and pentagons.
  • LATLON file: A file that contains the latitude/longitude data of the HOMME quadrature points/DOFs in a certain format. Its name usually contains the suffix _latlon. See Atmosphere Grids for more information. This file is not needed to create domain files.
  • _aave file: A conservative mapping file from one SCRIP file to another, with the suffix _aave in its name. Needed to construct domain files.
  • _bilin file: A bilinear (not conservative) mapping file from one SCRIP file to another, with the suffix _bilin in its name.
  • Domain files: Files read by the coupler and component models that describe a grid for coupling purposes: cell centers and corners, cell areas, and the land/ocean mask and fraction. They are generated from the _aave mapping files by the gen_domain utility (Action 3 below).

This is a chain of actions to create mapping/domain files:

HOMME mesh  --(Action 1)-->  SCRIP and LATLON files  --(Action 2)-->  _aave and _bilin files  --(Action 3)-->  domain files
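As a rough illustration of the chain above, here is a sketch of the file-naming conventions it implies; the grid and ocean names are illustrative examples, and the actual names come from whichever tools are used:

```shell
# Hedged sketch of the naming conventions implied by the chain above.
# Grid/ocean names are illustrative; actual names come from the tools used.
grid=conusx4v1                         # HOMME Exodus mesh: ${grid}.g
ocn=tx0.1v2                            # example second (ocean) grid
scrip="${grid}_scrip.nc"               # Action 1: dual-grid SCRIP file
latlon="${grid}_latlon.nc"             # Action 1: quadrature-point lat/lon file
aave="map_${grid}_to_${ocn}_aave.nc"   # Action 2: conservative map
bilin="map_${grid}_to_${ocn}_bilin.nc" # Action 2: bilinear map
echo "$scrip $latlon"                  # Action 3 consumes the _aave map
echo "$aave $bilin"
```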

  1. Action (1) can be performed in one of the following ways:

    A) A HOMME template run and NCL scripts, if the mesh is uniform (NE=4, NE=10, etc.). An example of a template run is in the ACME HOMME test suite. The run does not need a mesh file; it only uses the NE parameter. Once the template run generates an *.nc file, use it in the scripts HOMME2META.ncl and HOMME2SCRIP.ncl to get the files * and * . The NCL scripts are in ACME/components/homme/test/template/ .

    The only difference between template runs and usual HOMME runs is which variables are recorded in the output *.nc files. Output variables are controlled by the input *.nl namelist file that HOMME reads. In the template *.nl file in the HOMME test suite, these variables are listed for output: 


    Note that 'phys_area', which is used for FVM transport in HOMME, will soon be removed from the code.

    Template runs need only 1 time step.

    B) A MATLAB script, if the HOMME mesh is a refined mesh (NE=0). How to create a refined HOMME mesh is not a topic of this page; refer to the SquadGen utility as one of the tools. Once a *.g file is created, the MATLAB script creates a dual grid in the * file and lat/lon information in * . The MATLAB script is now checked in to the ACME git repository ACME-Climate/PreAndPostProcessingScripts/spectral_elements_grid_utilities . The script that creates both files is dualgridgenerate.m . A script that performs some verification checks on the resulting mesh is geometric_statistics_twisting_center_location.m .

    Note that route A) will not work with locally refined meshes, since HOMME template runs do not handle such grids. Technically, route B) works for uniform meshes if they are recorded in a *.g file. However, due to recent changes in HOMME for consistency of element areas, one should be careful to use the same consistency method in HOMME during a simulation as in MATLAB when constructing the dual mesh. In short, B) is not recommended for uniform meshes. 

    C) Actions 1 and 2 can both be done by TempestRemap. Details will be added later.

  2. Action (2) maps the HOMME grid from/to another grid (SCRIP file), records the mappings in _aave*.nc and _bilin*.nc files, and is done by (A) an ESMF utility or (B) ncremap; see the comment by Charlie Zender below. The most recent scripts for this action are in , and the script is . If, for example, the other grid is an ocean grid, one should add ' --src_regional -i ' to the command line of the ESMF utility (as said in ), since ocean grids do not cover the whole globe. To create atmosphere→ocean mapping files, one can run a batch script with a command line similar to

    ./remap-ncl/ map_conusx4v1np4b_to_tx0.1v2

    It takes approximately 20 minutes on skybridge for these particular files with 32 cores. To change the number of cores, modify makemap.job . To create ocean→atmosphere mapping files, add '--dst_regional -i' in makemap.job .
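    For concreteness, here is a hedged sketch of what the two underlying weight-generation calls look like with the ESMF utility (ESMF_RegridWeightGen). The SCRIP file names are hypothetical, the commands are only echoed rather than executed, and the --src_regional/--dst_regional and -i flags should be added as described above:

```shell
# Hedged sketch: one conservative and one bilinear ESMF_RegridWeightGen call.
# File names are hypothetical; commands are echoed rather than executed.
src=conusx4v1np4b_scrip.nc
dst=tx0.1v2_scrip.nc
for method in conserve bilinear; do
  if [ "$method" = conserve ]; then tag=aave; else tag=bilin; fi
  echo ESMF_RegridWeightGen --method "$method" \
    --source "$src" --destination "$dst" \
    --weight "map_conusx4v1np4b_to_tx0.1v2_${tag}.nc"
done
```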

    By Charlie Zender:

    Action (2) can be performed by ncremap (documentation), which, among other capabilities, uses ESMF or Tempest to generate mapfiles between * and *.g grids. One benefit of ncremap is the improved provenance retained in the metadata of the generated maps. In theory, the following two ncremap calls could replace the ESMF calls:

    ncremap -w esmf -a conserve -s <src_scrip.nc> -g <dst_grid.nc> -m <map_aave.nc>
    ncremap -w esmf -a bilinear -s <src_scrip.nc> -g <dst_grid.nc> -m <map_bilin.nc>

    (and -w tempest without any -a option would use Tempest for the maps). However, no one to my knowledge has yet used ncremap for the intermediate step (i.e., step 2) of making domain files (in contrast to end-product mapfiles, which are well-tested). I would be interested in any feedback from those who make domain files on the efficacy of using ncremap.

  3. Action (3) uses the gen_domain utility from ACME/CESM. There are instructions to build it in ACME/cime/tools/mapping/gen_domain_files/README (ignore INSTALL; it contains a typo). After gen_domain is built, the command is, for example,
    mpiexec -n 1 ~/ACME/cime/tools/mapping/gen_domain_files/gen_domain -m -o tx0.1v2 -l ne0np4_conus_x4v1_lowcon

    For domain files, one should find the proper aliases for the atmosphere and the other grid in ACME/cime/scripts/Tools/config_grid.xml (in our example the aliases are tx0.1v2 and ne0np4_conus_x4v1_lowcon). The resulting files are the domain files for this grid pair.
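    A hedged sketch of the full gen_domain invocation with the map-file argument spelled out; the _aave map name is a hypothetical example, and my assumption (per the gen_domain README) is that -m takes the conservative ocean→atmosphere map, while -o and -l take the grid aliases:

```shell
# Hedged sketch of Action (3). The map file name is a hypothetical example;
# -o and -l take the grid aliases from config_grid.xml. Echoed, not executed.
map=map_tx0.1v2_to_conusx4v1np4b_aave.nc
ocn_alias=tx0.1v2
lnd_alias=ne0np4_conus_x4v1_lowcon
cmd="mpiexec -n 1 ./gen_domain -m $map -o $ocn_alias -l $lnd_alias"
echo "$cmd"
```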

Replacing only the domain files did not work for this particular compset. Due to errors with the land init file (the new domain files have different land mask points compared to the old domain files), a cold-start run and interpolation of the land init file are needed. Details are below:

  1. Replace the domain files and configure a test:
    ~/ACME/cime/scripts/create_newcase -case $CASE -res conusx4v1_conusx4v1 -mach skybridge -compiler intel -compset FC5AV1C-04 -project NNN
    Change COLD START to ON in env_run.xml (one can verify in the run folder's lnd_in that finidat='', i.e., it is empty). Run for, say, 5 days.
  2. This will produce a restart file $ . Copy it to some directory DIR, because it will be modified by the interpolating routine.
  3. Change to DIR.
  4. Build interpinic from ACME/components/clm/tools/clm4_5 . Older versions of the interpolation tool do not have the dimension 'levgrnd', so a newer build is required. Before building, set the variables as below. I am not sure all of these options are crucial.

    export INC_NETCDF=${NETCDF_PATH}/include
    export LIB_NETCDF=$NETCDF_PATH/lib
    export USER_FC=ifort
    export SMP=TRUE
    export OMP_NUM_THREADS=8

    There is also a problem with dependencies in the source directory, because one file is a C file. Run make once, then disable dependency generation in the makefile and remove the line with shc_isnan.o from the dependencies file. Run make again; the link line for the executable will still carry shc_isnan.o . Copy-paste that link command, remove shc_isnan.o from it, and execute it. 

    If running the executable from, say, a new window, set OMP_NUM_THREADS=8 again.

  5. Run the interpinic in DIR by 

    ACME/components/clm/tools/clm4_5/interpinic/interpinic -i /projects/ccsm/inputdata/lnd/clm2/initdata_map/ -o $

    where the '-i' file is the one used by the compset & resolution when the case is configured as-is (not as a cold-start run); it can be found in the lnd_in namelist, variable finidat. As described in Using interpinic to interpolate initial conditions to different resolutions:

    -i = Input filename to interpolate from
    -o = Output interpolated file, and starting template file
  6. Now try to run with the new land initial conditions: set up the case as before by running create_newcase, run cesm_setup, open user_nl_clm, and insert 
    finidat = 'DIR/$' .
  7. Run $ and verify that finidat is correctly set in lnd_in .
  8. This should be it.
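The cold-start-plus-interpinic workflow above can be sketched as the following command sequence. The case name and directory are placeholders, CLM_FORCE_COLDSTART is my assumed name for the "COLD START" switch in env_run.xml, and the elided file names from the steps above remain angle-bracket placeholders:

```shell
# Hedged sketch of steps 1-7 above. Everything here is a placeholder command
# sequence, printed rather than executed; CLM_FORCE_COLDSTART is an assumed
# name for the "COLD START" setting in env_run.xml.
CASE=conus_coldstart_test
DIR=$HOME/interp_work
plan() {
cat <<EOF
create_newcase -case $CASE -res conusx4v1_conusx4v1 -compset FC5AV1C-04
./xmlchange CLM_FORCE_COLDSTART=on   # then run ~5 days to get a restart file
cp <restart_file> $DIR/              # interpinic modifies its -o file in place
interpinic -i <old_finidat.nc> -o $DIR/<restart_file>
EOF
echo "echo \"finidat = '$DIR/<restart_file>'\" >> user_nl_clm"
}
plan
```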

More details:

  • Running MATLAB can take a few hours (10+). One can turn off plots, insert 'exit;' at the end of dualgridgenerate.m, and use the command
    nohup matlab -nodisplay -nosplash -r dualgridgenerate > out.txt &
  • To build gen_domain on skybridge, the file ACME/cime/machines/env_mach_specific.skybridge was modified to match the modules and (p)netcdf paths with ACME/components/homme/cmake/machineFiles/skybridge.cmake. It also required module mkl/14.0 to run.
  • Running gen_domain serially did not work on skybridge (segfault). It helped to allocate a node and run on 1 core with mpiexec.
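The last point can be sketched as follows; the SLURM syntax is my assumption for skybridge, and the gen_domain arguments stay as placeholders:

```shell
# Hedged sketch: allocate one node, then run gen_domain under mpiexec with a
# single rank instead of serially. Commands are echoed, not executed.
alloc="salloc -N 1"
run="mpiexec -n 1 ./gen_domain -m <aave_map.nc> -o <ocn_alias> -l <lnd_alias>"
echo "$alloc"
echo "$run"
```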