The purpose of this page is to document the procedure for adding support for new atmosphere grids. The process should be the same for new uniform resolutions as well as for new regionally-refined meshes, although some settings will need to be changed for new regionally-refined mesh configurations. This page is a work in progress, and will be updated as this process is refined and (eventually) made more automated. This documentation is an update of a document written by Mark Taylor and Colin Zarzycki, available as a Google Doc here.
...
File | Tool | Type | Note |
---|---|---|---|
ncks, ncremap | NCO | | |
mapping files and mesh template files | TempestRemap | C++ | GenerateCSMesh: make cubed-sphere Exodus (.g) files (spectral element "np" grid) |
RRM Exodus (.g) mesh files | SQuadGen | C++ | |
topo files | homme_tool | Fortran | Included with E3SM. Should build and run on any system which can run the HOMME_P24.f19_g16_rx1.A test. Generates (obsolete) SCRIP files for the spectral element "np" dual grid. Used for topo smoothing for both FV "pg" and SE "np" grids. Can also do parallel interpolation from the SE np4 grid to any SCRIP grid. |
topo files | cube_to_target | Fortran | NCAR utility for generating unsmoothed topography from high-res USGS data, and for generating surface roughness fields from smoothed topography |
mapping files | ESMF_RegridWeightGen | | Makes FV->FV mapping files from SCRIP grid template files. Only tool which supports the monotone 2nd-order "bilin" map |
| CIME and ELM tools | | |
land surface dataset | mksurfdata.pl, mksurfdata_map | Perl and Fortran | |
ELM initial condition | interpinic | Fortran | 4 options: |
Archives of already-created grid and mapping files can be accessed at https://web.lcrc.anl.gov/public/e3sm/mapping/grids/ and https://web.lcrc.anl.gov/public/e3sm/mapping/maps/ . This page focuses on creating new versions of these files.
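Where a grid or map file already exists in these archives, it can simply be downloaded instead of regenerated. A minimal sketch; the file name below is the ne1024np4 example mentioned later on this page, so browse the archive index for the file you actually need:

```shell
# Download an existing grid template instead of regenerating it.
# The file name is one example from the archive; adjust as needed.
grid_root="https://web.lcrc.anl.gov/public/e3sm/mapping/grids"
grid_file="ne1024np4_scrip_c20191023.nc"
echo "${grid_root}/${grid_file}"
# wget "${grid_root}/${grid_file}"    # uncomment to actually download
```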
...
Generate SCRIP “dual grid” with homme_tool:
Be sure your environment matches the software environment loaded by E3SM by executing the output of this command: e3sm/cime/scripts/Tools/get_case_env
Use cmake to configure and compile standalone HOMME. On a supported platform with the CIME environment, this should work out-of-the-box. See e3sm/components/homme/README.cmake
compile the homme_tool utility:
```bash
cd /path/to/workingdir
make -j4 homme_tool
```
The resulting executable is /path/to/workingdir/src/tool/homme_tool
Edit e3sm/components/homme/test/tool/namelist/template.nl and specify the grid resolution or RRM file
For ne512, this would be ne = 512. For RRM grids, leave ne = 0, but you will need to edit where the Exodus grid file comes from. For non-RRM grids using the older E3SM v1 dycore, add cubed_sphere_map=0 to template.nl.
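As a sketch, the relevant template.nl entries might look like the following for a uniform ne512 grid; the mesh_file path is a placeholder, and the exact contents should follow the template shipped with HOMME:

```
&ctl_nl
  ne = 512                 ! uniform cubed-sphere resolution; set ne = 0 for RRM
  ! For RRM grids, point the tool at your Exodus mesh instead, e.g.:
  ! mesh_file = '/path/to/your_rrm_grid.g'
  ! For non-RRM grids with the older E3SM v1 dycore:
  ! cubed_sphere_map = 0
/
```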
See e3sm/components/homme/test/tool/test.job for examples of how to run the script and then use NCL utilities to process the tool output into SCRIP and latlon formats.
Specific details for running at NERSC on Cori (KNL):
Create a batch script and change "account" in the sbatch directives at the top of the script. For example, set
```
#SBATCH --account=e3sm
```
Configure with the Cori machine file:
```
cmake -C /path/to/e3sm/components/homme/cmake/machineFiles/cori-knl.cmake -DPREQX_NP=4 /path/to/workingdir
```
Make sure a working NCL is in your PATH. On Cori, add the following to the script:
```
module load ncl
```
3. Generate mapping files
...
```bash
#!/bin/bash

# Setup environment
source /global/cfs/cdirs/e3sm/software/anaconda_envs/load_latest_e3sm_unified_pm-cpu.sh
e3sm_root=${HOME}/codes/e3sm/branches/master

# Build gen_domain tool
gen_domain=${e3sm_root}/cime/tools/mapping/gen_domain_files/gen_domain
cd `dirname ${gen_domain}`/src
eval $(${e3sm_root}/cime/CIME/Tools/get_case_env)
${e3sm_root}/cime/CIME/scripts/configure --macros-format Makefile --mpilib mpi-serial
gmake

# Set paths to mapping files
mapping_root="/global/homes/b/bhillma/cscratch/e3sm/grids/ne4"
ocn_grid_name=oQU240
atm_grid_name=ne4np4
lnd_grid_name=${atm_grid_name}

# Run domain generation tool (from output directory)
domain_root=${mapping_root}
mkdir -p ${domain_root} && cd ${domain_root}
for target_grid_name in ${lnd_grid_name} ${atm_grid_name}; do
    # Find conservative mapping files, use the latest file generated
    map_ocn_to_target=`ls ${mapping_root}/map_${ocn_grid_name}_to_${target_grid_name}_monotr.*.nc | tail -n1`
    # Run domain tool code
    ${gen_domain} -m ${map_ocn_to_target} -o ${ocn_grid_name} -l ${target_grid_name}
done
```
...
Run 10 days with very large se_nsplit and hypervis_subcycle options and create a new IC file (see INITHIST in CAM). For aqua planet, this step is often not necessary, but for simulations with topography this step is critical. One may also need to reduce dtime and increase the viscosity coefficients. In an extreme case, something like this might be needed:
a. run 5-10 days with dtime 3x smaller than the default (and viscosity coefficients 2x larger)
b. restarting from step a, run 5-10 days with dtime 3x smaller and default viscosity coefficients
c. restarting from step b, run 5-10 days with dtime 2x smaller and default viscosity coefficients
d. restarting from step c, run 5-10 days with default dtime and se_nsplit 2x larger
e. restarting from step d, run with all default parameters
Use this new IC file for all future runs below
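The dtime reductions in the step-down sequence above are simple integer divisions of the default physics timestep. A small sketch, assuming a default dtime of 1800 s (the actual default depends on the grid):

```shell
# Illustrative arithmetic for the step-down sequence; 1800 s is an
# assumed default dtime -- substitute the value for your grid.
default_dtime=1800
echo "first runs:    dtime = $((default_dtime / 3)) s"
echo "next run:      dtime = $((default_dtime / 2)) s"
echo "final runs:    dtime = ${default_dtime} s"
```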
First we determine a stable value of se_nsplit. To do this, first ensure the viscosity timestep is not causing problems (especially important if nu_div > nu) by starting with a large value of hypervis_subcycle, say hypervis_subcycle=20.
Find the smallest value of se_nsplit for which the code is stable using 1 month runs. Start with se_nsplit that is used by the corresponding high resolution uniform grid (assuming they all have the same physics timestep, dtime)
Once a stable value of se_nsplit has been found, decrease hypervis_subcycle until the smallest stable value is found. Note that these are not independent: you have to find the stable value of se_nsplit before finding the stable value of hypervis_subcycle.
Final step: The procedure outlined above can find timesteps that are borderline unstable but don't blow up due to various dissipation mechanisms in CAM. Hence it is a good idea to run 3 months and look at the monthly mean OMEGA500 from the 3rd month. This field will be noisy, but there should not be any obvious grid artifacts. Weak instabilities can be masked by the large transients in flow snapshots, so it is best to look at time averages.
Note that for simulations with topography, we often increase nu_div. This can trigger a restrictive CFL condition which requires reducing hypervis_subcycle.
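For reference, the parameters discussed above all live in HOMME's &ctl_nl namelist. The values below are placeholders illustrating the search order (tune se_nsplit first, then hypervis_subcycle), not recommendations for any particular grid:

```
&ctl_nl
  se_nsplit         = 2      ! smallest stable value, found first
  hypervis_subcycle = 20     ! start large, then decrease to the smallest stable value
  ! nu              = ...    ! viscosity coefficients; nu_div is often
  ! nu_div          = ...    ! increased for runs with topography
/
```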
...
land grid descriptor file in SCRIP format
ESMF_RegridWeightGen
geographic distribution for each land surface type along with grid descriptor files for each of those surface types
mkmapdata.sh (found in components/elm/tools/mkmapdata/)
mksurfdata.pl (found in components/clm/tools/clm4_5/mksurfdata_map/)
...
the steps below are for the maint1-0 code base. Post-v1 release changes (to add phosphorus) broke existing land initial condition files (finidat) and may require changes to this methodology.
the focus here is on creating an fsurdat file in cases where land use land cover change (LULCC) does NOT change. Additional steps will be needed to create a transient LULCC file.
questions for the land team are in red
Steps:
Create mapping files for each land surface type if needed. An (older and deprecated) example of doing this can be found here. Updated instructions follow:
Obtain or generate a target grid file in SCRIP format. For this example, we will use a ne1024pg2 grid file, which we will need to create (note that most np4 grid files can be found within the inputdata repository; for example, the ne1024np4 grid file is at https://web.lcrc.anl.gov/public/e3sm/mapping/grids/ne1024np4_scrip_c20191023.nc). To generate the pg2 SCRIP file:
```bash
${tempest_root}/bin/GenerateCSMesh --alt --res 1024 --file ne1024.g
${tempest_root}/bin/GenerateVolumetricMesh --in ne1024.g --out ne1024pg2.g --np 2 --uniform
${tempest_root}/bin/ConvertExodusToSCRIP --in ne1024pg2.g --out ne1024pg2_scrip.nc
```
Get the list of input grid files for each land surface input data file. This is done by running the components/elm/tools/mkmapdata/mkmapdata.sh script in debug mode, which outputs a list of needed files (along with the commands that will be used to generate each map file); make sure GRIDFILE is set to the SCRIP file from the above step:
```bash
cd ${e3sm_root}/components/elm/tools/mkmapdata
./mkmapdata.sh --gridfile ${GRIDFILE} --inputdata-path ${INPUTDATA_ROOT} --res ne1024pg2 \
    --gridtype global --output-filetype 64bit_offset --debug -v --list
```
Download the needed input grid files. The above command will output a list of needed files to clm.input_data_list. We need to download all of these before calling the script without the debug flag to actually perform the mapping. This is possible using check_input_data in CIME, but it needs to be done from a dummy case directory. So, one can create a dummy case, cd to that case, and then call ./check_input_data --data-list-dir <path where mkmapdata was run from> --download. However, this failed to connect to the CESM SVN server for me. So instead, I used the following one-off script:
```bash
#!/bin/bash
e3sm_inputdata_repository="https://web.lcrc.anl.gov/public/e3sm"
cesm_inputdata_repository="https://svn-ccsm-inputdata.cgd.ucar.edu/trunk"
inputdata_list=clm.input_data_list
cat $inputdata_list | while read line; do
    localpath=`echo ${line} | sed 's:.* = \(.*\):\1:'`
    url1=${e3sm_inputdata_repository}/`echo ${line} | sed 's:.*\(inputdata/lnd/.*\):\1:'`
    url2=${cesm_inputdata_repository}/`echo ${line} | sed 's:.*\(inputdata/lnd/.*\):\1:'`
    if [ ! -f ${localpath} ]; then
        echo "${url1} -> ${localpath}"
        mkdir -p `dirname ${localpath}`
        cd `dirname ${localpath}`
        # Try to download using first URL; if that fails then use the second
        wget ${url1} || wget ${url2}
    else
        echo "${localpath} exists, skipping."
    fi
done
```
Create mapping files. It should just be possible to run the above mkmapdata.sh command without the --debug --list flags. We need to keep the --output-filetype 64bit_offset flag for our large files (no reason not to do this by default anyway):
```bash
./mkmapdata.sh --gridfile ${GRIDFILE} --inputdata-path ${INPUTDATA_ROOT} --res ne1024pg2 \
    --gridtype global --output-filetype 64bit_offset -v
```
Compile the surface dataset source code (NOTE: ${e3sm_root}/components/clm/tools/clm4_5/mksurfdata_map/src/Makefile.common needs to be edited to build on most machines; this is fixed in https://github.com/E3SM-Project/E3SM/pull/2757):
```bash
# Setup environment (should work on any E3SM-supported machine)
eval $(${e3sm_root}/cime/CIME/Tools/get_case_env)
${e3sm_root}/cime/CIME/scripts/configure --macros-format Makefile --mpilib mpi-serial
source .env_mach_specific.sh

# Build mksurfdata_map
cd ${e3sm_root}/components/elm/tools/mksurfdata_map/src
INC_NETCDF="`nf-config --includedir`" \
LIB_NETCDF="`nc-config --libdir`" USER_FC="`nc-config --fc`" \
USER_LDFLAGS="`nf-config --flibs`" make
```
Run the mksurfdata.pl script in "debug" mode to generate the namelist (using year 2010 on the ne120np4 grid as an example).
```bash
# For supported resolutions
# (use year 2010 on ne120np4 grids as an example)
cd $e3sm_dir/components/elm/tools/mksurfdata_map
./mksurfdata.pl -res ne120np4 -y 2010 -d -dinlc /global/cfs/cdirs/e3sm/inputdata \
    -usr_mapdir /global/cfs/cdirs/e3sm/inputdata/lnd/clm2/mappingdata/maps/ne120np4

# For unsupported, user-specified resolutions
# (use year 2010 on the ne50np4 grid as an example)
# (Assuming the mapping files created in step 1 have a time stamp of '190409' in the
#  filenames and are located in '/whatever/directory/you/put/mapping/files')
./mksurfdata.pl -res usrspec -usr_gname ne50np4 -usr_gdate 190409 -y 2010 -d \
    -dinlc /global/cfs/cdirs/e3sm/inputdata -usr_mapdir /whatever/directory/you/put/mapping/files
```
(However, ./mksurfdata.pl -h shows that -y defaults to 2010, while running without the -y option reports sim_year 2000 in standard output. I suspect the mksurfdata.pl help information is wrong. To be confirmed.)
Modify the namelist file.
(Should the correct namelist settings be automatically picked up if the default land build namelist settings are modified accordingly?)
Time-evolving land use land cover change (LULCC) data should not be used for fixed-time compsets; the LULCC information for that particular year should be used instead (right?)
For the F2010 ne120 compset, manually change to mksrf_fvegtyp = '/global/cfs/cdirs/e3sm/inputdata/lnd/clm2/rawdata/AA_mksrf_landuse_rc_1850-2015_06062017_LUH2/AA_mksrf_landuse_rc_2010_06062017.nc'.
Create the land surface data via an interactive or batch job:
```bash
rm -f surfdata_ne120np4_simyr2010.bash
# Quote the heredoc delimiter so $NETCDF_DIR etc. are expanded when the
# batch job runs (after the modules are loaded), not when the file is written
cat <<'EOF' >> surfdata_ne120np4_simyr2010.bash
#!/bin/bash
#SBATCH --job-name=mksurfdata2010
#SBATCH --account=acme
#SBATCH --nodes=1
#SBATCH --output=mksurfdata.o%j
#SBATCH --exclusive
#SBATCH --time=00:30:00
#SBATCH --qos=debug

# Load modules
module load nco
module load ncl
module load cray-netcdf
module load cray-hdf5

# mksurfdata_map is dynamically linked
export LIB_NETCDF=$NETCDF_DIR/lib
export INC_NETCDF=$NETCDF_DIR/include
export USER_FC=ifort
export USER_CC=icc
export USER_LDFLAGS="-L$NETCDF_DIR/lib -lnetcdf -lnetcdff -lnetcdf_intel"
export USER_LDFLAGS=$USER_LDFLAGS" -L$HDF5_DIR/lib -lhdf5 -lhdf5_fortran -lhdf5_cpp -lhdf5_fortran_intel -lhdf5_hl_intel -lhdf5hl_fortran_intel"

cd /global/homes/t/tang30/ACME_code/MkLandSurf/components/clm/tools/clm4_5/mksurfdata_map
CDATE=c`date +%y%m%d` # current date
./mksurfdata_map < namelist
EOF
sbatch surfdata_ne120np4_simyr2010.bash
```
The land surface data in NetCDF format will be created in the current directory. (How to verify the file is correct?)
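The batch script above stamps its output with a creation date (the CDATE variable), so the produced file can be located with a pattern like the following. The naming convention here is an assumption based on the script above and should be confirmed against the tool's actual output:

```shell
# Construct the expected output file name; c$(date +%y%m%d) matches the
# CDATE stamp used in the batch script above (naming pattern assumed).
CDATE=c$(date +%y%m%d)
fsurdat="surfdata_ne120np4_simyr2010_${CDATE}.nc"
echo "expecting: ${fsurdat}"
# A first sanity check is to inspect the header:
# ncdump -h ${fsurdat} | less
```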
8. Generate a new land initial condition (finidat)
...
Following are the steps for building and running an executable for this tool:
Change directory to tool root:
cd components/cam/tools/mkatmsrffile
Create a .env_mach_specific.sh by running
../../../../cime/tools/configure --macros-format=Makefile
Get machine-specific environment settings via
source .env_mach_specific.sh
Make sure the NETCDF_ROOT and FC environment variables are set correctly for your system, and build the executable. On Cori:
env NETCDF_ROOT=$NETCDF_DIR FC=ifort make
Edit "nml_atmsrf" to update the input file paths
Run the newly built executable
```bash
./mkatmsrffile
```
This will produce a drydep file. The following input files were used to generate a new dry deposition file:
...
Tools we should create tests for:
TempestRemap for generating uniform grids (in Paul's external git repo - may have its own tests?)
SQuadGen for generating RRM grids (in Paul's external repo - may have its own tests?)
Generate topography via Topography Generation; needs utilities: components/cam/tools/topo_tool/cube_to_target and components/homme/test/tool
run ncremap (an NCO command) to generate mapping files
cime/tools/mapping/gen_domain_files
mksurfdata.pl to generate the namelist needed to make fsurdat file
use mksurfdata_map for fsurdat
use the interpic_new tool to regrid atmos state to new grid for initial condition
Stuff on branches that we need to get on master:
branch brhillman/add-grid-scripts for the matlab script used to create the dual grid.
PR #2633 to generate domain files without needing the dual grid?
PR #2706 to add command line interface to topography tool to not have to edit source code by hand and recompile to compute subgrid surface roughness
Stuff requiring ESMF and/or the dual grid:
generating mapping files needs ESMF_RegridWeightGen?
generate topography file
Tools that could use some clean-up:
smoothtopo.job: script used to run HOMME to apply dycore-specific smoothing to interpolated topography. It would be nice for this to be able to run via command line arguments rather than having to edit the script (which would make it easier to include in an automated workflow), and we should remove the dependence on NCL since it is not guaranteed to be available. Replaced with homme_tool, 2020/5 (see Topography Generation).
makegrid.job: script used to run HOMME+NCL to produce the non-optimized dualgrid and latlon descriptions of the SE grid. Again, it would be nice for this to be able to run via command line arguments rather than having to edit the script, and we should remove the dependence on NCL. TR and PG2 grids make this obsolete - we no longer need the "dualgrid".
Land surface data scripts (TODO: add specifics about what needs to change here)