The purpose of this page is to document the procedure for adding support for new atmosphere grids. The process should be the same for new uniform resolutions as well as for new regionally-refined meshes, although some settings will need to be changed for new regionally-refined mesh configurations. This page is a work in progress, and will be updated as this process is refined and (eventually) made more automated. This documentation is an update of a document written by Mark Taylor and Colin Zarzycki, available as a Google Doc here.
...
File | Tool | Type | Note |
---|---|---|---|
ncks, ncremap | NCO | | |
mapping files and mesh template files | TempestRemap | C++ | GenerateCSMesh: make cubed-sphere Exodus (.g) files (spectral element "np" grid). ConvertMeshToSCRIP: convert an FV "pg" Exodus file into a SCRIP file. |
RRM Exodus (.g) mesh files | SquadGen | C++ | |
topo files | homme_tool | Fortran | Included with E3SM. Should build and run on any system which can run the HOMME_P24.f19_g16_rx1.A test. Generates (obsolete) SCRIP files for the spectral element "np" dual grid. Used for topo smoothing for both FV "pg" and SE "np" grids. Can also do parallel interpolation from the SE np4 grid to any SCRIP grid. |
topo files | cube_to_target | Fortran | NCAR utility for generating unsmoothed topography from high-res USGS data, and generating surface roughness fields from smoothed topography. |
mapping files | ESMF_RegridWeightGen | | Make FV->FV mapping files from SCRIP grid template files. Only tool which supports the monotone 2nd order "bilin" map. |
| CIME and ELM tools | | |
land surface dataset | mksurfdata.pl, mksurfdata_map | Perl and Fortran | |
ELM initial condition | interpinic | Fortran | 4 options: |
Archives of already-created atmosphere grid and mapping files can be accessed at https://web.lcrc.anl.gov/public/e3sm/mapping/grids/ and https://web.lcrc.anl.gov/public/e3sm/mapping/maps/ . This page focuses on creating new versions of these files.
...
```
output_root=${HOME}/cscratch/e3sm/grids/ne4
mkdir -p ${output_root}
```
Types of Atmosphere grid metadata files
See SE Atmosphere Grid Overview (EAM & CAM) for description of the spectral elements, GLL nodes, subcell grid and dual grid.
Exodus file: "ne4.g". This is a netcdf file following Exodus conventions. It gives the corners of all elements on the sphere and their connectivity. It is independent of the polynomial order used inside the element ("np").
This file is used by TempestRemap (TR) to generate mapping files. The polynomial order is a command line option and the GLL nodes are internally generated by TR.
SCRIP file: "ne4pg2.scrip.nc". This file contains a description of the atmosphere physics grid in the format used by the original incremental remap tool SCRIP. It is used to generate mapping files between components and for post-processing of most model output.
Less common “GLL” metadata files needed for specialized purposes:
SCRIP file: "ne4np4_scrip.nc". This file contains a description (SCRIP format) of the GLL dual grid. It includes the locations of the GLL nodes and artificial bounding polygons around those nodes. Ideally the spherical area of each polygon will match the GLL weight ("exact GLL areas"), but not all tools can achieve exact areas. Inexact areas do not impact the accuracy of the resulting mapping algorithm; they just mean that mass will not be exactly conserved by the mapping algorithm.
Needed in E3SMv2 configurations by the cube_to_target topography dataset generation tool.
Historically used by E3SMv1 to create ocean/atm mapping files (exact GLL areas are important here, to conserve fluxes)
Historically used by E3SMv1 for regional and sub-gridscale regridding post-processing tools ( https://acme-climate.atlassian.net/wiki/spaces/SIM/pages/754286611/Regridding+E3SM+Data+with+ncremap, http://nco.sf.net/nco.html#sgs)
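A quick way to test a candidate SCRIP file for the "exact areas" property is to check that the cell areas sum to the area of the unit sphere, 4π steradians. A minimal sketch in Python (the NetCDF read is omitted; grid_area is the conventional SCRIP variable name, in steradians):

```python
import math

def areas_cover_sphere(grid_area, tol=1e-10):
    """Check that SCRIP cell areas (steradians) sum to 4*pi."""
    return abs(sum(grid_area) - 4.0 * math.pi) < tol

# Toy example: 96 equal-area cells tile the sphere exactly.
cells = [4.0 * math.pi / 96] * 96
print(areas_cover_sphere(cells))  # True
```

For a real file, pass in the grid_area variable; a large residual suggests the bounding polygons do not carry exact GLL weights.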
latlon file: "ne4np4_latlon.nc". This file contains a list of all the GLL nodes in the mesh (in latitude/longitude coordinates). The list of GLL nodes must be in the internal HOMME global id ordering, matching the ordering used in CAM and EAM native grid output. It also contains the connectivity of the GLL subcell grid.
This file is used by CAM's interpic_new utility, and graphics programs Paraview and Visit when plotting native grid GLL output.
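When generating these metadata files, it helps to know the expected size of each grid up front. For a uniform cubed-sphere grid the counts follow directly from the resolution parameter ne; the sketch below uses the standard cubed-sphere relations (values can be checked against the dimension sizes in the generated files):

```python
# Expected grid sizes for a uniform cubed-sphere grid of resolution ne.
# ne = number of spectral elements along each cube edge.

def grid_sizes(ne, np=4, pg=2):
    nelem = 6 * ne * ne                  # spectral elements on the sphere
    ngll = nelem * (np - 1) ** 2 + 2     # unique GLL nodes ("np" grid)
    npg = nelem * pg * pg                # FV physics columns ("pg2" grid)
    return nelem, ngll, npg

# ne30np4/ne30pg2 (standard E3SMv2 low resolution):
print(grid_sizes(30))  # (5400, 48602, 21600)
```

The same relations give 866 GLL nodes and 384 pg2 columns for the ne4 grid used in the examples on this page.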
...
Step-by-step guide
1. Generate a new atmosphere "grid" file
Requirements: TempestRemap
...
The Exodus file contains only information about the position of the spectral elements on the sphere. SE-aware utilities such as TempestRemap can use the polynomial order and the reference element map to fill in necessary data such as the locations of the nodal GLL points. For non-SE-aware utilities, we need additional metadata, described in the next section.
2A. Generate control volume mesh files for E3SM v2 "pg2" grids
Requirements:
exodus mesh file
TempestRemap
...
```
${tempest_root}/bin/GenerateVolumetricMesh --in ne4.g --out ne4pg2.g --np 2 --uniform
${tempest_root}/bin/ConvertMeshToSCRIP --in ne4pg2.g --out ne4pg2.scrip.nc
```
2B. Generate "dual grid" mesh files (SCRIP and lat/lon format) for E3SM v1 "np4" GLL grids
Requirements:
exodus mesh file
homme_tool
...
Be sure your environment matches the software environment loaded by E3SM by executing the output of this command: e3sm/cime/scripts/Tools/get_case_env
Use cmake to configure and compile standalone HOMME. On a supported platform with the CIME environment, this should work out-of-the-box. See e3sm/components/homme/README.cmake. For example, at NERSC on Cori(knl):

```
cmake -C /path/to/e3sm/components/homme/cmake/machineFiles/cori-knl.cmake -DPREQX_NP=4 /path/to/workingdir
```

Compile the homme_tool utility:

```
cd /path/to/workingdir
make -j4 homme_tool
# executable: /path/to/workingdir/src/tool/homme_tool
```

Edit e3sm/components/homme/test/tool/namelist/template.nl and specify the grid resolution or RRM file:
For ne512, this would be ne = 512
For RRM grids, leave ne = 0, but you will need to edit where the exodus grid file comes from
For non-RRM grids using the older E3SM v1 dycore, add cubed_sphere_map=0 to template.nl
See e3sm/components/homme/test/tool/test.job for examples of how to run the tool and then use the NCL utilities to process the tool output into SCRIP and latlon formats.
Specific details for running at NERSC on Cori(knl):
Create a batch script, and change "account" in the sbatch directives at the top of the script. For example, set #SBATCH --account=e3sm
Make sure a working NCL is in your PATH. On Cori, add the following to the script: module load ncl
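For reference, the resolution-related lines of template.nl for a uniform ne512 grid might look like the following. This is a sketch based only on the settings named above; the group name and any other required settings should be taken from the actual template file in e3sm/components/homme/test/tool/namelist/template.nl:

```
&ctl_nl
  ne = 512              ! uniform cubed-sphere resolution; leave ne = 0 for RRM grids
  cubed_sphere_map = 0  ! only for non-RRM grids using the older E3SM v1 dycore
/
```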
2C. Atmospheric mesh quality
Atmospheric RRM mesh quality can be measured with the "Max Dinv-based element distortion" metric. This will be printed in the log file for standalone HOMME or EAM simulations (and can be obtained from the log files during the topo generation step). It measures how distorted the elements become in the mesh transition region. It is the ratio of the two singular values of the 2x2 derivative matrix of the element map to the unit square, representing the ratio of the largest length scale to the smallest length scale. A grid of perfect quadrilaterals will have a value of 1.0. The equal-angle cubed-sphere grid has a value of 1.7. A high quality regionally refined grid will have a value less than 4. With a high quality grid, usually one can run with the timesteps used in a uniform grid with matching fine resolution. RRM grids with a value > 4 may require smaller timesteps for stability. Very large values indicate a problem with the grid and it should be redesigned.
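To make the metric concrete: for a 2x2 derivative matrix, the distortion is the ratio of its larger to its smaller singular value, which has a closed form. A self-contained sketch (the matrices below are illustrative, not taken from a real mesh):

```python
import math

def distortion(a, b, c, d):
    """Ratio of singular values of the 2x2 matrix [[a, b], [c, d]]."""
    # Singular values are sqrt of the eigenvalues of A^T A (closed form for 2x2).
    s = a*a + b*b + c*c + d*d            # trace of A^T A = smax^2 + smin^2
    det = abs(a*d - b*c)                 # |det A| = smax * smin
    disc = math.sqrt(max(s*s - 4*det*det, 0.0))
    smax = math.sqrt((s + disc) / 2.0)
    smin = math.sqrt((s - disc) / 2.0)
    return smax / smin

print(distortion(1, 0, 0, 1))  # 1.0: perfect (undistorted) quadrilateral
print(distortion(2, 0, 0, 1))  # 2.0: one length scale twice the other
```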
3. Generate mapping files
Requirements:
ESMF_RegridWeightGen
ncremap
grid descriptor files for each component that exists on a different grid
(atmosphere, ocean, possibly land if on a different grid than the atmosphere)
In order to pass data between different components at runtime, a set of mapping files between each component is generated offline. These mapping files will also be used in Step 4 below (generating domain files).
See Transition to TempestRemap for Atmosphere grids for a discussion of different remap algorithms and when to use each.
...
```
atm_grid_file=ne30pg2.g
atm_scrip_grid_file=ne30pg2_scrip.nc
ocn_grid_file=ocean.oEC60to30v3.scrip.181106.nc
lnd_grid_file=SCRIPgrid_0.5x0.5_nomask_c110308.nc

atm_name=ne30pg2
ocn_name=oEC60to30v3
lnd_name=r05

## Conservative, monotone maps.
alg_name=mono
date=200110

function run {
    echo "src $src dst $dst map $map"
    ncremap -a tempest --src_grd=$src --dst_grd=$dst -m $map \
        -W '--in_type fv --in_np 1 --out_type fv --out_np 1 --out_format Classic --correct_areas' \
        $extra
}

extra=""
src=$ocn_grid_file
dst=$atm_grid_file
map="map_${ocn_name}_to_${atm_name}_${alg_name}.${date}.nc"
run

src=$atm_grid_file
dst=$ocn_grid_file
map="map_${atm_name}_to_${ocn_name}_${alg_name}.${date}.nc"
extra=--a2o
run

extra=""
src=$lnd_grid_file
dst=$atm_grid_file
map="map_${lnd_name}_to_${atm_name}_${alg_name}.${date}.nc"
run

src=$atm_grid_file
dst=$lnd_grid_file
map="map_${atm_name}_to_${lnd_name}_${alg_name}.${date}.nc"
run

## Nonconservative, monotone maps.
alg_name=bilin

src=$atm_scrip_grid_file
dst=$lnd_grid_file
map="map_${atm_name}_to_${lnd_name}_${alg_name}.${date}.nc"
ncremap -a bilinear -s $src -g $dst -m $map -W '--extrap_method nearestidavg'

src=$atm_scrip_grid_file
dst=$ocn_grid_file
map="map_${atm_name}_to_${ocn_name}_${alg_name}.${date}.nc"
ncremap -a bilinear -s $src -g $dst -m $map -W '--extrap_method nearestidavg'
```
4. Generate domain files
Domain files are needed by the coupler and the land model at runtime. The land model uses the mask to determine where to run, and the coupler uses the land fraction to merge fluxes from multiple surface types to the atmosphere above them. Domain files are created from the mapping files created in the previous step, using a tool provided with CIME in ${e3sm_root}/cime/tools/mapping/gen_domain_files. This directory contains the source code for the tool (in Fortran 90) and a Makefile. Cloning E3SM is now required to obtain code within the distribution. To clone E3SM,
...
```
#!/bin/bash

# Setup environment
source /global/common/software/e3sm/anaconda_envs/load_latest_e3sm_unified_pm-cpu.sh
e3sm_root=${HOME}/codes/e3sm/branches/master

# Build gen_domain tool
gen_domain=${e3sm_root}/cime/tools/mapping/gen_domain_files/gen_domain
cd `dirname ${gen_domain}`/src
eval $(${e3sm_root}/cime/CIME/Tools/get_case_env)
${e3sm_root}/cime/CIME/scripts/configure --macros-format Makefile --mpilib mpi-serial
gmake

# Set paths to mapping files
mapping_root="/global/homes/b/bhillma/cscratch/e3sm/grids/ne4"
ocn_grid_name=oQU240
atm_grid_name=ne4np4
lnd_grid_name=${atm_grid_name}

# Run domain generation tool (from output directory)
domain_root=${mapping_root}
mkdir -p ${domain_root} && cd ${domain_root}
for target_grid_name in ${lnd_grid_name} ${atm_grid_name}; do
    # Find conservative mapping files, use the latest file generated
    map_ocn_to_target=`ls ${mapping_root}/map_${ocn_grid_name}_to_${target_grid_name}_monotr.*.nc | tail -n1`
    # Run domain tool code
    ${gen_domain} -m ${map_ocn_to_target} -o ${ocn_grid_name} -l ${target_grid_name}
done
```
NOTE - on Perlmutter the "OS" environment variable is not set, so to work around this, simply set "OS=LINUX".
If this command is successful, it should produce many domain files in ${output_root}/domain_file
that look something like
```
domain.lnd.${atm_grid}_${ocn_grid}.${datestring}.nc
domain.ocn.${atm_grid}_${ocn_grid}.${datestring}.nc
domain.ocn.${ocn_grid}.${datestring}.nc
domain.lnd.${lnd_grid}_${ocn_grid}.${datestring}.nc
domain.ocn.${lnd_grid}_${ocn_grid}.${datestring}.nc
```
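The naming pattern can be sketched as a small helper (the grid names and date string below are illustrative, not required values):

```python
# Sketch: expected domain file names for a given surface/ocean grid pairing,
# following the domain.<surface>.<grid>_<ocn_grid>.<date>.nc pattern above.
def domain_names(sfc_grid, ocn_grid, datestring):
    return [
        f"domain.lnd.{sfc_grid}_{ocn_grid}.{datestring}.nc",
        f"domain.ocn.{sfc_grid}_{ocn_grid}.{datestring}.nc",
        f"domain.ocn.{ocn_grid}.{datestring}.nc",
    ]

for name in domain_names("ne4np4", "oQU240", "200110"):
    print(name)
```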
5. Generate topography file
Generating the topography and related surface roughness data sets is a detailed process that has been moved to its own page, with detailed instructions depending on the model version (V1, V2, V3):
Atmospheric Topography Generation
6. Generate and spin-up a new atmosphere initial condition
Generating a new initial condition for the atmosphere is a two-step process. First, an existing initial condition is interpolated to the target resolution; then the interpolated initial condition is used to spin up a new initial condition that is in balance and consistent with the dynamics (without this step, the model will probably blow up if you try to run with the interpolated initial condition, at least for RRM grids). The spin-up requires initially lowering the timestep and increasing the hyperviscosity, and then gradually relaxing these back to more reasonable values. Both of these steps are described below.
Step 1: Generating a "first-guess" initial condition
A starting point for a new initial condition is first interpolated from an existing initial condition. Traditionally, this has been done using the interpic_new tool, which does both horizontal (for new grids) and vertical (for potentially different vertical grids/numbers of levels) interpolation of the fields in the initial condition file. This is a Fortran code that is included in E3SM, within the atmosphere tools directory. Unfortunately, the tool only supports interpolating from lat/lon grids, and cannot interpolate from unstructured to unstructured. So, if you want to use this tool to interpolate an existing initial condition to an SE grid, you will have to start with an older initial condition on an FV grid or similar. Thus, the use of interpic_new is NO LONGER SUPPORTED OR RECOMMENDED. The script below for building and using interpic_new is included only to document the process in case someone wants to revive this workflow. The script will not work on NERSC, as the paths do not exist:
```
#!/bin/bash

# Get machine-specific modules
env_mach_specific=env_mach_specific.sh
source ${env_mach_specific}

e3sm_root=/home/bhillma/codes/e3sm/branches/master
interp_root=${e3sm_root}/components/eam/tools/interpic_new
template_file=${interp_root}/template.nc
input_atm_ic_file=/projects/ccsm/inputdata/atm/cam/inic/fv/cami-mam3_0000-01-01_0.9x1.25_L30_c100618.nc
nlevels=72
atm_latlon_file=/gscratch/bhillma/e3sm/grids/conusx8v1/conusx8v1np4b_latlon_170526.nc

# Specify a file to pull information about vertical levels from
if [ ${nlevels} -eq 72 ]; then
    vertical_file=/projects/ccsm/inputdata/atm/cam/inic/homme/cami_mam3_Linoz_0000-01-ne120np4_L72_c160318.nc
elif [ ${nlevels} -eq 30 ]; then
    vertical_file=/projects/ccsm/inputdata/atm/cam/inic/homme/cami-mam3_0000-01-ne120np4_L30_c110928.nc
else
    echo "No input specified for nlevels=${nlevels}."
    exit 1
fi

datestring=`date +'%y%m%d'`
grid_name="conusx8v1"
output_root="/gscratch/bhillma/e3sm/grids/${grid_name}"
output_atm_ic_file=${output_root}/cami-mam3_0000-01-${grid_name}_L${nlevels}_c${datestring}.nc

# Copy horizontal coordinates to template file
/projects/ccsm/nco/bin/ncks -O -v lat,lon ${atm_latlon_file} ${template_file}

# Copy vertical coordinates from existing initial condition to template file
/projects/ccsm/nco/bin/ncks -A -v hyai,hybi,hyam,hybm ${vertical_file} ${template_file}

# Build tool
cd ${interp_root}
make clean
gmake

# Run the interpolation code
./interpic -t ${template_file} ${input_atm_ic_file} ${output_atm_ic_file}

# If input initial condition was from an FV grid, rename US->U, VS->V
# (the input file is hard-coded above, so yes we need to rename US and VS)
/projects/ccsm/nco/bin/ncrename -O -v US,U -v VS,V ${output_atm_ic_file}

# Update configuration file
if [ $? -eq 0 ]; then
    echo "Successfully created ${output_atm_ic_file}."
else
    echo "Something went wrong."
    exit 1
fi
```
...
Note that although this approach is flexible in the source grid, it does require an existing initial condition with the same vertical grid. However, NCO will now do vertical interpolation, and will also wrap TempestRemap horizontal remapping. This is well documented on the NCO homepage (http://nco.sourceforge.net/nco.html).
Step 2: Generate the atmosphere initial condition
There are two general approaches to generating a new atmospheric initial condition file: remapping an existing initial condition file or remapping reanalysis data. Both of these situations can be addressed with the new HICCUP tool.
Generating atmosphere initial condition data using HICCUP
The procedure to generate new initial conditions (outlined here) has been built into the HICCUP tool (see https://github.com/E3SM-Project/HICCUP ). HICCUP is a set of flexible and robust python routines to automate and streamline the task of generating a new atmospheric initial condition for E3SM. Most of these routines are essentially wrappers to NCO commands, but there are also routines that directly modify the data using xarray. HICCUP is set up to use ERA5 reanalysis data or to regrid an existing atmospheric initial condition file (horizontally, vertically, or both). HICCUP includes a surface adjustment routine that follows the original adjustment routine published by ECMWF, which was also reproduced in similar tools by Jerry Olson and Wuyin Lin. HICCUP was built with code readability and documentation as a top priority and includes some unit tests. HICCUP has also successfully been used to generate initial conditions for ne1024 grids using large memory analysis nodes available at HPC centers. Development of HICCUP is ongoing and new features are planned, so any feedback or feature requests are welcome.
The spin-up procedure described below is often not needed when using an initial condition derived from ERA5 reanalysis. However, for certain situations a series of spin-up simulations is required for stability. An example of this is when an RCE initial condition needs to be produced from an aqua planet initial condition.
Spinning up the atmosphere
The following procedure is copied from the recommendations in Mark and Colin's Google Doc https://docs.google.com/document/d/1ymlTgKz2SIvveRS72roKvNHN6a79B4TLOGrypPjRvg0/edit on running new RRM configurations (TODO: clean this up and update):
...
With a high quality grid, usually one can run with the uniform se_nsplit value and a slight increase in hypervis_subcycle. Standalone HOMME or EAM simulations using the new grid will compute the "Max Dinv-based element distortion" metric and print it in the log file. The equal-angle cubed-sphere grid has a value of 1.7. A high quality regionally refined grid will have a value less than 4.
...
During this tuning process, it is useful to compare the smallest 'dx' from the atmosphere log file to the smallest 'dx' from the global uniform high resolution run. Use the 'dx' based on the singular values of Dinv, not the 'dx' based on element area. If the 'dx' for newmesh.g is 20% smaller than the value from the global uniform grid, it suggests the advective timesteps might need to be 20% smaller, and the viscous timesteps might need to be 36% smaller (they go like dx^2, and 0.8^2 = 0.64). The code prints out CFL estimates that are rough approximations that can be used to check if you are in the correct ballpark.
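The scaling argument can be written out explicitly. Assuming the advective timestep limit scales like dx and the viscous limit like dx^2, as stated above:

```python
def timestep_factors(dx_ratio):
    """dx_ratio: smallest dx of the new mesh divided by that of the reference grid."""
    dt_advective = dx_ratio          # CFL for advection: dt ~ dx
    dt_viscous = dx_ratio ** 2       # diffusion/viscosity: dt ~ dx^2
    return dt_advective, dt_viscous

# dx 20% smaller than the uniform reference grid:
adv, visc = timestep_factors(0.8)
print(adv, round(visc, 2))  # 0.8 0.64
```

That is, the advective timestep shrinks to 80% and the viscous timestep to 64% of the reference values.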
7. Generate land surface data (fsurdat)
Requirements:
land grid descriptor file in SCRIP format
ESMF_RegridWeightGen
geographic distribution for each land surface type along with grid descriptor files for each of those surface types
mkmapdata.sh (found in components/elm/tools/mkmapdata/)
mksurfdata.pl (found in components/clm/tools/clm4_5/mksurfdata_map/)
...
the steps below are for maint1-0 code base. Post-v1 release changes (to add phosphorus) broke existing land initial condition files (finidat) and may require changes to this methodology.
the focus here is on creating an fsurdat file in cases where land use land cover change (LULCC) does NOT change. Additional steps will be needed to create a transient LULCC file.
questions for the land team are in red
NOTE: The land surface generation tool is currently BROKEN in E3SM master, following addition of FAN and fertilizer mods (PR: 5981). A PR was issued to fix this by removing the unsupported FAN dataset from surfdata files, but was subsequently closed (PR: 6237). For the time being, land surface data can be generated by checking out git hash
cd22d14bb8
(master immediately before 5981 was merged).
Steps:
Create mapping files for each land surface type if needed. An (older and deprecated) example of doing this can be found here. Updated instructions follow:
Obtain or generate a target grid file in SCRIP format. For this example, we will use a ne1024pg2 grid file, which we will need to create (note that most np4 grid files can be found within the inputdata repository; for example, the ne1024np4 grid file is at https://web.lcrc.anl.gov/public/e3sm/mapping/grids/ne1024np4_scrip_c20191023.nc). To generate the pg2 SCRIP file:
```
${tempest_root}/bin/GenerateCSMesh --alt --res 1024 --file ne1024.g
${tempest_root}/bin/GenerateVolumetricMesh --in ne1024.g --out ne1024pg2.g --np 2 --uniform
${tempest_root}/bin/ConvertMeshToSCRIP --in ne1024pg2.g --out ne1024pg2_scrip.nc
```
Get list of input grid files for each land surface input data file. This is done by running the components/elm/tools/mkmapdata/mkmapdata.sh script in debug mode to output a list of needed files (along with the commands that will be used to generate each map file; also make sure GRIDFILE is set to the SCRIP file from the above step):
```
cd ${e3sm_root}/components/elm/tools/mkmapdata
./mkmapdata.sh --gridfile ${GRIDFILE} --inputdata-path ${INPUTDATA_ROOT} --res ne1024pg2 --gridtype global --output-filetype 64bit_offset --debug -v --list
```
Download needed input grid files. The above command will output a list of needed files to clm.input_data_list. We need to download all of these before calling the script without the debug flag to actually perform the mapping. This is possible using check_input_data in CIME, but it needs to be done from a dummy case directory. So, one can create a dummy case, cd to that case, and then call ./check_input_data --data-list-dir <path where mkmapdata was run from> --download. However, this failed to connect to the CESM SVN server for me. So instead, I used the following one-off script:

```
#!/bin/bash
e3sm_inputdata_repository="https://web.lcrc.anl.gov/public/e3sm"
cesm_inputdata_repository="https://svn-ccsm-inputdata.cgd.ucar.edu/trunk"
inputdata_list=clm.input_data_list
cat $inputdata_list | while read line; do
    localpath=`echo ${line} | sed 's:.* = \(.*\):\1:'`
    url1=${e3sm_inputdata_repository}/`echo ${line} | sed 's:.* = \(inputdata/lnd/.*\):\1:'`
    url2=${cesm_inputdata_repository}/`echo ${line} | sed 's:.*\(inputdata/lnd/.*\):\1:'`
    if [ ! -f ${localpath} ]; then
        echo "${url1} -> ${localpath}"
        mkdir -p `dirname ${localpath}`
        cd `dirname ${localpath}`
        # Try to download using first URL; if that fails then use the second
        wget ${url1} || wget ${url2}
    else
        echo "${localpath} exists, skipping."
    fi
done
```
Create mapping files. Should just be able to run the above mkmapdata.sh command without the --debug --list flags. We need to append the --output-filetype 64bit_offset flag for our large files (no reason not to do this by default anyway). NOTE - This step requires NCL, which is no longer part of the E3SM unified environment. If the machine you are using does not have an NCL module, creating a custom environment that includes NCL is an easy workaround. Fixing this issue to avoid the NCL dependency will require rewriting the rmdups.ncl and mkunitymap.ncl scripts in another language (python+xarray would make sense). We will also need to write a version of the gc_qarea() function, unless the geocat project writes a port that we can use (see geocat issue #31).

```
./mkmapdata.sh --gridfile ${GRIDFILE} --inputdata-path ${INPUTDATA_ROOT} --res ne1024pg2 --gridtype global --output-filetype 64bit_offset -v
```
Notes for Perlmutter (2024) - If you come across an ESMF error that includes "This functionality requires ESMF to be built with the PIO library enabled" in the log file, then you are using a version of ESMF that was not compiled with MPI support. If using a conda env, you can check the type of build using conda list | grep esmf and look for "mpi" or "nompi" in the build tag. The ESMF library can also be problematic when creating the "HYDRO1K" map (make sure to use the SCRIP version of that grid!). Recent attempts to create this map resulted in segmentation faults, which seem to be a problem only in recent versions. In particular, version 8.6 was found to be problematic, but rolling back to version 8.0.1 was successful.
Compile surface dataset source code
NOTE: previously, editing the makefile ${e3sm_root}/components/elm/tools/mksurfdata_map/src/Makefile.common was required on most machines, but this was fixed by https://github.com/E3SM-Project/E3SM/pull/2757):
```
# Setup environment (should work on any E3SM-supported machine)
eval $(${e3sm_root}/cime/CIME/Tools/get_case_env)
${e3sm_root}/cime/CIME/scripts/configure --macros-format Makefile --mpilib mpi-serial
source .env_mach_specific.sh

# Build mksurfdata_map
cd ${e3sm_root}/components/elm/tools/mksurfdata_map/src
INC_NETCDF="`nf-config --includedir`" \
LIB_NETCDF="`nc-config --libdir`" \
USER_FC="`nc-config --fc`" \
USER_LDFLAGS="`nf-config --flibs`" \
make
```
Note for Perlmutter (Mar 2024 - Walter Hannah) - the build worked using the command below:

```
cd ${e3sm_root}/components/elm/tools/mksurfdata_map/src
${e3sm_root}/cime/CIME/scripts/configure && source .env_mach_specific.sh
make clean
INC_NETCDF="`nf-config --includedir`" \
LIB_NETCDF="`nc-config --libdir`" \
USER_FC="`nc-config --fc`" \
USER_LDFLAGS="-L`nc-config --libdir` -lnetcdf -lnetcdff" \
make
```
Run the mksurfdata.pl script in "debug" mode to generate the namelist (use year 2010 on ne120np4 grids as an example).
```
# For supported resolutions
# (use year 2010 on ne120np4 grids as an example)
cd $e3sm_dir/components/elm/tools/mksurfdata_map
./mksurfdata.pl -res ne120np4 -y 2010 -d -dinlc /global/cfs/cdirs/e3sm/inputdata -usr_mapdir /global/cfs/cdirs/e3sm/inputdata/lnd/clm2/mappingdata/maps/ne120np4

# For unsupported, user-specified resolutions
# (use year 2010 on ne50np4 grid as an example)
# (Assuming the mapping files created in step 1 have a time stamp of '190409' in the
# filenames and are located in '/whatever/directory/you/put/mapping/files')
./mksurfdata.pl -res usrspec -usr_gname ne50np4 -usr_gdate 190409 -y 2010 -d -dinlc /global/cfs/cdirs/e3sm/inputdata -usr_mapdir /whatever/directory/you/put/mapping/files
```
(However, ./mksurfdata.pl -h shows -y is by default 2010. When running without "-y" option, standard output says sim_year 2000. I suspect the mksurfdata.pl help information is wrong. To be confirmed.)
Modify the namelist file.
(Should the correct namelist settings be automatically picked up if the default land build namelist settings are modified accordingly?)
Time-evolving land use land cover change (LULCC) data should not be used for fixed-time compsets; instead, the LULCC information for that particular year should be used (right?)
Manually change to mksrf_fvegtyp = '/global/cfs/cdirs/e3sm/inputdata/lnd/clm2/rawdata/AA_mksrf_landuse_rc_1850-2015_06062017_LUH2/AA_mksrf_landuse_rc_2010_06062017.nc' for the F2010 ne120 compset.
Create the land surface data by interactive or batch job:
```
rm -f surfdata_ne120np4_simyr2010.bash
cat <<EOF >> surfdata_ne120np4_simyr2010.bash
#!/bin/bash
#SBATCH --job-name=mksurfdata2010
#SBATCH --account=acme
#SBATCH --nodes=1
#SBATCH --output=mksurfdata.o%j
#SBATCH --exclusive
#SBATCH --time=00:30:00
#SBATCH --qos=debug

# Load modules
module load nco
module load ncl
module load cray-netcdf
module load cray-hdf5

# mksurfdata_map is dynamically linked
export LIB_NETCDF=$NETCDF_DIR/lib
export INC_NETCDF=$NETCDF_DIR/include
export USER_FC=ifort
export USER_CC=icc
export USER_LDFLAGS="-L$NETCDF_DIR/lib -lnetcdf -lnetcdff -lnetcdf_intel"
export USER_LDFLAGS=$USER_LDFLAGS" -L$HDF5_DIR/lib -lhdf5 -lhdf5_fortran -lhdf5_cpp -lhdf5_fortran_intel -lhdf5_hl_intel -lhdf5hl_fortran_intel"

cd /global/homes/t/tang30/ACME_code/MkLandSurf/components/clm/tools/clm4_5/mksurfdata_map
CDATE=c`date +%y%m%d` # current date
./mksurfdata_map < namelist
EOF
sbatch surfdata_ne120np4_simyr2010.bash
```
The land surface data in NetCDF format will be created in the current directory. (How to verify the file is correct?)
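One partial answer to the verification question: in a valid surface dataset, the land unit percentages should sum to 100 in every column. A sketch in Python (field names follow ELM surfdata conventions such as PCT_NATVEG and PCT_CROP; the NetCDF read, e.g. with netCDF4 or xarray, is omitted):

```python
# Sanity check: land unit percentages should sum to 100 in every column.
def landunits_sum_to_100(pcts, tol=1e-4):
    """pcts: mapping of land unit name -> list of per-column percentages."""
    ncol = len(next(iter(pcts.values())))
    totals = [sum(v[i] for v in pcts.values()) for i in range(ncol)]
    return all(abs(t - 100.0) < tol for t in totals)

# Toy two-column example:
toy = {"PCT_NATVEG": [60.0, 100.0], "PCT_CROP": [30.0, 0.0], "PCT_GLACIER": [10.0, 0.0]}
print(landunits_sum_to_100(toy))  # True
```

For a real file, pass in the percentage fields present in the generated dataset; a failing column points at a remapping or normalization problem.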
8. Generate a new land initial condition (finidat)
Three options:
cold start: finidat="", no file necessary. Lets us get up and running, but not suitable for climate science applications
Interpolate a spun-up state from a previous simulation. This is reasonable for many applications, but not suitable for official published E3SM simulations.
Spin up a new initial condition following best practices from land model developers.
...
What I would recommend in general is to start the spin-up process with a land cold-start condition, using reanalysis data atmosphere and having all the other land settings just the way you want them for the eventual coupled simulation (resolution, domain file, land physics and BGC settings). Then run for enough years to get an approximate steady state (the number depends on what land options you are using; no BGC means a lot fewer years). Then use the resulting restart file as the initial condition for a coupled run that has at least the same atmosphere settings you will eventually use for your production run. Save high-frequency output for multiple years (preferably 10 or more, but a high res run might not have that luxury). Use that atm output to drive a second offline land spin-up, to equilibrate the land to the expected initial climate from the atmosphere. Then use the resulting land restart as finidat for the start of your fully-coupled spin-up simulation, and let it run for a while to assess drifts in all coupled components. Only after you are satisfied that everything is stable are you in a safe state to begin a science experiment production run. An off-the-shelf finidat file is not likely to save you much time in this process, because it will not be spun up to your experimental conditions and tolerances.
9. Create a new atmospheric dry deposition file
From the README for the mkatmsrffile tool at components/eam/tools/mkatmsrffile:
Atmospheric dry deposition at the surface depends on certain surface properties, including soil and land use properties. In most cases these calculations can be handled in the land model and passed to the atmosphere through the coupler. This is the default namelist setting, drydep_method='xactive_lnd'. However, with modal aerosols this method is not adequate, and we must recalculate these fields in the atmosphere (see subroutine interp_map in mo_drydep.F90). For unstructured grids it was determined to create this offline interpolation tool rather than generalize the subroutine interp_map.
...
Change directory to the tool root:
Code Block cd components/eam/tools/mkatmsrffile
Create a .env_mach_specific.sh by running:
Code Block ../../../../cime/tools/configure --macros-format=Makefile
Load the machine-specific environment settings:
Code Block source .env_mach_specific.sh
Make sure the NETCDF_ROOT and FC environment variables are set correctly for your system, then build the executable. On Cori:
Code Block env NETCDF_ROOT=$NETCDF_DIR FC=ifort make
Edit nml_atmsrf to update the input file paths, then run the newly built executable:
Code Block ./mkatmsrffile
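Collected into one shell session, the steps above look roughly like this. The NETCDF_ROOT/FC settings are Cori-specific assumptions; adjust them for your machine.

```shell
# Build and run mkatmsrffile (paths relative to the E3SM repository root)
cd components/eam/tools/mkatmsrffile
../../../../cime/tools/configure --macros-format=Makefile  # writes .env_mach_specific.sh
source .env_mach_specific.sh                               # machine-specific environment
env NETCDF_ROOT=$NETCDF_DIR FC=ifort make                  # Cori-specific settings
# Edit nml_atmsrf to point at your grid's input files, then:
./mkatmsrffile
```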
...
The output file produced using the above procedure was compared against an existing file (/project/projectdirs/e3sm/inputdata/atm/cam/chem/trop_mam/atmsrf_ne30np4_110920.nc) using a script from Peter Caldwell. The following figures show the comparison:
...
10. Create a new compset and/or new supported grid by modifying CIME's xml files
???
11. Implement tests for the new grid
(TODO: develop this section. One example might be to include a case-building test as done in ${e3sm_root}/cime/config/e3sm/tests.py)
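As a concrete starting point, a build-only smoke test with CIME's create_test can confirm that the new grid alias resolves and that a case compiles. This is a sketch: the test name below (time limit, grid alias, compset) is an illustrative example, not a supported test.

```shell
# From the E3SM repository root: create and build (but do not run) a short
# smoke test on the new grid. SMS_Ld1 and F2010 are illustrative choices;
# substitute your new grid alias.
cd cime/scripts
./create_test SMS_Ld1.ne120pg2_r05_oECv3.F2010 --no-run
```

If the grid alias or its component grids are missing from the XML files in step 10, this fails early at case creation, which is exactly the feedback wanted.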
12. Adding a new ocean and sea-ice mesh
If you wish to add a new ocean and sea-ice mesh in addition to (or instead of) a new atmosphere grid, you will need to use the compass tool to generate the mesh and a dynamically adjusted initial condition. This procedure is detailed in a separate tutorial:
https://mpas-dev.github.io/compass/latest/tutorials/dev_add_rrm.html
The new ocean/sea-ice mesh should be generated before the mapping and domain files (steps 3 and 4 above).
...
Summary
After reading through the info above, I (Peter Caldwell) created lists of stuff we should create tests for, merge to master, and revise to avoid the dual-grid dependency. Ben Hillman - am I missing anything?
...
TempestRemap for generating uniform grids (in Paul's external git repo - may have its own tests?)
SQuadGen for generating RRM grids (in Paul's external repo - may have its own tests?)
Generate topography via Atmospheric Topography Generation; needs utilities components/eam/tools/topo_tool/cube_to_target and components/homme/test/tool
run ncremap (an NCO command) to generate mapping files
cime/tools/mapping/gen_domain_files
mksurfdata.pl to generate the namelist needed to make fsurdat file
use mksurfdata_map for fsurdat
use the interpic_new tool to regrid atmos state to new grid for initial condition
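For the ncremap step in the list above, the invocation for a pair of SCRIP grid template files looks roughly like the following; the grid and map filenames are placeholders.

```shell
# Generate a conservative FV->FV mapping file with ncremap.
# -a selects the algorithm; -s/-g are the source/destination SCRIP grids;
# -m names the output map file.
ncremap -a conserve \
        -s src_grid_scrip.nc \
        -g dst_grid_scrip.nc \
        -m map_src_to_dst_aave.nc
```

The same pattern, with a different -a algorithm, covers the other map types needed by the coupler (e.g. bilinear maps via ESMF for state fields).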
...
smoothtopo.job
Script used to run HOMME to apply dycore-specific smoothing to interpolated topography. It would be nice for this to run via command-line arguments rather than requiring edits to the script (which would make it easier to include in an automated workflow), and we should remove the dependence on NCL since it is not guaranteed to be available. Replaced with "homme_tool", 2020/5 (see Atmospheric Topography Generation).
makegrid.job
Script used to run HOMME+NCL to produce the non-optimized dualgrid and lat-lon descriptions of the SE grid. Again, it would be nice for this to run via command-line arguments rather than requiring edits to the script (which would make it easier to include in an automated workflow), and we should remove the dependence on NCL since it is not guaranteed to be available. TR and PG2 grids make this obsolete - we no longer need the "dualgrid".
Land surface data scripts (TODO: add specifics about what needs to change here)
...