Warning

  NERSC Directory Change Notice

Due to the project's name change at NERSC from 'ACME' to 'E3SM' and NERSC's file system update, the directory '/project/projectdirs/acme/' is now '/cfs/cdirs/e3sm'.



The purpose of this page is to document the procedure for adding support for new atmosphere grids. The process should be the same for new uniform resolutions as well as for new regionally-refined meshes, although some settings will need to be changed for new regionally-refined mesh configurations. This page is a work in progress and will be updated as this process is refined and (eventually) made more automated. This documentation is an update of a document written by Mark Taylor and Colin Zarzycki, available as a Google Doc (linked in the RRM timestep discussion below). Related page: /wiki/spaces/NGDNA/pages/915408074.

Similar page for MPAS grids:  /wiki/spaces/ECG/pages/1479835665


| File | Tool | Type | Note |
| --- | --- | --- | --- |
| | ncks, ncremap | NCO | |
| mapping files and mesh template files | TempestRemap | C++ | GenerateCSMesh: make cubed-sphere Exodus (.g) files (spectral element "np" grid). GenerateVolumetricMesh: create an FV "pg" grid from a spectral element "np" grid. ConvertExodusToSCRIP: convert an FV "pg" Exodus file into a SCRIP file. GenerateOverlapMesh: used in making mapping files. GenerateOfflineMap: generate mapping files; the only tool that can make mapping files directly from SE Exodus files. |
| RRM Exodus (.g) mesh files | SQuadGen | C++ | |
| topo files | homme_tool | Fortran | Included with E3SM; should build and run on any system that can run the HOMME_P24.f19_g16_rx1.A test. Generates (obsolete) SCRIP files for the spectral element "np" dual grid. Used for topo smoothing for both FV "pg" and SE "np" grids. Can also do parallel interpolation from the SE np4 grid to any SCRIP grid. |
| topo files | cube_to_target | Fortran | NCAR utility for generating unsmoothed topography from high-res USGS data, and for generating surface roughness fields from smoothed topography. |
| mapping files | ESMF_RegridWeightGen | | Makes FV->FV mapping files from SCRIP grid template files. Only tool that supports the monotone 2nd-order "bilin" map. |
| domain files | gen_domain_files | CIME and ELM tools | |
| land surface dataset | mksurfdata.pl, mksurfdata_map | Perl and Fortran | |
| ELM initial condition | interpinic | Fortran | Four options: (1) cold start (no IC file; only suitable for testing functionality in other components); (2) run a long spinup with prescribed atmosphere; (3) interpolate a spun-up IC file from a different simulation via the "interpinic" utility; (4) inline interpolation for the land initial condition is available in E3SM, but this capability might be broken by the new land subgrid structure. |




For the purpose of this step-by-step guide, we will walk through each step using the example of regenerating the ne4 grid that is already supported. For a new grid (e.g., ne512), just replace ne4 with the desired grid identifier. This identifier is determined in Step 1 ("Generate a new grid file") below, and the rest should follow.

...

See SE Atmosphere Grid Overview (EAM & CAM) for a description of the spectral elements, GLL nodes, subcell grid, and dual grid.

  • Exodus file: "ne4.g". This is a netCDF file following Exodus conventions. It gives the corners of all elements on the sphere and their connectivity. It is independent of the polynomial order used inside the element ("np").

    • This file is used by TempestRemap (TR) to generate mapping files. The polynomial order is a command-line option, and the GLL nodes are generated internally by TR.

  • latlon file: "ne4np4_latlon.nc". This file contains a list of all the GLL nodes in the mesh (in latitude/longitude coordinates). The list of GLL nodes must be in the internal HOMME global id ordering, matching the ordering used in CAM and EAM native grid output. It also contains the connectivity of the GLL subcell grid.

    • This file is used by CAM's interpic_new utility, and by the graphics programs ParaView and VisIt when plotting native grid output.

  • SCRIP file: "ne4np4_scrip.nc". This file contains a description of the GLL dual grid, in the format used by the original incremental remap tool SCRIP. It includes the locations of the GLL nodes and artificial bounding polygons around those nodes. Ideally the spherical area of each polygon will match the GLL weight ("exact GLL areas"), but not all tools can achieve exact areas. Inexact areas do not impact the accuracy of the resulting mapping algorithm; it just means that mass will not be exactly conserved by the mapping.
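A quick way to sanity-check these grid template files is to inspect the SCRIP file's header with ncks (from NCO, listed in the tool table above); the dimension and variable names below follow the SCRIP convention:

Code Block
# Print metadata only; expect grid_size and grid_corners dimensions plus
# grid_center_lat/lon, grid_corner_lat/lon, grid_area, and grid_imask variables
ncks -m ne4np4_scrip.nc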


...

Step-by-step guide

1. Generate a new "grid" file

...

TempestRemap needs to be built from source from the GitHub repository. This is straightforward on Cori and Edison. Note that a parallel version of netCDF should be used. This can be accomplished on Cori/Edison by executing module swap cray-netcdf cray-netcdf-hdf5parallel, or by sourcing an env_mach_specific.sh from a working case on the target machine before building. Then,
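A minimal build sketch (the repository URL points at the TempestRemap GitHub project; the make invocation is an assumption, so consult the TempestRemap README for the authoritative steps):

Code Block
git clone https://github.com/ClimateGlobalChange/tempestremap.git
cd tempestremap
# On Cori/Edison, load a parallel netCDF first:
#   module swap cray-netcdf cray-netcdf-hdf5parallel
make all   # executables (GenerateCSMesh, GenerateOfflineMap, ...) should land in ./bin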

...

2A. Generate "dual grid" mesh files for E3SM v2 "pg2" grids 

Requirements:

  • exodus mesh file

  • TempestRemap

In E3SM v2, we will be switching to running physics on an FV pg2 grid, and mapping files between the atmosphere and other components will be FV-to-FV-type maps using the pg2 grids. These can be generated by TempestRemap directly from the Exodus mesh file:
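For example, to produce the ne4 pg2 mesh and SCRIP files (the same GenerateVolumetricMesh/ConvertExodusToSCRIP sequence used for ne1024 in the land surface section below; file names are illustrative):

Code Block
${tempest_root}/bin/GenerateVolumetricMesh --in ne4.g --out ne4pg2.g --np 2 --uniform
${tempest_root}/bin/ConvertExodusToSCRIP --in ne4pg2.g --out ne4pg2_scrip.nc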

...

2B. Generate "dual grid" mesh files (SCRIP and lat/lon format) for E3SM v1 "np4" GLL grids

Requirements:

  • exodus mesh file

  • Matlab or Fortran utility.

Note: in E3SM v2, we will be switching to running physics on an FV pg2 grid, and mapping files between the atmosphere and other components will be FV-to-FV-type maps using the pg2 grids. The spectral element "np4" grid is still used internally by the dynamics and for initial conditions, so the metadata described in this section is still needed for some analysis and for the preparation of initial conditions.

...

There are three separate codes that can create a dual grid from our SE grids:

  1. Matlab code.  This code produces the best SCRIP files, with nice (non-starlike) control volumes around each GLL node and exact areas (spherical area = GLL weight). But the serial code is painfully slow and thus is only practical for grids with up to ~100,000 elements. This code only supports the element-local map. Noel Keen profiled the Matlab script and found that, at least for the ne30 case, ~90% of the time is spent computing the area of a polygon, which is called millions of times. There may be some simple optimizations to try short of a rewrite.

  2. Fortran code.  This utility is included with standalone HOMME. It runs quickly, in parallel, and produces exact GLL areas for cubed-sphere meshes. It supports both map types via namelist options. But for RRM meshes it produces suboptimal control volumes (including star-shaped control volumes) with only approximate areas (spherical area ≠ GLL weight). The resulting RRM SCRIP file is thus not acceptable for generating mapping files for coupled simulations, but it should be fine for generating surface roughness fields.

  3. NCAR utility.  Will update when we find out more about this.  

For the majority of use cases, optimizing the area and shape of each polygon in the dual grid is probably not necessary, and a simple (and fast) approach can be taken using the Fortran code (plus some NCL scripts), which is packaged with HOMME. For conservative remapping, optimal areas that exactly match the GLL weights are probably required, and the slower Matlab code will most likely need to be used. (NOTE: we plan to rewrite the Matlab code in a compiled language to work with higher-resolution grids, but for the time being we will work with the less accurate Fortran/NCL codes for the few use cases where we still need the SCRIP files.)

...

To run the Fortran code on an E3SM-supported platform (updated 2020/5/25 with the updated tool from https://github.com/E3SM-Project/E3SM/pull/3593):

  1. Be sure your environment matches the software environment loaded by E3SM by executing the output of this command:   e3sm/cime/scripts/Tools/get_case_env

  2. Use cmake to configure and compile standalone HOMME. On a supported platform with the CIME environment, this should work out of the box. See e3sm/components/homme/README.cmake

  3. compile the HOMME tool utility: 

    1. cd /path/to/workingdir 

    2. make -j4 homme_tool

    3. executable:   /path/to/workingdir/src/tool/homme_tool

  4. Edit e3sm/components/homme/test/tool/namelist/template.nl and specify the grid resolution or RRM file (a namelist sketch follows this list)

    1. For ne512, set ne = 512. For RRM grids, leave ne = 0, but edit the namelist to point to the Exodus grid file

    2. For non-RRM grids using the older E3SM v1 dycore, add cubed_sphere_map=0 to template.nl

  5. See e3sm/components/homme/test/tool/test.job for examples of how to run the tool and then use NCL utilities to process the tool output into SCRIP and latlon formats.
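A minimal sketch of the template.nl edits (ne and cubed_sphere_map are HOMME namelist variables; the mesh_file entry is an assumed name for the RRM grid-file setting, so verify it against the shipped template.nl):

Code Block
&ctl_nl
  ne = 512                       ! uniform cubed-sphere resolution; leave 0 for RRM grids
  ! mesh_file = '/path/to/rrm.g' ! RRM only (assumed variable name; check template.nl)
  cubed_sphere_map = 0           ! non-RRM grids with the older E3SM v1 dycore
/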

Specific details for running at NERSC on Cori (KNL):

  1. Create a batch script, and change "account" in the sbatch directives at the top of the script. For example, set #SBATCH --account=e3sm

  2. cmake -C /path/to/e3sm/components/homme/cmake/machineFiles/cori-knl.cmake  -DPREQX_NP=4 /path/to/workingdir

  3. Make sure a working NCL is in your PATH. On Cori, add the following to the script: module load ncl.

With these changes, run makegrid.job TWICE to build and then run the code. The first pass builds HOMME, and the second pass runs HOMME and the NCL utilities. In an HPC environment, the second pass needs to be submitted to the batch queue via sbatch -C knl makegrid.job. This will create the grid descriptor files in ${HOME}/scratch1/preqx/template.

The Matlab Code

The tool chain described below only supports the element-local map. As noted above, this method produces the highest-quality output but is extremely slow (days to weeks) to run. Our preferred mapping algorithms (see Transition to TempestRemap for Atmosphere grids) also only support the element-local map.

...

Now we can run the code by executing the run_dualgrid.sh script from the ${PreAndPostProcessingScripts}/regridding/spectral_elements_grid_utilities/compute_dualgrid directory. Note that a working Matlab environment is required to run the code. On NERSC, this can be accomplished by simply executing module load matlab before running the run_dualgrid.sh script.

...

3. Generate mapping files

Requirements:

  • TempestRemap

  • ESMF_RegridWeightGen

  • ncremap

  • grid descriptor files for each component that exists on a different grid
    (atmosphere, ocean, possibly land if on a different grid than the atmosphere)

In order to pass data between different components at runtime, a set of mapping files between each pair of components is generated offline. These mapping files will also be used in Step 4 below (generating domain files).

...

The easiest way to make sure ncremap is up to date and will work for this workflow is to source the E3SM-Unified conda environment. Instructions for supported machines can be found here. For Edison and Cori, it is as simple as:
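One plausible invocation once the environment is installed (the environment name is an assumption; the machine-specific activation script from the instructions linked above is preferred):

Code Block
conda activate e3sm_unified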

...

In the above code, the "-P mwf" option triggers a specific procedure that invokes multiple ncremap commands, each of which invokes multiple TempestRemap commands. It produces three mapping files each for a2o and o2a. This multitude of mapping files for each component→component case is needed because fluxes need conservative mapping and states need non-conservative mapping. Because "-P mwf" results in a great deal of output being flushed to the screen, it is mostly suppressed by default. To see all the output, or to figure out how to run all the NCO and TempestRemap commands one by one, add "--dbg_lvl=2" to the ncremap command. This will print the commands but not execute them. The user can then run one of the printed commands, again with "--dbg_lvl=2", to see the actual commands being sent to TempestRemap.
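A sketch of the kind of ncremap call being described (grid file names are illustrative; -s and -g select the source and destination grid files, and --nm_src/--nm_dst/--dt_sng set the names and date string that appear in the output map-file names):

Code Block
ncremap -P mwf -s ocean.oEC60to30v3.scrip.nc -g ne4pg2_scrip.nc \
        --nm_src=oEC60to30v3 --nm_dst=ne4pg2 --dt_sng=20200501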

...

Code Block
map_${atm_grid}_to_${ocn_grid}_${method}.${datestring}.nc
map_${ocn_grid}_to_${atm_grid}_${method}.${datestring}.nc
map_${atm_grid}_to_${lnd_grid}_${method}.${datestring}.nc
map_${lnd_grid}_to_${atm_grid}_${method}.${datestring}.nc
map_${lnd_grid}_to_${ocn_grid}_${method}.${datestring}.nc
map_${ocn_grid}_to_${lnd_grid}_${method}.${datestring}.nc

where ${ocn_grid}, ${atm_grid}, and ${lnd_grid} are the names of the ocean, atmosphere, and land grids provided above in the --nm_src and --nm_dst arguments, and ${datestring} is the date string provided above in the --dt_sng argument to ncremap. The ${method} can be monotr, highorder, mono, intbilin, aave, blin, ndtos, nstod, or patc. (TODO: What do all these mean?) If using the tri-grid option, the land grid files will be created from the ${lnd_grid}.

...

5. Generate topography file 

Requirements:

  • High resolution USGS topography file named USGS-topo-cube3000.nc (located in the CESM inputdata server here; note this is on a 3 km cubed sphere grid)

  • SCRIP-format atmosphere grid file

  • topography tool (components/cam/tools/topo_tool/cube_to_target)

  • homme_tool  (same tool used to create SCRIP files described above).

Topography needs to be interpolated from a high resolution USGS file, and then doctored up a bit to allow the model to run stably with the new topography. The tool chain used to compute topography is documented in the following paper:  

...

Step (b) requires running homme_tool's smoothing algorithm. The namelist should be edited to specify the resolution (ne value) or the RRM grid file, and the amount of smoothing; a namelist sketch follows.
(TODO: allow running this script with command-line arguments instead of manual editing so that the task can be automated.)
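A minimal sketch of the smoothing-related namelist entries (smooth_phis_numcycle and smooth_phis_nudt are HOMME namelist variables, but the values below are purely illustrative; consult the shipped namelist for values appropriate to your resolution):

Code Block
&ctl_nl
  ne = 30                     ! uniform resolution; leave 0 and point at the Exodus file for RRM
  smooth_phis_numcycle = 16   ! number of smoothing passes (illustrative)
  smooth_phis_nudt = 28e7     ! smoothing coefficient (illustrative; scales with resolution)
/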

For physics and dynamics on the GLL grid (E3SM V1): see Topography generation for GLL grids

...

Two options are now available to generate the atmosphere initial condition. The first method is documented on the page Generate atm initial condition from analysis data. The second method is documented here in the section Spinning up the atmosphere.

Option 1:  Generate atm initial condition from analysis data

...

The following procedure is copied from the recommendations in Mark and Colin's Google Doc https://docs.google.com/document/d/1ymlTgKz2SIvveRS72roKvNHN6a79B4TLOGrypPjRvg0/edit on running new RRM configurations (TODO: clean this up and update):

Determining stable timestep values is also complicated by spinup. If the initial condition file is badly out of balance (say it came from a real-planet run or a very different aqua-planet simulation), then you may require very small timesteps and larger hyperviscosity coefficients in order to get past an initial adjustment period. Only perform the tuning of se_nsplit and hypervis_subcycle with a spun-up initial condition file.

With a high-quality grid, one can usually run with the uniform se_nsplit value and a slight increase in hypervis_subcycle. The latest version of HOMME estimates this number and prints out "Max Dinv-based element distortion". The equal-angle cubed-sphere grid has a value of 1.7. A high-quality regionally refined grid will have a value less than 4.
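To check this for a new grid, search the atmosphere log for the quoted string (the log path is illustrative):

Code Block
grep "Max Dinv-based element distortion" /path/to/case/run/atm.log.*
# equal-angle cubed sphere: ~1.7; a good RRM grid: < 4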

Recommended procedure:

  • Run 10 days with very large se_nsplit and hypervis_subcycle values and create a new IC file (see INITHIST in CAM). For aqua planet, this step is often not necessary, but for simulations with topography it is critical. One may also need to reduce dtime and increase the viscosity coefficients. In an extreme case, something like this might be needed:

    1. run 5-10 days with dtime 3x smaller than the default  (and viscosity coefficients 2x larger)

    2. restarting from step a, run 5-10 days with dtime 3x smaller and default viscosity coefficients

    3. restarting from step b, run 5-10 days with dtime 2x smaller and default viscosity coefficients

    4. restarting from step c, run 5-10 days with default dtime and se_nsplit 2x larger  

    5. restarting from step d, run with all default parameters

  • Use this new IC file for all future runs below

  • First, determine a stable value of se_nsplit. To do this, first ensure the viscosity timestep is not causing problems (this is especially important if nu_div > nu), so start with a large value of hypervis_subcycle, say hypervis_subcycle=20

  • Find the smallest value of se_nsplit for which the code is stable using 1-month runs. Start with the se_nsplit used by the corresponding high-resolution uniform grid (assuming they all have the same physics timestep, dtime)

  • Once a stable value of se_nsplit has been found, decrease hypervis_subcycle until the smallest stable value is found. Note that these are not independent: you have to find the stable value of se_nsplit before finding the stable value of hypervis_subcycle.

  • Final step: the procedure outlined above can find timesteps that are borderline unstable but don't blow up due to various dissipation mechanisms in CAM. Hence it is a good idea to run 3 months and look at the monthly mean OMEGA500 from the 3rd month. This field will be noisy, but there should not be any obvious grid artifacts. Weak instabilities can be masked by the large transients in flow snapshots, so it is best to look at time averages.

  • Note that for simulations with topography, we often increase nu_div.  This can trigger a restrictive CFL condition which requires reducing hypervis_subcycle.  

During this tuning process, it is useful to compare the smallest 'dx' from the atmosphere log file to the smallest 'dx' from the global uniform high-resolution run. Use the 'dx' based on the singular values of Dinv, not the 'dx' based on element area. If the 'dx' for newmesh.g is 20% smaller than the value from the global uniform grid, it suggests the advective timesteps might need to be 20% smaller, and the viscous timesteps might need to be 44% smaller (they go like dx^2). The code prints out CFL estimates that are rough approximations, which can be used to check whether you are in the correct ballpark.
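The knobs involved are standard EAM/HOMME namelist variables; a sketch of a user_nl_eam fragment for the tuning runs (values are illustrative, not recommendations):

Code Block
! user_nl_eam fragment; values illustrative only
 se_nsplit = 4            ! dynamics substeps per physics timestep (dtime)
 hypervis_subcycle = 20   ! start large, then decrease once se_nsplit is stable
 nu_div = 2.5e15          ! often increased with topography; may tighten the viscous CFL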

7. Generate land surface data (fsurdat)

Requirements:

Overview:

A large number of input files need to be interpolated for the land model. This (rather awkward) workflow uses a chain of tools that downloads the inputdata and grid descriptor files for each input dataset, generates mapping files from each input data grid to our target grid, and then applies the mapping weight files to do the interpolation.

Notes:

  • the steps below are for the maint1-0 code base. Post-v1-release changes (to add phosphorus) broke existing land initial condition files (finidat) and may require changes to this methodology.

  • the focus here is on creating an fsurdat file in cases where land use land cover change (LULCC) does NOT change. Additional steps will be needed to create a transient LULCC file.

  • questions for the land team are in red

Steps:

  1. Create mapping files for each land surface type if needed. An (older and deprecated) example of doing this can be found here. Updated instructions follow:

    1. Obtain or generate a target grid file in SCRIP format. For this example, we will use an ne1024pg2 grid file, which we will need to create (note that most np4 grid files can be found within the inputdata repository; for example, the ne1024np4 grid file is at https://web.lcrc.anl.gov/public/e3sm/mapping/grids/ne1024np4_scrip_c20191023.nc). To generate the pg2 SCRIP file:

      Code Block
      ${tempest_root}/bin/GenerateCSMesh --alt --res 1024 --file ${output_root}/ne1024.g
      ${tempest_root}/bin/GenerateVolumetricMesh --in ${output_root}/ne1024.g --out ${output_root}/ne1024pg2.g --np 2 --uniform
      ${tempest_root}/bin/ConvertExodusToSCRIP --in ne1024pg2.g --out ne1024pg2_scrip.nc
       
    2. Get the list of input grid files for each land surface input data file. This is done by running the components/clm/tools/shared/mkmapdata/mkmapdata.sh script in debug mode to output a list of needed files (along with the commands that will be used to generate each map file; also make sure GRIDFILE is set to the SCRIP file from the above step):

      Code Block
      languagebash
      cd ${e3sm_root}/components/clm/tools/shared/mkmapdata
      ./mkmapdata.sh --gridfile ${GRIDFILE} --inputdata-path ${INPUTDATA_ROOT} --res ne1024pg2 --gridtype global --output-filetype 64bit_offset --debug -v --list
    3. Download needed input grid files. The above command will output a list of needed files to clm.input_data_list. We need to download all of these before calling the script without the debug flag to actually perform the mapping. This is possible using check_input_data in CIME, but it needs to be done from a dummy case directory. So, one can create a dummy case, cd to that case, and then call ./check_input_data --data-list-dir <path where mkmapdata was run from> --download. However, this failed to connect to the CESM SVN server for me, so instead I used the following one-off script:

      Code Block
      #!/bin/bash
      e3sm_inputdata_repository="https://web.lcrc.anl.gov/public/e3sm"
      cesm_inputdata_repository="https://svn-ccsm-inputdata.cgd.ucar.edu/trunk"
      inputdata_list=clm.input_data_list
      cat $inputdata_list | while read line; do
          localpath=`echo ${line} | sed 's:.* = \(.*\):\1:'`
          url1=${e3sm_inputdata_repository}/`echo ${line} | sed 's:.*\(inputdata/lnd/.*\):\1:'`
          url2=${cesm_inputdata_repository}/`echo ${line} | sed 's:.*\(inputdata/lnd/.*\):\1:'`
          if [ ! -f ${localpath} ]; then
              echo "${url1} -> ${localpath}"
              mkdir -p `dirname ${localpath}`
              # Run the download in a subshell so the cd does not persist across
              # loop iterations; try the E3SM server first, then the CESM SVN server
              ( cd `dirname ${localpath}` && { wget ${url1} || wget ${url2}; } )
          else
              echo "${localpath} exists, skipping."
          fi
      done
    4. Create mapping files. It should now be possible to run the above mkmapdata.sh command without the --debug --list flags. We need to keep the --output-filetype 64bit_offset flag for our large files (no reason not to do this by default anyway):

      Code Block
      ./mkmapdata.sh --gridfile ${GRIDFILE} --inputdata-path ${INPUTDATA_ROOT} --res ne1024pg2 --gridtype global --output-filetype 64bit_offset -v
  2. Compile surface dataset source code (NOTE: ${e3sm_root}/components/clm/tools/clm4_5/mksurfdata_map/src/Makefile.common needs to be edited to build on most machines; this is fixed in https://github.com/E3SM-Project/E3SM/pull/2757):

    Code Block
    # Setup environment (should work on any E3SM-supported machine)
    ${e3sm_dir}/cime/tools/configure --macros-format=Makefile && source .env_mach_specific.sh
    
    # Set environment variables expected by the mksurfdata_map Makefile;
    # Note that NETCDF_DIR is probably specific to NERSC and may need
    # to be adjusted for other systems
    export LIB_NETCDF=$NETCDF_DIR/lib
    export INC_NETCDF=$NETCDF_DIR/include
    export USER_FC=ifort
    export USER_CC=icc
    
    # Build mksurfdata_map
    cd $e3sm_dir/components/clm/tools/clm4_5/mksurfdata_map/src/ && gmake


  3. Run the mksurfdata.pl script in "debug" mode to generate the namelist (using year 2010 on the ne120np4 grid as an example).

    Code Block
    # For supported resolutions
    #(use year 2010 on ne120np4 grids as an example)
    cd $e3sm_dir/components/clm/tools/clm4_5/mksurfdata_map
    ./mksurfdata.pl -res ne120np4 -y 2010 -d -dinlc /global/project/projectdirs/acme/inputdata -usr_mapdir /global/project/projectdirs/acme/inputdata/lnd/clm2/mappingdata/maps/ne120np4
    
    # For unsupported, user-specified resolutions
    # (use year 2010 on ne50np4 grid as an example)
    # (Assuming the mapping files created in step 1 have a time stamp of '190409' in the filenames and are located in '/whatever/directory/you/put/mapping/files')
    ./mksurfdata.pl -res usrspec -usr_gname ne50np4 -usr_gdate 190409 -y 2010 -d -dinlc /global/project/projectdirs/acme/inputdata -usr_mapdir /whatever/directory/you/put/mapping/files

    (However, ./mksurfdata.pl -h shows that -y defaults to 2010. When running without the "-y" option, standard output says sim_year 2000. I suspect the mksurfdata.pl help information is wrong; to be confirmed.)

  4. Modify the namelist file
    (Should the correct namelist settings be picked up automatically if the default land-build namelist settings are modified accordingly?)

    Time-evolving land use land cover change (LULCC) data should not be used for fixed-time compsets; instead, the LULCC information for that particular year should be used (right?)
    For the F2010 ne120 compset, manually change mksrf_fvegtyp to '/global/project/projectdirs/acme/inputdata/lnd/clm2/rawdata/AA_mksrf_landuse_rc_1850-2015_06062017_LUH2/AA_mksrf_landuse_rc_2010_06062017.nc', as shown below.
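    For example, the edit in namelist form (path copied from above):

    Code Block
    ! point mksrf_fvegtyp at the year-2010 LULCC raw dataset for the F2010 ne120 case
     mksrf_fvegtyp = '/global/project/projectdirs/acme/inputdata/lnd/clm2/rawdata/AA_mksrf_landuse_rc_1850-2015_06062017_LUH2/AA_mksrf_landuse_rc_2010_06062017.nc'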

  5. Create the land surface data via an interactive or batch job

    Code Block
    rm -f surfdata_ne120np4_simyr2010.bash
    # Quote 'EOF' so that $NETCDF_DIR and friends are expanded when the batch
    # job runs (after the modules are loaded), not when this heredoc is written
    cat <<'EOF' > surfdata_ne120np4_simyr2010.bash
    #!/bin/bash
    
    #SBATCH  --job-name=mksurfdata2010
    #SBATCH  --account=acme
    #SBATCH  --nodes=1
    #SBATCH  --output=mksurfdata.o%j
    #SBATCH  --exclusive
    #SBATCH  --time=00:30:00
    #SBATCH  --qos=debug
    
    # Load modules
    module load nco
    module load ncl
    module load cray-netcdf
    module load cray-hdf5
    
    # mksurfdata_map is dynamically linked
    export LIB_NETCDF=$NETCDF_DIR/lib
    export INC_NETCDF=$NETCDF_DIR/include
    export USER_FC=ifort
    export USER_CC=icc
    export USER_LDFLAGS="-L$NETCDF_DIR/lib -lnetcdf -lnetcdff -lnetcdf_intel"
    export USER_LDFLAGS=$USER_LDFLAGS" -L$HDF5_DIR/lib -lhdf5 -lhdf5_fortran -lhdf5_cpp -lhdf5_fortran_intel -lhdf5_hl_intel -lhdf5hl_fortran_intel"
    
    cd /global/homes/t/tang30/ACME_code/MkLandSurf/components/clm/tools/clm4_5/mksurfdata_map
    
    CDATE=c`date +%y%m%d` # current date
    
    ./mksurfdata_map < namelist
    EOF
    
    sbatch surfdata_ne120np4_simyr2010.bash

    The land surface data in NetCDF format will be created in the current directory. (How can we verify that the file is correct?)

...

8. Generate a new land initial condition (finidat)

Three options:

  • cold start:  finidat="", no file necessary.  Lets us get up and running, but not suitable for climate science applications

  • Interpolate a spun-up state from a previous simulation. This is reasonable for many applications, but not suitable for official published E3SM simulations.

  • Spin up a new initial condition following best practices from the land model developers.

From Peter Thornton via email. We may end up changing what we say, but this is a start.

...

  1. Change directory to tool root:
    cd components/cam/tools/mkatmsrffile

  2. Create a .env_mach_specific.sh by running
    ../../../../cime/tools/configure --macros-format=Makefile

  3. Get machine-specific environment settings via
    source .env_mach_specific.sh

  4. Make sure the NETCDF_ROOT and FC environment variables are set correctly for your system, and build the executable:

    1. On Cori:  env NETCDF_ROOT=$NETCDF_DIR FC=ifort make

  5. Edit "nml_atmsrf" to update the input file paths

  6. Run the newly built executable


    Code Block
    ./mkatmsrffile


    This will produce a dry deposition file. The following input files were used for generating a new dry deposition file:


srfFileName: /project/projectdirs/e3sm/mapping/grids/1x1d.nc

atmFileName: /project/projectdirs/e3sm/mapping/grids/ne30np4_pentagons.091226.nc

landFileName: /project/projectdirs/e3sm/inputdata/atm/cam/chem/trop_mozart/dvel/regrid_vegetation.nc

soilwFileName: /project/projectdirs/e3sm/inputdata/atm/cam/chem/trop_mozart/dvel/clim_soilw.nc

srf2atmFmapname: /project/projectdirs/e3sm/bhillma/grids/ne30np4/atmsrffile/map_1x1_to_ne30np4_aave.nc

Note that if using Tempest Remap to provide mapping files, the above mapping file should be replaced with something that looks like map_1x1_to_ne30np4_mono.nc.

Output file produced using the above procedure was compared against an existing file (/project/projectdirs/e3sm/inputdata/atm/cam/chem/trop_mam/atmsrf_ne30np4_110920.nc) using a script from Peter Caldwell. Figures comparing the two files accompanied the original page.

...


10. Create a new compset and/or new supported grid by modifying CIME's xml files

...

Tools we should create tests for:

  1. TempestRemap for generating uniform grids
    (in Paul’s external git repo - may have its own tests?)

  2. SQuadGen for generating RRM grids
    (in Paul’s external repo - may have its own tests?)

  3. No longer needed:   run_dualgrid.sh to obtain scrip and latlon files 
    (in PreAndPostProcessingScripts repo; uses a matlab file).  

  4. makegrid.job for generating dualgrid and latlon files (in components/homme/tests/template, may have its own tests?)

    1. Replaced by "homme_tool", tests added 2020/5

  5. smoothtopo.job for applying dycore-specific smoothing to topography (in components/homme/tests/template, may have its own tests?)

    1. Replaced by "homme_tool", tests added, 2020/5

  6. run ncremap (an NCO command) to generate mapping files

  7. components/cam/tools/topo_tool/cube_to_target

  8. cime/tools/mapping/gen_domain_files

  9. mksurfdata.pl to generate the namelist needed to make fsurdat file

  10. use mksurfdata_map for fsurdat

  11. use the interpic_new tool to regrid atmos state to new grid for initial condition


Stuff on branches that we need to get on master:

  1. branch brhillman/add-grid-scripts for the matlab script used to create the dual grid.

  2. PR #2633 to generate domain files without needing the dual grid?

  3. PR #2706 to add command line interface to topography tool to not have to edit source code by hand and recompile to compute subgrid surface roughness


Stuff requiring ESMF and/or the dual grid:

  1. generating mapping files needs ESMF_RegridWeightGen?

  2. generate topography file

Tools that could use some clean-up:

  1. smoothtopo.job script used to run HOMME to apply dycore-specific smoothing to interpolated topography. It would be nice for this to be able to run via command line arguments rather than having to edit the script (which should make this easier to include in an automated workflow), and we should remove dependence on NCL since this is not guaranteed to be available.

  2. makegrid.job script used to run HOMME+NCL to produce the non-optimized dualgrid and latlon descriptions of the SE grid. Again, it would be nice for this to be able to run via command line arguments rather than having to edit the script (which should make this easier to include in an automated workflow), and we should remove dependence on NCL since this is not guaranteed to be available.

  3. Land surface data scripts (TODO: add specifics about what needs to change here)