
This page details exceptions to the step-by-step grid setup instructions when using the finite volume (FV) physics grid (a.k.a. "physgrid", pg2, pg3). When generating a new model grid the workflow presented in the step-by-step guide on the parent page is still relevant, but there are several deviations from those steps that are documented here. If the physgrid becomes the default configuration for E3SM then we will likely just merge these notes into the parent page. 

Physgrid Description

The FV physics grid (physgrid) is constructed by creating equal subdivisions of the quadrilateral elements of the cube sphere, typically 2x2 or 3x3. This differs from the "spectral element" method of the GLL grid, in which points within the cube sphere elements represent nodes of continuous polynomial basis functions. When running the model with the physgrid, the dynamics calculations (i.e. advection) are still solved on the GLL grid, but the state is mapped to the physgrid for physics calculations (i.e. clouds and radiation) and the physics tendencies are then mapped back to the GLL grid. The model output is mainly on the physgrid, but certain quantities can also be output on the GLL grid for specialized diagnostics.
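For concreteness, the column counts implied by these subdivisions can be computed directly from the element count: a cube sphere has 6*ne^2 elements, each subdivided into NxN FV cells. A quick sketch (the GLL formula counts shared element edges and corners once):

```shell
ne=30
# pg2: each element is subdivided into 2x2 FV cells
echo "ne30pg2 physics columns: $(( 6 * ne * ne * 2 * 2 ))"     # 21600
# np4 GLL: unique points = 6*ne^2*(np-1)^2 + 2
echo "ne30np4 GLL points:      $(( 6 * ne * ne * 3 * 3 + 2 ))" # 48602
```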

Generating New Grid Files

TempestRemap is still used to generate new grid "exodus" files, but an extra step is needed for the physgrid. For a typical GLL grid the exodus mesh file defines only the vertices of the elements, without any information about the internal node structure defined by the basis functions. For example, if we generate an exodus file for the "ne30" grid we get a cube-sphere grid in which each cube face has been divided into 30x30 elements. To use this file for mapping between a GLL grid and some other grid with TempestRemap, we would need to indicate the number of basis function nodes within each element (typically 4x4, or "np4").

In order to adapt this to the physgrid, we need an additional command to subdivide these elements into the desired number of FV cells. Similarly, when mapping between grids we need to tell TempestRemap that these "elements" are actually finite volume cells via a special flag (see the mapping section below).

So, if we want an ne30pg2 file and we initially created an ne30 exodus file with this command:

${tempest_root}/bin/GenerateCSMesh --alt --res 30 --file ${output_root}/ne30.g

Then we would simply use the following command to evenly subdivide each element into 4 (2x2) cells:

${tempest_root}/bin/GenerateVolumetricMesh --in ${output_root}/ne30.g --out ${output_root}/ne30pg2.g --np 2 --uniform
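Likewise, if an ne30pg3 grid were desired (3x3 cells per element, as mentioned in the description above), the same command would be run with --np 3. A sketch following the same pattern:

```shell
${tempest_root}/bin/GenerateVolumetricMesh --in ${output_root}/ne30.g --out ${output_root}/ne30pg3.g --np 3 --uniform
```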

Mapping Files

The "ncremap -P mwf" procedure encapsulates several commands to generate all the required map files for a run without the physgrid, but since those commands are specific to the spectral element grid, different commands are needed when using the physgrid. A comparable ncremap procedure could be added later by submitting a PR to the NCO repository.

The example below shows the commands needed to generate all mapping files for a tri-grid configuration with the atmosphere on the ne30pg2 grid.

atm_grid_file=ne30pg2.g
ocn_grid_file=ocean.oEC60to30v3.scrip.181106.nc
lnd_grid_file=SCRIPgrid_0.5x0.5_nomask_c110308.nc
atm_name=ne30pg2
ocn_name=oEC60to30v3
lnd_name=r05
alg_name=mono

map_opts='--in_type fv --in_np 1 --out_type fv --out_np 1 --out_format Classic'
date=200110
ncremap -a tempest --src_grd=$ocn_grid_file --dst_grd=$atm_grid_file -m map_${ocn_name}_to_${atm_name}_${alg_name}.${date}.nc -W "$map_opts"
ncremap -a tempest --src_grd=$atm_grid_file --dst_grd=$ocn_grid_file -m map_${atm_name}_to_${ocn_name}_${alg_name}.${date}.nc -W "$map_opts" --a2o
ncremap -a tempest --src_grd=$lnd_grid_file --dst_grd=$atm_grid_file -m map_${lnd_name}_to_${atm_name}_${alg_name}.${date}.nc -W "$map_opts"
ncremap -a tempest --src_grd=$atm_grid_file --dst_grd=$lnd_grid_file -m map_${atm_name}_to_${lnd_name}_${alg_name}.${date}.nc -W "$map_opts"
ncremap -a tempest --src_grd=$lnd_grid_file --dst_grd=$ocn_grid_file -m map_${lnd_name}_to_${ocn_name}_${alg_name}.${date}.nc -W "$map_opts" --a2o
ncremap -a tempest --src_grd=$ocn_grid_file --dst_grd=$lnd_grid_file -m map_${ocn_name}_to_${lnd_name}_${alg_name}.${date}.nc -W "$map_opts"


The paths to these files need to be added to the appropriate section of cime/config/e3sm/config_grids.xml.

Domain Files

The procedure for generating domain files is unchanged, except that the physgrid grid and mapping files described above must be used. The physics grid is what the atmosphere uses when communicating with the coupler, which is primarily where the domain files are needed.
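As in the standard workflow, the domain files are generated with CIME's gen_domain tool, now pointed at the physgrid map file. A sketch (the map file name follows the mapping example above; the build path assumes gen_domain has been compiled per its README):

```shell
# Generate atm/ocn domain files from the ocn->atm (physgrid) conservative map
${e3sm_root}/cime/tools/mapping/gen_domain_files/gen_domain \
  -m map_oEC60to30v3_to_ne30pg2_mono.200110.nc \
  -o oEC60to30v3 \
  -l ne30pg2
```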

Generating a Scrip File

For mapping GLL grids without TempestRemap (or plotting GLL grid data) we need to "re-interpret" the representation of data on the GLL grid as a finite volume grid. Step #2 in the step-by-step guide discusses how to do this with the "dual grid" approach. It's worth noting that this is not a visually accurate representation of the data, but the areas of the FV cells produced by the dual grid are consistent with the GLL weights, so spatial sums and averages can be computed in an intuitive way. Another caveat of the dual grid method is that generating the scrip grid description file requires an iterative process that can take a very long time for large grids.

An advantage of the physgrid is that we don't have to worry about any of this, because the data are naturally represented by finite volumes. A simple TempestRemap command quickly converts from exodus to scrip format:

${tempest_root}/bin/ConvertExodusToSCRIP --in ne30pg2.g --out ne30pg2_scrip.nc

The resulting scrip file can be used for mapping with ESMF or plotting physgrid data on the native grid. 
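For example, the scrip file could be fed to ESMF_RegridWeightGen to build a conservative map to a lat-lon grid. A sketch (the destination grid file and output map names here are hypothetical):

```shell
# Conservative ne30pg2 -> 1-degree lat-lon map via ESMF
ESMF_RegridWeightGen \
  --source ne30pg2_scrip.nc \
  --destination latlon_180x360_scrip.nc \
  --weight map_ne30pg2_to_latlon_180x360_aave.nc \
  --method conserve
```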

Topography

The dycore requires smoothed topography on the GLL grid, and the physics parameterization for turbulent mountain stress operates on the physics (FV) grid. PRs #3267 and #3406 introduced a new file format for topography and related tools that provide consistent topography data on each grid. There are two new tools to support this new treatment of topography.

The tools address the following requirements.

  1. The dycore needs geopotential phi_s at GLL points, and the physics needs phi_s at FV cell centers.
  2. Physics parameterizations need SGH, SGH30, LANDFRAC, and LANDM_COSLAT computed on the FV grid.
  3. We require that the GLL phi_s data, when mapped to the FV grid, equal the FV phi_s data.
  4. We want to run HOMME's smoother on GLL phi_s.
  5. We do not want to use the GLL dual grid, which requires using an unscalable tool; we want to use only the FV dual grid, which can be created straightforwardly and quickly using, e.g., TempestRemap.

To meet these requirements, we introduce a new topography file format that has all data at FV points and, in addition, phi_s at the GLL points.
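Based on the requirements above, the file holds the FV-grid fields plus the GLL phi_s. Its contents can be inspected with ncks; a sketch of the expected variables (dimension names follow the ncol/ncol_d convention used by the tools below, and the file name matches the final output of the second tool chain):

```shell
ncks -m USGS-gtopo30_ne30np4pg2_16xdel2.nc
# Expected, roughly:
#   PHIS(ncol), SGH(ncol), SGH30(ncol), LANDFRAC(ncol), LANDM_COSLAT(ncol)  <- FV (pg2) grid
#   PHIS_d(ncol_d)                                                          <- GLL grid
```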

The first tool is a simple converter, provided for convenience. The input is an old GLL topography file; the output is a topography file in the new GLL-physgrid format. SGH, SGH30, LANDFRAC, and LANDM_COSLAT are not quite as good as with the second tool, but this converter works in one step:

$ cat input.nl
&ctl_nl
ne = 30
/
&vert_nl
/
&analysis_nl
tool = 'topo_convert'
infilenames = 'USGS-gtopo30_ne30np4_16xdel2-PFC-consistentSGH.nc', 'USGS-gtopo30_ne30np4pg2_16xdel2-PFC-consistentSGH_converted'
/

mpirun -np 8 homme_tool < input.nl

homme_tool is an executable in ${homme_build}/src/tool/homme_tool produced by a standard configuration and build of standalone HOMME.

The second tool chain provides the best-quality GLL-physgrid file. It has five steps, illustrated for the case of ne30pg2:

  1. TempestRemap: Create GLL, pg2, and pg4 grids. pg2 is our target. pg4 is a high-resolution intermediate grid.

    # Generate the element mesh.
    ${tempest_root}/bin/GenerateCSMesh --alt --res 30 --file topo2/ne30.g
    # Generate the target physgrid mesh.
    ${tempest_root}/bin/GenerateVolumetricMesh --in topo2/ne30.g --out topo2/ne30pg2.g --np 2 --uniform
    # Generate a high-res target physgrid mesh for cube_to_target.
    ${tempest_root}/bin/GenerateVolumetricMesh --in topo2/ne30.g --out topo2/ne30pg4.g --np 4 --uniform
    # Generate SCRIP files for cube_to_target.
    ${tempest_root}/bin/ConvertExodusToSCRIP --in topo2/ne30pg4.g --out topo2/ne30pg4_scrip.nc
    ${tempest_root}/bin/ConvertExodusToSCRIP --in topo2/ne30pg2.g --out topo2/ne30pg2_scrip.nc
  2. cube_to_target, run 1: Compute phi_s on the pg4 grid.

    ${e3sm_root}/components/cam/tools/topo_tool/cube_to_target \
      --target-grid topo2/ne30pg4_scrip.nc \
      --input-topography topo2/USGS-topo-cube3000.nc \
      --output-topography topo2/ne30pg4_c2t_topo.nc

    These warnings appear to be innocuous:

    sum of weights is negative - negative area? -1.8651290104823655E-009 675 3000
  3. homme_tool: Map phi_s on the pg4 grid to phi_s on the GLL grid. Smooth these GLL phi_s data. Finally, map these GLL phi_s data to pg2.

    $ cat input.nl
    &ctl_nl
    ne = 30
    smooth_phis_numcycle = 16
    smooth_phis_nudt = 28e7
    hypervis_scaling = 0 
    hypervis_order = 2
    se_ftype = 2 ! actually output NPHYS; overloaded use of ftype
    /
    &vert_nl
    /
    &analysis_nl
    tool = 'topo_pgn_to_smoothed'
    infilenames = 'ne30pg4_c2t_topo.nc', 'ne30np4pg2_smoothed_phis'
    /
    
    mpirun -np 8 ./homme_tool < input.nl
  4. cube_to_target, run 2: Compute SGH, SGH30, LANDFRAC, and LANDM_COSLAT on the pg2 grid, using the pg2 phi_s data.

    ${e3sm_root}/components/cam/tools/topo_tool/cube_to_target \
      --target-grid ne30pg2_scrip.nc \
      --input-topography USGS-topo-cube3000.nc \
      --smoothed-topography ne30np4pg2_smoothed_phis1.nc \
      --output-topography USGS-gtopo30_ne30np4pg2_16xdel2.nc
  5. ncks: Append the GLL phi_s data to the output of step 4.

    ncks -A ne30np4pg2_smoothed_phis1.nc USGS-gtopo30_ne30np4pg2_16xdel2.nc

USGS-gtopo30_ne30np4pg2_16xdel2.nc is the final GLL-physgrid topography file.

The second tool chain can be modified if you have a GLL PHIS field that you want to preserve exactly:

  1. Skip steps 2 and 3 above. Instead, use the first tool, the converter, to make USGS-gtopo30_ne30np4pg2_16xdel2-PFC-consistentSGH_converted1 instead of ne30np4pg2_smoothed_phis1. This output nc file has the original GLL PHIS, now called PHIS_d, and the mapped pg2 PHIS data, as well as other pg2 fields we won't use.
  2. Run step 4 above with USGS-gtopo30_ne30np4pg2_16xdel2-PFC-consistentSGH_converted1 instead of ne30np4pg2_smoothed_phis1.
  3. Run two ncks lines:

    # Extract PHIS_d and ncol_d.
    ncks -v PHIS_d USGS-gtopo30_ne30np4pg2_16xdel2-PFC-consistentSGH_converted1.nc tmp.nc
    # Append PHIS_d and ncol_d to the topo file to create the final result.
    ncks -A tmp.nc USGS-gtopo30_ne30np4pg2_16xdel2.nc

Regional Refinement

Regionally refined grids require no special steps when using the physgrid because the regional refinement happens at the element level, and the physgrid only changes the physics column arrangement within the element. 

Steps to Interpolate Between Grids

There's nothing special about these steps; I just thought I'd post a basic example in case someone needs to convert data between grids for comparison purposes.

In this particular example I'm interpolating from ne30np4 to ne30pg3.

${tempest_root}/bin/GenerateOverlapMesh --a ${grid_data_root}/ne30.g --b ${grid_data_root}/ne30pg3.g --out ${grid_data_root}/tmp_overlap_mesh.nc

${tempest_root}/bin/GenerateOfflineMap --in_mesh ${grid_data_root}/ne30.g --out_mesh ${grid_data_root}/ne30pg3.g --ov_mesh ${grid_data_root}/tmp_overlap_mesh.nc --in_type cgll --in_np 4 --out_type fv --out_np 1 --out_double --mono --out_map ${grid_data_root}/tmp_mapping_weights.nc

ncremap -4 -m ${grid_data_root}/tmp_mapping_weights.nc ${input_file_name} ${output_file_name}
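Once generated, the weight file can be reused for any history file on the source grid; for instance, to regrid a whole set of output files (a sketch; the directory variables and file pattern are hypothetical):

```shell
# Apply the same mapping weights to every matching history file
for f in ${case_output_root}/*.cam.h0.*.nc; do
  ncremap -4 -m ${grid_data_root}/tmp_mapping_weights.nc ${f} ${regridded_root}/$(basename ${f})
done
```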

