...

GLL vertex data: dual grid cell fill (unnatural for GLL data): This is the standard option in NCL and pyNGL for plotting unstructured data. NCL can internally compute a Delaunay triangulation of the data (via http://www.cs.cmu.edu/~quake/triangle.html ), where the data now lives on the vertex of each triangle. NCL then appears to form the Voronoi diagram (the dual grid of the triangulation). On this dual grid, the data is located at the center of each Voronoi cell. Alternatively, the user can read in a dual grid from a SCRIP file, which specifies the polygon containing each GLL node. With “RasterFill”, each of these dual grid cells is colored based on the value of the original data set. The result is an accurate representation of every data point in the original field, but it can look ugly, with occasional chevrons, star-like cells and triangles.

GLL vertex data: triangulation + Gouraud shading (looks good): This approach is not available in NCL, but is available in matplotlib. A Delaunay triangulation is first computed, and each triangle is then shaded using linear interpolation of its three vertex values. This produces nice looking results. It also has the advantage that it is monotone and min/max/oscillation preserving, and all values in the original data set will be plotted. But if you zoom in, you can see the triangles, which are unnatural for GLL data. A Gouraud/bilinear shading option for vertex data on quads would be a nice addition and would show the actual GLL grid when zoomed in. A minimal matplotlib sketch of this approach is given after these options.

PG2 data: triangulation + Voronoi cell fill (unnatural): NCL’s Delaunay approach can also be used for PG2 data: compute a Delaunay triangulation and then the associated Voronoi grid. However, this is a lot of work to construct Voronoi cells when, in the PG2 case, the data is already given as cell averages over quadrilaterals. For PG2, using these quadrilaterals instead of Voronoi cells (next option below) is a more accurate representation of the discrete data.
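
For GLL vertex data, here is a minimal sketch of the triangulation + Gouraud shading approach (illustrative only, not taken from NCL/pyNGL). It reuses the same example data file as the code further down and assumes that file provides 1-D 'lat' and 'lon' coordinates on the ncol dimension.

Code Block
#!/usr/bin/env python3
# Minimal sketch: Delaunay triangulation + Gouraud shading of GLL vertex data.
# Assumes the data file provides 1-D 'lat' and 'lon' variables on ncol.

from matplotlib import pyplot
from matplotlib.tri import Triangulation
import xarray

datafile = '/global/cfs/cdirs/acme/inputdata/atm/cam/inic/homme/cami_mam3_Linoz_ne4np4_L72_c160909.nc'
ds = xarray.open_dataset(datafile)
data = ds['Q'].isel(time=0, lev=-1).squeeze()

# Delaunay triangulation of the GLL nodes; each triangle is then shaded by
# linear (Gouraud) interpolation of its three vertex values
tri = Triangulation(ds['lon'].values, ds['lat'].values)
pl = pyplot.tripcolor(tri, data.values, shading='gouraud')
pyplot.colorbar(pl, orientation='horizontal', label='%s (%s)'%(data.long_name, data.units))
pyplot.savefig('Q_gouraud_example.png', bbox_inches='tight')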

...

The algorithm above is probably using bilinear interpolation. This is acceptable, especially when upsampling. But for the best results:

For GLL and PG2 data, the best algorithm to use (see Recommended Mapping Procedures for E3SM Atmosphere Grids) is TempestRemap’s “intbilin”, as this map is accurate, monotone, and can be used for both downsampling and upsampling. The only drawback is that it is not exactly conservative. If exact conservation is needed (so that the mass computed on the interpolation grid is the same as on the native grid), then TempestRemap’s less accurate “mono” should be used. For PG2 data, to remap from FV (unstructured) to FV (latlon), we don’t yet have an integrated bilinear option. The “mono” map is good for downsampling, but will produce blocky plots when upsampling. For nicer looking plots when upsampling, ESMF’s “bilinear” is currently the only option.
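
Once a map file has been generated, it can be applied with ncremap, or by hand for quick plotting. The sketch below applies a precomputed map file in Python; the file names are hypothetical, but the S/row/col/n_a/n_b variables are standard in TempestRemap- and ESMF-generated map files.

Code Block
#!/usr/bin/env python3
# Minimal sketch: apply a precomputed remap (map) file by hand.
# Map files written by TempestRemap/ESMF store the sparse remap matrix in
# S (weights), row and col (1-based destination/source indices), along with
# the source/destination grid sizes n_a and n_b. File names are hypothetical.

import numpy
import xarray

ds_map = xarray.open_dataset('map_ne30pg2_to_latlon.nc')
S = ds_map['S'].values           # remap weights
row = ds_map['row'].values - 1   # destination cell index (0-based)
col = ds_map['col'].values - 1   # source cell index (0-based)
n_b = ds_map.sizes['n_b']        # number of destination cells

def remap(field_src):
    """Apply the sparse remap matrix to a 1-D field on the source grid."""
    field_dst = numpy.zeros(n_b)
    numpy.add.at(field_dst, row, S * field_src[col])
    return field_dst

# Example: remap the lowest model level of Q, then reshape to the
# destination lat-lon grid (dst_grid_dims is assumed to be [nlon, nlat])
ds_data = xarray.open_dataset('native_data.nc')
q_dst = remap(ds_data['Q'].isel(time=0, lev=-1).values)
nlon, nlat = ds_map['dst_grid_dims'].values
q_dst = q_dst.reshape(nlat, nlon)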

...

pyNGL can construct this dual grid from a triangulation if the user specifies sfXArray and sfYArray. We don't cover that option here; instead we cover the second option, where the user reads a dual grid from the SCRIP file and specifies it via sfXCellBounds and sfYCellBounds. For PG2 data, this is the most natural approach, with the PG2 dual grid made up of the uniform quadrilaterals given in the PG2 SCRIP file.

...

Code Block
#!/usr/bin/env python3

import ngl
import numpy
import xarray

datafile = '/global/cfs/cdirs/acme/inputdata/atm/cam/inic/homme/cami_mam3_Linoz_ne4np4_L72_c160909.nc'
gridfile = '/global/cfs/cdirs/acme/mapping/grids/ne4np4-pentagons_c100308.nc'
varname = 'Q'
  
# Read data
ds_data = xarray.open_dataset(datafile)
ds_grid = xarray.open_dataset(gridfile).rename({'grid_size': 'ncol'})
data = ds_data[varname].isel(time=0, lev=-1).squeeze()
lon_edges = ds_grid['grid_corner_lon']
lat_edges = ds_grid['grid_corner_lat']

# Setup the canvas
plot_format='png'
wks = ngl.open_wks(plot_format, './' + varname + '_markers')

# Make map plot
res = ngl.Resources()
res.cnFillOn = True
res.cnLinesOn = False
res.cnFillPalette = 'MPL_viridis'
res.cnFillMode = 'CellFill'
res.sfXCellBounds = lon_edges.values
res.sfYCellBounds = lat_edges.values
res.mpGeophysicalLineColor='white'
res.mpGridAndLimbOn = False
res.lbOrientation = 'horizontal'
res.lbTitleString = '%s (%s)'%(data.long_name, data.units)
res.nglFrame = False
map_plot = ngl.contour_map(wks, data.values, res)

# Add polymarkers for cell centers
polyres = ngl.Resources()
polyres.gsMarkerColor = 'black'
polyres.gsMarkerIndex = 1
poly_plot = ngl.polymarker(
    wks, map_plot,
    ds_grid['grid_center_lon'].values,
    ds_grid['grid_center_lat'].values,
    polyres
)
ngl.frame(wks)

# Close things
ngl.end()

...

Code Block
#!/usr/bin/env python3

from matplotlib import pyplot
from matplotlib.path import Path
from matplotlib.patches import PathPatch
from matplotlib.collections import PatchCollection, PolyCollection
from cartopy import crs
import numpy
import xarray


def plot_unstructured(xv, yv, data, antialiased=False, **kwargs):
    """
    Plot unstructured data. xv and yv specify the x and y coordinates
    of the vertices of each cell and should have shape [ni,nv] where ni
    is the size of the grid (number of cells) and nv is the number of
    vertices for each cell. Data contains the values of the data you
    want to plot, and should have dimension [ni,]. The intent with this
    function is that you will need to read in an auxiliary SCRIP-format
    file that describes the grid (unless this information is present in
    the same file that contains the data; unlikely) as well as the file
    that contains the data, and grab the cell corner information from
    the SCRIP file and the data from the data file. This function will
    then plot the data on the native grid by looping over each cell and
    drawing patches for each. Note that this will probably be really
    slow for big grids! Seems to work alright up to about ne120 or so,
    but really some more clever techniques should probably be used here
    (parallelism?).
    
    NOTE: To avoid artifacts due to antialiasing, antialiased is set to
    False by default.
    """

    # Create array of cell vertices, indexed [npoints, ncorners, 2]
    corners = numpy.stack([xv, yv], axis=2)

    # Go back and fix corners where they wrap; we shouldn't have to do
    # this with cartopy, but it seems we do...
    for i in range(corners.shape[0]):
        if any(corners[i,:,0] < -90) and any(corners[i,:,0] > 90):
            corners[i,:,0] = numpy.where(corners[i,:,0] < -90, corners[i,:,0] + 360, corners[i,:,0])
        if any(corners[i,:,1] < -45) and any(corners[i,:,1] > 45):
            corners[i,:,1] = numpy.where(corners[i,:,1] < -45, corners[i,:,1] + 90, corners[i,:,1])

    # Create a PatchCollection from our aggregated list of PathPatches
    p = PolyCollection(corners, array=data, antialiaseds=antialiased, **kwargs)

    # Add the collection to the axes
    ax = pyplot.gca()
    ax.add_collection(p)

    # Set sane axes limits
    ax.set_xlim([xv.min(), xv.max()])
    ax.set_ylim([yv.min(), yv.max()])

    # Return collection of patches
    return p


# Function to convert longitude from 0-360 to -180 to 180
def fix_lon(lon):
    return numpy.where(lon > 180, lon - 360, lon)


# Plot example
datafile = '/global/cfs/cdirs/acme/inputdata/atm/cam/inic/homme/cami_mam3_Linoz_ne4np4_L72_c160909.nc'
gridfile = '/global/cfs/cdirs/acme/mapping/grids/ne4np4-pentagons_c100308.nc'
varname = 'Q'

# Read data
ds_data = xarray.open_dataset(datafile)
ds_grid = xarray.open_dataset(gridfile).rename({'grid_size': 'ncol'})
data = ds_data[varname]
lon_edges = ds_grid['grid_corner_lon']
lat_edges = ds_grid['grid_corner_lat']

# Reduce data if we need to
if 'time' in data.dims: data = data.isel(time=0).squeeze()
if 'lev' in data.dims: data = data.isel(lev=-1).squeeze()

# Make plot
figure, ax = pyplot.subplots(
    1, 1,
    #subplot_kw=dict(projection=crs.Orthographic(central_longitude=-100))
    subplot_kw=dict(projection=crs.PlateCarree())
)
pl = plot_unstructured(
    fix_lon(lon_edges.values), lat_edges.values, data.values,
    transform=crs.PlateCarree()
)
ax.set_global()
ax.coastlines(color='white', linewidth=0.5)

# Add colorbar to plot
cb = pyplot.colorbar(
    pl, orientation='horizontal',
    label='%s (%s)'%(data.long_name, data.units), pad=0.05
)

# Add markers to plot for GLL centers
pl = ax.plot(
    fix_lon(ds_grid['grid_center_lon'].values),
    ds_grid['grid_center_lat'].values,
    marker=',', markersize=0.5,
    markerfacecolor='black', markeredgecolor='black',
    linestyle='none',
)

# Save plot
plotfile = '%s_example.png'%varname
figure.savefig(plotfile, bbox_inches='tight')

...

It may be possible to simplify this example, but so far this is the best solution I’ve come up with for doing this particular thing in mpl. Some caveats: this does not seem to work well with different map projections, and I’m not sure why yet. Attempts to first project the vertices to the map coordinate system and then draw the polygons have also been largely unsuccessful, so this is a work in progress. Regardless, both of these approaches have the downside of representing the GLL point data as cell-centered data, which is maybe not completely appropriate for data on the GLL grid (however, it probably is the best representation of data on the alternative pseudo-finite-volume physics grids, e.g., ne4pg2).

...

Notice that the point at the top of South America still shows the local minimum that is evident in the cell-centered plots above, thus this method should still preserve gradients and oscillations in the data.

Note that the triangulation method can produce weird results (holes in the plot area) when the data spans a plot boundary. For example, as shown above, we need to “fix” the longitude coordinates to go from -180 to 180, or we get a big hole in the plot. This also happens with polar plots, whether we fix the longitudes first or not. A quick example to demonstrate:

Code Block
# Plot using triangulation in lat/lon coordinates
data_proj = crs.PlateCarree()  # data is always assumed to be in lat/lon
projections = (crs.PlateCarree(), crs.Orthographic(central_latitude=60))
for plot_proj in projections:
    figure = pyplot.figure()
    ax = figure.add_subplot(111, projection=plot_proj)
    ax.coastlines(linewidth=0.2)
    ax.set_global()
    # Note: with this method we need to fix the longitudes to -180..180 first
    pl = ax.tripcolor(fix_lon(lon), lat, data, transform=data_proj)

results in the following two figures:

...

We can actually fix both of these examples by first projecting the coordinates explicitly, and then doing the triangulation, rather than doing the triangulation in lat/lon coordinates and then projecting:

Code Block
for plot_proj, proj_name in zip(plot_projs, proj_names):
    figure = pyplot.figure()
    ax = figure.add_subplot(111, projection=plot_proj)
    ax.set_global()
    ax.coastlines(linewidth=0.2)

    # Transform coordinates to target projection
    tcoords = plot_proj.transform_points(data_proj,lon.values,lat.values)
    
    # Subset data
    xi = tcoords[:,0] != numpy.inf
    tc = tcoords[xi,:]
    datai = data.values[xi]  # convert to numpy array, then subset
    
    # Plot in transformed coordinates
    pl = ax.tripcolor(tc[:,0],tc[:,1], datai, shading='flat') # 'gouraud')
    
    # Save figure
    figure.savefig(f'{proj_name}_map-triangulation.png', bbox_inches='tight')

...

The main downside of this method (triangulation) is that it becomes painfully slow for large grids. We may have a couple of ways around this. First, we might expect that a significant amount of time is spent calculating the Delaunay triangulation. It turns out this is only partly true, as the timing results below show (using meshes ne4, ne30, ne120, ne240, and ne512).
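
One way to measure this split is to time the triangulation separately from the drawing step. A rough sketch is below (with synthetic stand-in coordinates; not necessarily the exact script used for the timings that follow):

Code Block
#!/usr/bin/env python3
# Minimal sketch: time the Delaunay triangulation separately from the draw.
# Synthetic points stand in for GLL node coordinates; replace lon/lat/data
# with arrays read from a real mesh (ne30, ne120, ...) to reproduce timings.

import time
import numpy
from matplotlib import pyplot
from matplotlib.tri import Triangulation

npts = 48602   # roughly the number of GLL nodes in an ne30np4 mesh
rng = numpy.random.default_rng(0)
lon = rng.uniform(-180.0, 180.0, npts)
lat = rng.uniform(-90.0, 90.0, npts)
data = numpy.sin(numpy.radians(lat))

t0 = time.perf_counter()
tri = Triangulation(lon, lat)        # Delaunay triangulation happens here
t1 = time.perf_counter()
pl = pyplot.tripcolor(tri, data, shading='gouraud')
pyplot.savefig('timing_test.png')    # rendering happens at draw/save time
t2 = time.perf_counter()

print(f'triangulation: {t1 - t0:.2f} s, plot + save: {t2 - t1:.2f} s')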

...