Overview:

Based on extensive evaluation of AMWG, UV-CDAT, and NCO codes for generating climatology files (see here), we have determined that NCO provides the most correct answers, has the best metadata, and is fastest. Until UV-CDAT bests NCO in these measures we advocate using NCO for creating climatologies.

The NCO command ncclimo (formerly climo_nco.sh) will generate and regrid all climatology files (NB: from model or measurement monthly input files). The primary documentation is here. This presentation, given at the Albuquerque workshop on 20151104, conveys much of the information presented below, and some newer information, in a more graphical format. Currently (20161202) the major difference between the two commands is that ncclimo fully implements the CF climatology-bounds and climatological statistics (for cell_methods) conventions and extended climatologies, whereas climo_nco.sh does not.

In climatology generation mode, the NCO operator ncclimo ingests "raw" data consisting of a monthly or annual timeseries of files and from these produces climatological monthly means, seasonal means, and/or annual means. Alternatively, in timeseries reshaping mode, ncclimo will subset and temporally split the input raw data timeseries into per-variable files spanning the entire period. ncclimo can optionally regrid all output files in either mode.

Prerequisites:

Use ncclimo if possible. It requires, and comes with, NCO version 4.6.0 or later. Its predecessor climo_nco.sh (now deprecated) requires NCO version 4.5.2 or later. The newest versions of NCO are installed on rhea/titan.ccs.ornl.gov at ORNL, pileus.ornl.gov (CADES at ORNL), cooley/mira.alcf.anl.gov at ANL, cori/edison.nersc.gov (NERSC), aims4.llnl.gov (LLNL), roger.ncsa.illinois.edu (NCSA), and yellowstone.ucar.edu (NCAR). The ncclimo and ncremap scripts are hard-coded to find the latest versions automatically, and do not require any module or path changes. Using other NCO executables (besides the ncclimo and ncremap scripts) from the command line or from your own scripts may require loading modules. This is site-specific and not under my (CZ's) control. At OLCF, for example, "module load gcc" helps to run NCO from the command-line or scripts. For other machines, check that the default NCO is recent enough (try "module load nco", then "ncks --version") or use developers' executables/libraries (in ~zender/[bin,lib] on all machines). Follow these directions on the NCO homepage to install on your own machines/directories. It can be as easy as "apt-get install nco", "dnf install nco", or "conda install -c conda-forge nco", or you can build/install from scratch with "configure;make install". 
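
The checks and installs mentioned above can be strung together as follows (a minimal sketch using only the commands named in this paragraph; whether "module load nco" is needed is site-specific):

# Verify that the NCO on your path is new enough for ncclimo (>= 4.6.0)
module load nco          # only on machines that provide NCO as a module
ncks --version           # prints the NCO version string
# If NCO is absent or too old, a personal install can be as simple as:
conda install -c conda-forge nco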

Using ncclimo:

...

Climatology generation mode (produce monthly + seasonal + annual climos from monthly input files):

The usual way to use ncclimo is to bring up a terminal window and type:

...

When invoked without options ncclimo outputs a handy table of all available options, their long-option synonyms, and some examples. NCO documentation here describes the full meaning of all options. A short summary of the most common options is:

-a: type of DJF average. Either -a scd (default) or -a sdd. scd means seasonally continuous December: the first month used will be Dec of the year before the start year you specify with -s. sdd means seasonally discontinuous December: the first month used will be Jan of the specified start year.

...

-v: variable list, e.g., FSNT,AODVIS,PREC.? (yes, regular expressions work so this expands to PRECC,PRECL,PRECSC,PRECSL)
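
Putting a few of these options together, a representative climatology-generation invocation might look like the following (a hedged sketch; the caseid, year range, and directories are placeholders chosen to illustrate the options summarized above):

# 5-year (years 1-5) monthly, seasonal, and annual climos for selected variables
ncclimo -c caseid -s 1 -e 5 -a scd -v FSNT,AODVIS,PREC.? -i $drc_in -o $drc_out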

MPAS O/I considerations:

MPAS ocean and ice models currently have their own (non-CESM'ish) naming convention that guarantees output files have the same names for all simulations. By default ncclimo analyzes the "timeSeriesStatsMonthly" analysis member output (tell CZ if you want options for other AM output). ncclimo recognizes input files as being MPAS-style when invoked with "-m mpaso" or "-m mpascice" like this:

...

If/when MPAS O/I generates the _FillValue attributes itself, this step can and should be skipped. All other ncclimo features, like regridding (below), are invoked identically for MPAS as for CAM/CLM users, although under the hood ncclimo does some special pre-processing (dimension permutation, metadata annotation) for MPAS. A five-year oEC60to30 MPAS-O climo with regridding to T62 takes < 10 minutes on rhea.

Annual climos (produce annual climos from annual input files):

Not all model or observed history files are created as monthly means. To create a climatological annual mean from a series of annual mean inputs, select ncclimo's annual climatology mode with the "-C ann" option:

...

The options "-m mdl_nm" and "-h hst_nm" (that default to "cam" and "h0", respectively) tell ncclimo how to construct the input filenames. The above formula names the files caseid.cism.h.1851-01-01-00000.nc, caseid.cism.h.1852-01-01-00000.nc, and so on. Annual climatology mode produces a single output file (or two if regridding is selected), and in all other respects behaves the same as monthly climatology mode.

Regridding (climos and other files):

Regridding is a standalone operation carried out by ncremap. See the full ncremap documentation for examples of standalone operation (including MPAS!). For convenience, ncclimo will (optionally) call ncremap to regrid during climatology generation to produce climatology files on both the native and desired analysis grids. Only the ncremap features most relevant to ncclimo are described here. Regridding while producing climos is virtually free, because it is performed on idle nodes/cores after the monthly climatologies have been computed and while the seasonal climatologies are being computed. This load-balancing can save half an hour on ne120 datasets. To regrid, simply pass the desired mapfile name with "-r map.nc", e.g., "-r ${DATA}/maps/map_ne120np4_to_fv257x512_aave.20150901.nc". Although this should not be necessary for normal use, you may pass any options specific to regridding with "-R opt1 opt2". 

...

The above commands perform a climatology without regridding, then with regridding (all climos stored in ${drc_out}), then with regridding and storing regridded files separately. Paths specified by $drc_in, $drc_out, and $drc_rgr may be relative or absolute. An alternative to regridding during climatology generation is to regrid afterwards with ncremap, which has more specialized regridding features built in. To use ncremap to regrid a climatology in $drc_out and place the results in $drc_rgr, use something like one of the following commands:
ncremap -I drc_out -m map.nc -O drc_rgr
ls drc_out/*climo* | ncremap -m map.nc -O drc_rgr

Timeseries Reshaping mode, aka Splitting:

As of version 4.6.4, ncclimo automatically switches to timeseries reshaping mode if it receives a list of files through a pipe to stdin or, alternatively, as positional arguments (after the last command-line option). If neither is done and no caseid is specified, it assumes that all *.nc files in drc_in constitute the input file list. These examples invoke reshaping mode in the three possible ways:

# Pipe list to stdin
cd $drc_in;ls *mdl*000[1-9]*.nc | ncclimo -v T,Q,RH -s 1 -e 9 -o $drc_out
# List as positional arguments
ncclimo -v T,Q,RH -s 1 -e 9 -o $drc_out $drc_in/*mdl*000[1-9]*.nc
# Read directory
ncclimo -v T,Q,RH -s 1 -e 9 -i $drc_in -o $drc_out
Assuming each input file is a monthly average comprising the variables T, Q, and RH, the output will be the files T_000101_000912.nc, Q_000101_000912.nc, and RH_000101_000912.nc. ncclimo reshapes the input so that the outputs are continuous timeseries of each variable taken from all input files. When necessary, the output is split into segments each containing no more than ypf_max (default 50) years of input, i.e., T_000101_005012.nc, T_005101_009912.nc, T_010001_014912.nc, etc.

Coupled Runs:

ncclimo works on all ACME models. It can simultaneously generate climatologies for a coupled run, where "climatologies" means both native and regridded monthly, seasonal, and annual averages, as per the AG specification. Here are template commands for a recent simulation:

...

The atmosphere and ocean model output is significantly larger than the land and ice model output. These commands account for that by using different parallelization strategies, which may (rhea standard queue) or may not (cooley or rhea bigmem queue) be required depending on the memory capacity of the analysis nodes, as explained below. As of late 2016 (and NCO 4.6.3-alpha03), the MPAS models do not utilize the $caseid option; they use their own, evolving MPAS naming convention. When fed the '-m' options shown above, ncclimo processes the MPAS "hist.am.timeSeriesStatsMonthly" analysis members.

Extended climos:

ncclimo can re-use previous work and produce extended (i.e., longer duration) climatologies by combining two previously computed climatologies (the binary method, introduced in NCO 4.6.3-alpha02) or by computing a new climatology from raw monthly model output and then combining that with a previously computed climatology (the incremental method, introduced in NCO 4.6.2). Producing an extended climatology by the incremental method requires specifying (with -S and -s, respectively) the start years of the previously computed and current climo, and (with -e) the end year of the current climo. Producing an extended climatology by the binary method requires specifying both the start years (with -S and -s) and end years (with -E and -e) of the two pre-computed climatologies. The presence of the -E option tells ncclimo to employ the binary (not incremental) method.
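
The option semantics can be summarized with two schematic invocations (hedged sketches only; the caseid, year ranges, and directory options are placeholders, not verified invocations for a particular workflow):

# Incremental method: new climo for years 6-10 from raw monthly output,
# combined with a previously computed climo that starts in year 1
ncclimo -c caseid -S 1 -s 6 -e 10 -i $drc_in -o $drc_out
# Binary method: combine two previously computed climos (years 1-5 and 6-10)
ncclimo -c caseid -S 1 -E 5 -s 6 -e 10 -i $drc_in -o $drc_out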

...

The extended native and regridded climatologies are produced with virtually the same command (only the input and output directories differ). No mapping file or regridding option is necessary to produce an extended climatology from two input regridded climatologies. ncclimo need not know or care whether the input climos are native-grid or already regridded. So long as the regridded climatologies are already available, it makes sense to use them rather than to perform a second regridding. While ncclimo can generate and regrid an extended climatology from native-grid inputs in one command, doing so involves more command-line options and it is generally simpler to follow the above procedure. Ask me if you would like help customizing ncclimo for other such workflows. Producing extended climatologies via the binary method consumes much less memory than producing normal or incremental climatologies. The binary method simply computes weighted averages of each input variable, hence the maximum RAM required is only about three times the size of the largest input variable. This is trivial compared to the total input file size, so extended climos may be computed with background parallelism, the default in ncclimo. The '-p mpi' option is never necessary for producing extended climos with the binary method. As you might imagine, the combination of low memory overhead and re-use of previously regridded climos means that producing extended regridded climos via the binary method is extremely fast compared to computing normal climos. Binary climos (regridded or not) require only about one minute on Edison.

Memory Considerations:

It is important to employ the optimal ncclimo parallelization strategy for your computer hardware resources. Select from the three available choices with the '-p par_typ' switch. The options are serial mode ('-p nil' or '-p serial'), background mode parallelism ('-p bck'), and MPI parallelism ('-p mpi'). The default is background mode parallelism, which is appropriate for lower resolution (e.g., ne30L30) simulations on most nodes at high-performance computer centers. Use (or at least start with) serial mode on personal laptops/workstations. Serial mode requires twelve times less RAM than the parallel modes, and is much less likely to deadlock or cause OOM (out-of-memory) conditions on your personal computer. If the available RAM (+swap) is < 12*4*sizeof(monthly input file), then try serial mode first (12 is the optimal number of parallel processes for monthly climos, and the computational overhead is a factor of four). CAM-SE ne30L30 output is ~1 GB/month, so each month requires about 4 GB of RAM. CAM-SE ne30L72 output (with LINOZ) is ~10 GB/month, so each month requires ~40 GB RAM. CAM-SE ne120 output is ~12 GB/month, so each month requires ~48 GB RAM. The computer does not actually use all this memory at one time, and many kernels compress RAM usage to below what top reports, so the actual physical usage is hard to pin down, but may be a factor of 2.5-3.0 (rather than a factor of four) times the size of the input file. For instance, my 16 GB MacBook Pro will successfully run an ne30L30 climatology (that requests 48 GB RAM) in background mode, but the laptop will be slow and unresponsive for other uses until it finishes the climos (in 6-8 minutes). Experiment a bit and choose the parallelization option that works best for you. 
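
The rule of thumb above can be turned into a quick back-of-the-envelope check before launching a job (an illustrative sketch only; the per-month size is the approximate ne30L30 figure quoted above):

# Estimate RAM needed for background mode: 12 simultaneous monthly climos,
# each needing ~4x the size of one monthly input file
mnth_sz_GB=1                                           # ~1 GB/month for ne30L30
echo "Background mode needs ~$((12 * 4 * mnth_sz_GB)) GB RAM"   # prints 48 GB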

Serial mode, as its name implies, uses one core at a time for climos, and proceeds sequentially from months to seasons to annual climatologies. In serial mode the climos are computed serially, but regridding will employ OMP threading on platforms that support it, using up to 16 cores. By design each month and each season is independent of the others, so all months can be computed in parallel, then each season can be computed in parallel (using the monthly climatologies), then the annual average can be computed. Background parallelization mode exploits this parallelism and executes the climos in parallel as background processes on a single node, so that twelve cores are simultaneously employed for monthly climatologies, four for seasonal, and one for annual. The optional regridding will employ up to two cores per process. MPI parallelism executes the climatologies on different nodes so that up to (optimally) twelve nodes are employed performing monthly climos. The full memory of each node is available for each individual climo. The optional regridding will employ up to eight cores per node. MPI mode, or background mode on a big-memory queue, must be used to process ne30L72 and ne120L30 climos on some, but not all, DOE computers. For example, attempting an ne120L30 climo in background mode on rhea (i.e., on one 128 GB compute node) will fail due to OOM. (OOM errors do not produce useful return codes, so if your climo processes die without printing useful information, the cause may be OOM.) However, the same climo will succeed if executed on a single big-memory (1 TB) node on rhea (use -lpartition=gpu, as shown below), or MPI mode can be used for any climatology. The same ne120L30 climo will also finish blazingly fast in background mode on cooley (i.e., on one 384 GB compute node), so MPI mode is unnecessary on cooley. In general, the fatter the memory, the better the performance. 
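
For concreteness, the same climatology can be requested under each strategy simply by changing the '-p' option (a sketch; the caseid, years, and directories are placeholders):

ncclimo -p serial -c caseid -s 1 -e 5 -i $drc_in -o $drc_out   # one core at a time, lowest RAM
ncclimo -p bck    -c caseid -s 1 -e 5 -i $drc_in -o $drc_out   # default: 12 background processes on one node
ncclimo -p mpi    -c caseid -s 1 -e 5 -i $drc_in -o $drc_out   # up to 12 nodes, one month per node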

For a Single, Dedicated Node on LCFs:

The basic approach above (running the script from a standard terminal window) works well for small cases but can be unpleasantly slow on login nodes of LCFs and for longer or higher-resolution (e.g., ne120) climatologies. As a baseline, generating a climatology of 5 years of ne30 (~1x1 degree) CAM-SE output with ncclimo takes 1-2 minutes on rhea (at a time with little contention), and 6-8 minutes on a 2014 MacBook Pro. To make things a bit faster at LCFs, you can ask for your own dedicated node (note that this approach only makes sense on supercomputers that have a job-control queue). On rhea do this via:

...

-A: the name of the account to charge for time used. This page may be useful for figuring that out if the above defaults don't work: Computational Resources

For a 12 node, MPI Job:

The above parallel approaches will fail when a single node lacks enough RAM (plus swap) to store all twelve monthly input files, plus extra RAM for computations. One should employ MPI multinode parallelism (-p mpi) on nodes with less RAM than 12*3*sizeof(monthly input). The longest an ne120 climo will take is less than half an hour (~25 minutes on Edison or Rhea), so the simplest way to run MPI jobs is to request 12 interactive nodes using the above commands (though remember to add -p mpi), then execute the script at the command line. It is also possible, and sometimes preferable, to request non-interactive compute nodes in a batch queue. Executing an MPI-mode climo (on machines with job scheduling and, optimally, 12 available nodes) in a batch queue can be done with two commands. First, write an executable file which calls the ncclimo script with appropriate arguments. We do this below by echoing to a file ~/ncclimo.pbs, but you could also open an editor, copy the stuff in quotes below into a file, and save it:

...

The basic idea of this script is very simple. For monthly climatologies (e.g. JAN), ncclimo passes the list of all relevant January monthly files to NCO's ncra command, which averages each variable in these monthly files over their time dimension (if it exists) or copies the value from the first month unchanged (if no time axis exists). Seasonal climos are then created by taking the average of the monthly climo files using ncra. In order to account for differing numbers of days per month, the "-w" flag in ncra is used, followed by the number of days in the relevant months. For example, the MAM climo is computed from: "ncra -w 31,30,31 MAR_climo.nc APR_climo.nc MAY_climo.nc MAM_climo.nc" (details about file names and other optimization flags have been stripped here to make the concept easier to follow). The ANN climo is then computed by doing a weighted average of the seasonal climos.
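
By the same logic, the final annual average would look roughly like the following (a sketch only, assuming a 365-day no-leap calendar so that the seasonal weights are the day counts 92 for MAM, 92 for JJA, 91 for SON, and 90 for DJF; file names are simplified as in the MAM example above):

ncra -w 92,92,91,90 MAM_climo.nc JJA_climo.nc SON_climo.nc DJF_climo.nc ANN_climo.nc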

Assumptions, Approximations, and Algorithms (AAA) Employed:

A climatology embodies many algorithmic choices, and regridding from the native to the analysis grid involves still more choices. A separate method should reproduce the ncclimo and NCO answers to round-off precision if it implements the same algorithmic choices. For example, ncclimo agrees to round-off with AMWG diagnostics when making the same (sometimes questionable) choices. The most important choices have to do with the conversion of single- to double-precision (SP and DP, respectively), the treatment of missing values, and the generation/application of regridding weights. For concreteness and clarity, we describe the algorithmic choices made in processing CAM-SE monthly output into a climatological annual mean (ANN) and then regridding that. Other climatologies (e.g., daily-to-monthly, or annual-to-climatological) involve similar choices.

...