E3SM-Unified 1.9.3 release notes
Dear E3SM Team,
A new bug-fix release of E3SM-Unified is available: 1.9.3. E3SM-Unified is a combined conda and Spack environment that provides a broad collection of analysis and pre- and post-processing software for E3SM users. The new version includes updates to the E3SM-Diags, e3sm_to_cmip, ILAMB, MPAS-Analysis, xcdat, and zstash packages.
Activation
As in previous versions, you can activate the environment by sourcing the script for your machine:
Acme1:
source /p/user_pub/e3sm_unified/envs/load_latest_e3sm_unified_acme1.sh
Andes:
source /ccs/proj/cli115/software/e3sm-unified/load_latest_e3sm_unified_andes.sh
Anvil:
source /lcrc/soft/climate/e3sm-unified/load_latest_e3sm_unified_anvil.sh
Chicoma:
source /usr/projects/e3sm/e3sm-unified/load_latest_e3sm_unified_chicoma-cpu.sh
Chrysalis:
source /lcrc/soft/climate/e3sm-unified/load_latest_e3sm_unified_chrysalis.sh
Compy:
source /share/apps/E3SM/conda_envs/load_latest_e3sm_unified_compy.sh
Frontier:
source /ccs/proj/cli115/software/e3sm-unified/load_latest_e3sm_unified_frontier.sh
Perlmutter (CPU nodes):
source /global/common/software/e3sm/anaconda_envs/load_latest_e3sm_unified_pm-cpu.sh
ALCF Polaris:
source /lus/grand/projects/E3SMinput/soft/e3sm-unified/load_latest_e3sm_unified_polaris.sh
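If you work on several of these machines, a small wrapper can select the right activation script by machine name. This is a hypothetical convenience sketch only (the function name is invented; the paths are copied verbatim from the list above, and only a few machines are shown):

```shell
# Hypothetical helper: map a machine name to its activation script path.
# Paths are taken from the list above; extend the case statement for the
# other machines as needed.
e3smu_script() {
  case "$1" in
    anvil)     echo /lcrc/soft/climate/e3sm-unified/load_latest_e3sm_unified_anvil.sh ;;
    chrysalis) echo /lcrc/soft/climate/e3sm-unified/load_latest_e3sm_unified_chrysalis.sh ;;
    compy)     echo /share/apps/E3SM/conda_envs/load_latest_e3sm_unified_compy.sh ;;
    pm-cpu)    echo /global/common/software/e3sm/anaconda_envs/load_latest_e3sm_unified_pm-cpu.sh ;;
    *)         echo "unknown machine: $1" >&2; return 1 ;;
  esac
}

# Usage (on Chrysalis):
# source "$(e3smu_script chrysalis)"
```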
Details
A few critical bugs were discovered after the release of 1.9.2 in January, and they are fixed here. The new version has been deployed on most supported machines: Andes, Anvil, Chicoma, Chrysalis, Compy, Frontier, Perlmutter and ALCF Polaris (not to be confused with the E3SM Polaris software). It will be deployed on Acme1 later this week.
Note: We encourage users at OLCF to use Andes, rather than Frontier, for processing and analysis.
On six machines (Anvil, Chicoma, Chrysalis, Compy, Frontier and Perlmutter), six packages of interest -- ESMF, ILAMB, MOAB, NCO, TempestExtremes and TempestRemap -- have been built with Spack using system compilers and MPI libraries. When you load E3SM-Unified on a compute node, you will have access to these versions, which can run in parallel and typically run more efficiently than their conda-packaged counterparts.
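On a compute node, the Spack-built tools are launched through the system MPI launcher. Below is a minimal sketch assuming a Slurm machine; the task count, grid files, and map name are made-up examples, and the ESMF_RegridWeightGen options shown are its standard --source/--destination/--weight flags:

```shell
# Sketch: assemble an MPI launch command for a Spack-provided tool.
# Assumes Slurm (srun); adjust the launcher for your scheduler.
launcher="srun -n 36"                       # hypothetical task count
tool="ESMF_RegridWeightGen"                 # one of the Spack-built packages
args="--source src_grid.nc --destination dst_grid.nc --weight map.nc"
cmd="$launcher $tool $args"
echo "$cmd"
```

Running the same binaries on a login node, or via the conda packages, will generally work but without the parallel speedup.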
Bug fixes and improvements in 1.9.3
ChemDyg 0.1.5:
Save plotting data in NetCDF format
Support plotting results with 80 model layers
Provide ChemDyg input data sources on Zenodo: https://doi.org/10.5281/zenodo.8274422
E3SM-Diags 2.11.0:
Initial support for the EAMxx variable naming convention
Support for comparing monthly mean climatologies (in addition to seasonal and annual means)
e3sm_to_cmip 1.11.2:
Set parallel=False in open_mfdataset() to avoid the "NetCDF: Not a Valid ID" error
Revised e3sm_to_cmip --info handling for clearer outputs based on frequency and CMIP table
MPAS-Analysis 1.10.0:
This release adds melt rate analysis from Paolo et al. (2023) and a new ocean conservation task, fixes an out-of-memory bug in very long (>1000-year) sea-ice time series, and includes other bug fixes and cleanup.
NCO 5.2.2:
The timeseries splitter in ncclimo can now automatically generate monthly filenames, and so can be invoked analogously to climo-generation with specified start/end dates, rather than via stdin
ncks can now generate gridded datasets from ELM initial condition and restart datasets
ncks --s1d rolls up 1D input landunit arrays into gridded output arrays with, e.g., PFT or MEC dimensions
ncclimo --glb_avg fixes omission of time_bounds in ELM regional average timeseries
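As a rough sketch of the new splitter invocation style: the flag names below follow my reading of the NCO manual and should be checked against ncclimo --help on your machine, and the case name, years, variables, and directories are all hypothetical:

```shell
# Hypothetical: ncclimo splitter mode with explicit start/end years instead
# of piping a file list via stdin. All values are made up for illustration.
cmd="ncclimo --split --caseid=v2.LR.historical --yr_srt=1985 --yr_end=2014"
cmd="$cmd --var=FSNT,TS --drc_in=/path/to/history --drc_out=/path/to/ts"
echo "$cmd"
```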
zppy 2.4.0:
Add native EAMxx support
Add lat_lon_land from e3sm_diags for land diagnostics
Remove waves and eke from MPAS-Analysis generate option
Add e3sm_to_cmip_environment_commands parameter
Next version
Testing of the next version (1.10.0) is planned to begin in just two weeks (mid-April 2024), and deployment is expected before May 1st, 2024, in order to make new features available for the upcoming E3SM Tutorial. To request packages and versions to include, please comment on Next Version.
As always, please email me if there are any questions, or post an issue at https://github.com/E3SM-Project/e3sm-unified/issues.
Cheers,
Xylar and the Infrastructure Team