The E3SM model can be run inside a Singularity container. This page describes how to install Singularity on your laptop or workstation, build or download the E3SM container, and run the model within it.
Install Singularity
Linux: Install Singularity as described at Singularity Installation.
Mac terminal: https://singularity.lbl.gov/install-mac
Mac desktop app: https://sylabs.io/singularity-desktop-macos/
Windows: https://singularity.lbl.gov/install-windows
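Once installed, a quick sanity check confirms the runtime is usable; a minimal sketch, assuming `singularity` should now be on your PATH:

```shell
# Print the installed Singularity version, or a hint if the install failed.
if command -v singularity >/dev/null 2>&1; then
  singularity --version
else
  echo "singularity not found on PATH"
fi
```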
Download the E3SM container
The latest version of the E3SM container can be downloaded from e3sm.sif.
Singularity on supported machines
Singularity is already available on some of E3SM’s supported machines.
Anvil: module load singularity/3.5.2 (earlier versions also available).
Theta: https://www.alcf.anl.gov/support-center/theta/singularity-theta
Cooley (also part of ALCF): https://www.alcf.anl.gov/support-center/cooley/singularity-cooley
Run the container on Anvil
Log in to Blues at LCRC, reserve an Anvil node for an interactive job, and run a case:
[lukasz@blueslogin4 ~]$ module load singularity
[lukasz@blueslogin4 ~]$ srun --pty -p acme-small -t 01:00:00 /bin/bash
[lukasz@b566]$ singularity shell --hostname singularity e3sm.sif
Singularity> cd E3SM/cime/scripts/
Singularity> ./create_newcase --case singularity.A_WCYCL1850.ne4_oQU240.baseline --compset A_WCYCL1850 --res ne4_oQU240 --mach singularity
Singularity> cd singularity.A_WCYCL1850.ne4_oQU240.baseline/
Singularity> ./case.setup
Singularity> ./case.build
Singularity> ./case.submit
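To confirm the run completed, you can inspect the CaseStatus file that CIME maintains in the case directory; a sketch, run from inside the case directory:

```shell
# CaseStatus records each phase (setup, build, submit, run) with a timestamp;
# a successful run ends with a "success" entry.
if [ -f CaseStatus ]; then
  tail CaseStatus
else
  echo "CaseStatus not found; run this from the case directory"
fi
```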
Advanced: Rebuilding the E3SM container
If you would like to make any changes to the container, you can build it from scratch. The E3SM Singularity definition file is available at e3sm.def. After editing the definition file, build a new container:
sudo singularity build e3sm.sif e3sm.def
It may take up to an hour to create a new container.
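If you are iterating on the definition file, rebuilding the full image on every change is slow. Singularity's standard `--sandbox` and `--writable` options let you test changes interactively first; a sketch (the `e3sm_sandbox/` directory name is an arbitrary choice):

```shell
# Build the container as a writable directory tree instead of a .sif image.
sudo singularity build --sandbox e3sm_sandbox/ e3sm.def

# Open a writable shell inside the sandbox to test changes interactively.
sudo singularity shell --writable e3sm_sandbox/

# Once satisfied, convert the sandbox into a final immutable image.
sudo singularity build e3sm.sif e3sm_sandbox/
```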
Run the E3SM developer tests
Get a clone of the master branch with all submodules:
git clone --recursive git@github.com:E3SM-Project/E3SM.git
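If you already have a clone that was made without `--recursive`, the submodules can be fetched afterwards:

```shell
# Fetch and initialize all nested submodules in an existing clone.
git submodule update --init --recursive
```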
Add a corresponding machine element to cime_config/machines/config_machines.xml:
<machine MACH="singularity">
  <DESC>Singularity container</DESC>
  <NODENAME_REGEX>singularity</NODENAME_REGEX>
  <OS>LINUX</OS>
  <COMPILERS>gnu</COMPILERS>
  <MPILIBS>mpich</MPILIBS>
  <CIME_OUTPUT_ROOT>$ENV{HOME}/projects/e3sm/scratch</CIME_OUTPUT_ROOT>
  <DIN_LOC_ROOT>$ENV{HOME}/projects/e3sm/cesm-inputdata</DIN_LOC_ROOT>
  <DIN_LOC_ROOT_CLMFORC>$ENV{HOME}/projects/e3sm/ptclm-data</DIN_LOC_ROOT_CLMFORC>
  <DOUT_S_ROOT>$ENV{HOME}/projects/e3sm/scratch/archive/$CASE</DOUT_S_ROOT>
  <BASELINE_ROOT>$ENV{HOME}/projects/e3sm/baselines/$COMPILER</BASELINE_ROOT>
  <CCSM_CPRNC>$CCSMROOT/tools/cprnc/build/cprnc</CCSM_CPRNC>
  <GMAKE>make</GMAKE>
  <GMAKE_J>16</GMAKE_J>
  <TESTS>e3sm_developer</TESTS>
  <BATCH_SYSTEM>none</BATCH_SYSTEM>
  <SUPPORTED_BY>lukasz at uchicago dot edu</SUPPORTED_BY>
  <MAX_TASKS_PER_NODE>16</MAX_TASKS_PER_NODE>
  <MAX_MPITASKS_PER_NODE>16</MAX_MPITASKS_PER_NODE>
  <mpirun mpilib="default">
    <executable>mpirun</executable>
    <arguments>
      <arg name="num_tasks"> -launcher fork -hosts localhost -np {{ total_tasks }}</arg>
    </arguments>
  </mpirun>
  <module_system type="none"/>
  <RUNDIR>$ENV{HOME}/projects/e3sm/scratch/$CASE/run</RUNDIR>
  <EXEROOT>$ENV{HOME}/projects/e3sm/scratch/$CASE/bld</EXEROOT>
  <environment_variables>
    <env name="E3SM_SRCROOT">$SRCROOT</env>
  </environment_variables>
  <environment_variables mpilib="mpi-serial">
    <env name="NETCDF_PATH">/usr/local/packages/netcdf-serial</env>
    <env name="PATH">/usr/local/packages/cmake/bin:/usr/local/packages/hdf5-serial/bin:/usr/local/packages/netcdf-serial/bin:$ENV{PATH}</env>
    <env name="LD_LIBRARY_PATH">/usr/local/packages/szip/lib:/usr/local/packages/hdf5-serial/lib:/usr/local/packages/netcdf-serial/lib</env>
  </environment_variables>
  <environment_variables mpilib="!mpi-serial">
    <env name="NETCDF_PATH">/usr/local/packages/netcdf-parallel</env>
    <env name="PNETCDF_PATH">/usr/local/packages/pnetcdf</env>
    <env name="HDF5_PATH">/usr/local/packages/hdf5-parallel</env>
    <env name="PATH">/usr/local/packages/cmake/bin:/usr/local/packages/mpich/bin:/usr/local/packages/hdf5-parallel/bin:/usr/local/packages/netcdf-parallel/bin:/usr/local/packages/pnetcdf/bin:$ENV{PATH}</env>
    <env name="LD_LIBRARY_PATH">/usr/local/packages/mpich/lib:/usr/local/packages/szip/lib:/usr/local/packages/hdf5-parallel/lib:/usr/local/packages/netcdf-parallel/lib:/usr/local/packages/pnetcdf/lib</env>
  </environment_variables>
</machine>
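A stray tag in this file will break all of CIME, so it may be worth checking the edited file is still well-formed before proceeding; a sketch, assuming the common `xmllint` tool is available:

```shell
# Check that the edited machine config is still well-formed XML.
xmllint --noout cime_config/machines/config_machines.xml \
  && echo "config_machines.xml is well-formed"
```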
Then add a corresponding compiler element to cime_config/machines/config_compilers.xml:
<compiler COMPILER="gnu" MACH="singularity">
  <HDF5_PATH> $ENV{HDF5_PATH}</HDF5_PATH>
  <NETCDF_PATH> $(NETCDF_PATH)</NETCDF_PATH>
  <PNETCDF_PATH> $(PNETCDF_PATH)</PNETCDF_PATH>
  <ADD_SLIBS> $(shell $(NETCDF_PATH)/bin/nf-config --flibs) -lblas -llapack</ADD_SLIBS>
</compiler>
At this point you can run the container. First create the input-data directory assumed by the machine entry above (input data will be downloaded here and can grow large):
mkdir -p $HOME/projects/e3sm/cesm-inputdata
singularity shell --hostname singularity e3sm.sif
Singularity> cd <E3SM_SRC_DIR>/cime/scripts
Singularity> ./create_test e3sm_developer
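create_test also accepts individual test names, which is useful while developing; a sketch (the test name below is an illustrative example in CIME's TESTTYPE.GRID.COMPSET format, not one the page prescribes):

```shell
# Run a single smoke test (SMS) at the ultra-low ne4_oQU240 resolution.
./create_test SMS.ne4_oQU240.A_WCYCL1850 --machine singularity
```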
The e3sm_developer suite creates and runs 36 cases. To create and run a single case separately, for example A_WCYCL1850 at ultra-low resolution:
Singularity> ./create_newcase --case master.A_WCYCL1850.ne4_oQU240.baseline --compset A_WCYCL1850 --res ne4_oQU240
Singularity> cd master.A_WCYCL1850.ne4_oQU240.baseline
Singularity> ./case.setup
Singularity> ./case.build
Singularity> ./case.submit
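After submission, CIME's xmlquery tool (run inside the case directory) reports where the output for this case lands; for example:

```shell
# Print the run directory CIME resolved for this case
# (per the machine entry above: $HOME/projects/e3sm/scratch/$CASE/run).
./xmlquery RUNDIR
```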
For more details on how to create and run a new case, please refer to the E3SM Quick Start guide.