...

  1. The cori optimized build hits an internal compiler error (in shoc_assumed_pdf.cpp?). Avoid this by building in debug mode.

  2. The perlmutter optimized build yields corrupted answers (a really hot planet). Avoid this by building in debug mode.

  3. ne120 currently fails with OOM(?) errors when SPA is active. Delete that process in namelist_scream.xml.

Step 1: Conda environment (*not necessary on NERSC machines)

The SCREAMv1 build requires a handful of Python libraries. The easiest way to provide them is to create a conda environment with those packages installed. For example,

...
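A sketch of such an environment (the package list here is an assumption, not the authoritative set; check the SCREAM documentation for the exact requirements):

```shell
# Hypothetical package set; adjust to match the SCREAM documentation.
conda create -n scream-build -c conda-forge python=3.9 pyyaml pylint psutil
conda activate scream-build
```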

ne4 (max = 96)

ne30 (max = 5,400)

ne120 (max = 86,400)

ne256 (max = 393,216)

ne512 (max = 1,572,864)

ne1024 (max = 6,291,456)
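The per-resolution maxima above follow directly from the cubed-sphere element count: an ne-N grid has 6 faces of N&times;N spectral elements, so (assuming the limit is one MPI task per element, which matches every entry in the list) the maximum is 6&middot;N&sup2;. A quick check:

```python
# Maximum MPI task count per resolution, assuming one task per
# cubed-sphere element: 6 faces * ne^2 elements per face.
def max_tasks(ne):
    return 6 * ne ** 2

for ne, expected in [(4, 96), (30, 5400), (120, 86400),
                     (256, 393216), (512, 1572864), (1024, 6291456)]:
    assert max_tasks(ne) == expected
```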

cori-knl (68 cores/node; 96+16 GB/node)

16x1

NTASKS=675

perlmutter (64 cores/node; 4 GPUs/node; 256 GB/node)

NTASKS=12

syrah (16 cores/node; 64 GB/node)

32x1

160x1

320x1

quartz (36 cores/node; 128 GB/node)

72x1

180x1

360x1

summit (8 cores/node?; 6 GPUs/node; 512+96 GB/node)
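Given an NTASKS value and a machine's cores per node, the node request is a ceiling division (a convenience sketch; it assumes one MPI task per core, i.e. a tasks&times;1 layout like the 16x1 entries above):

```python
import math

def nodes_needed(ntasks, cores_per_node):
    """Nodes required to place ntasks at one task per core."""
    return math.ceil(ntasks / cores_per_node)

# e.g. the cori-knl layout above: 675 tasks on 68-core nodes
print(nodes_needed(675, 68))  # -> 10
```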

...

Code Block
${CODE_ROOT}/cime/scripts/create_newcase --case ${CASE_NAME} --compset ${COMPSET} --res ${RES} --pecount ${PECOUNT} --walltime 00:30:00 --queue ${QUEUE}

** For Perlmutter (PM): also specify --compiler gnugpu and --project e3sm_g

Then cd ${CASE_NAME}

Step 4: Change CIME settings (if desired)

...

Code Block
./xmlchange ATM_NCPL=288 #number of atmosphere couplings per day
./xmlchange DEBUG=TRUE #debug rather than optimized build
./xmlchange JOB_QUEUE=pdebug #debug queue on cori and perlmutter
./xmlchange JOB_WALLCLOCK_TIME=0:30:00
./xmlchange STOP_OPTION=ndays #units for how long to run
./xmlchange STOP_N=1 #run for 1 day
./xmlchange HIST_OPTION=ndays #units for how often to write cpl.hi files
./xmlchange HIST_N=1 #write coupler history every day
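ATM_NCPL is the number of atmosphere couplings per model day, so the implied coupling interval is 86400 s divided by that value. A quick sanity check (not part of the case scripts):

```python
# Coupling interval implied by ATM_NCPL (atmosphere couplings per day).
SECONDS_PER_DAY = 86400

def coupling_dt(atm_ncpl):
    return SECONDS_PER_DAY / atm_ncpl

print(coupling_dt(288))  # -> 300.0 seconds (5 minutes)
```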

...

As of this writing, this is done by modifying namelist_scream.xml, either by hand or with the atm-config-chg script that now comes bundled when you create a case. Explore namelist_scream.xml for variables you might want to change (though you shouldn't have to change anything to run).

** For Perlmutter (PM): in env_batch.xml, set --gpu-bind=none for the "gnugpu" compiler:

Code Block
<directives compiler="gnugpu">
      <directive> --gpus-per-task=1</directive>
      <directive> --gpu-bind=none</directive>
</directives>

Step 6: Config/Compile/Run

...
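As a sketch, the usual CIME flow from the case directory looks like this (standard CIME case commands; see the full step details for machine-specific variations):

```shell
cd ${CASE_NAME}
./case.setup   # configure the case
./case.build   # compile (use DEBUG=TRUE on cori/perlmutter per the known issues above)
./case.submit  # submit the run to the batch queue
```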

  1. Change Vertical__Coordinate__Filename to use the initial condition file for your new resolution

  2. Change Filename under Initial__Conditions → Physics__GLL subsection to also use that new initial condition file

  3. Change SPA__Remap__File to use one appropriate to map ne30 to your new resolution

  4. Change se_ne as appropriate

  5. Change se_tstep and nu_top (recommended defaults for these and dtime are given in the table on EAM's HOMME Dycore Recommended Settings (THETA) page)
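Several of the dycore settings above can be applied with atm-config-chg from the case directory. A sketch, assuming a hypothetical ne120 target (the se_tstep and nu_top values are placeholders, not vetted settings; take real numbers from the HOMME Dycore Recommended Settings (THETA) table):

```shell
# Placeholder values for an assumed ne120 case.
./atm-config-chg se_ne=120
./atm-config-chg se_tstep=75
./atm-config-chg nu_top=1e5
```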

** Other options **

Changing output frequency and variables

  1. Modify the ./data/scream_output.yaml file under the run directory

Running with non-hydrostatic dycore

  1. Change tstep_type=9 in namelist_scream.xml (or run ./atm-config-chg tstep_type=9 in the case directory)

  2. Change theta_hydrostatic_mode=False (or run ./atm-config-chg theta_hydrostatic_mode=False)