E3SM Code Development Process for Collaborators

The goal of the E3SM Code Development Process is to achieve high developer productivity toward the improvement of E3SM. A degree of formality is needed because of the large size of the E3SM team, its geographic distribution, and the large scope of the E3SM project, particularly when collaborations are included.

The process is meant to promote the quality of the E3SM model along many dimensions: high-quality science, verified implementations, high-performing implementations, portability to DOE computer architectures, and maintainability/extensibility by those other than the original developers. The process intends to give individuals and small teams the ability to independently develop new features and algorithms, yet with a clear path to incorporation into the E3SM model. The expectation is that collaborators (as well as E3SM developer team members) achieve a level of expertise in code practice that can be sustained and maintained across the team without requiring intervention.

Main steps of the E3SM code development process:

  1. Code development should begin from the head of the master branch, or the most recent maintenance (maint) branch, of E3SM from the GitHub repository.
  2. Code should be of high enough quality to be readable, modifiable, and maintainable by others. 
  3. Code development should follow E3SM standards for using git, e.g., preparing a separate commit for each stand-alone contribution, each with a well-formed commit message.
  4. Developers must run the E3SM developer test suite to verify that their development has not unintentionally changed the code's behavior. All new development must be accompanied by tests that protect the new feature against future changes.
  5. See Speculative Long-term Development if you will be working on your changes for an extended period.
  6. For eventual inclusion of a new feature or algorithm into the E3SM model, documentation supporting the main steps of the E3SM Code Review process should be created:
    1. A design document (or paper) detailing the equations/algorithms that are being implemented
    2. Verification evidence demonstrating that the implementation is correct
    3. Performance analysis and data showing the expected and measured performance impact of the new feature
    4. Validation evidence that the feature matches observational data
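Steps 1 and 3 above can be sketched as a git session. The branch name and commit text below are illustrative, not mandated E3SM conventions, and a throwaway local repository stands in for the real E3SM remote so the commands can run anywhere:

```shell
# Sketch of the expected git workflow; the local repo below is a stand-in
# for a clone of https://github.com/E3SM-Project/E3SM.
set -e
demo=$(mktemp -d)
cd "$demo"
git init -q
git config user.email "dev@example.com"   # local identity for the demo repo only
git config user.name "E3SM Developer"
git commit -q --allow-empty -m "stand-in for the head of master"
# One topic branch per stand-alone contribution
git checkout -q -b username/component/short-feature-name
echo "new feature code" > feature.F90
git add feature.F90
# Well-formed commit message: short summary line, blank line, then details
git commit -q -m "Add new feature to component X

Describe what the contribution does and why, so reviewers and
future maintainers can follow the history."
git log --oneline -1
```

In real development the branch would start from the head of master (or the relevant maint branch) of the E3SM repository rather than from a fresh local repo.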
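For step 4, the developer test suite is driven through CIME's `create_test` script. The sketch below assumes the conventional location of that script inside an E3SM clone and the `e3sm_developer` suite name; the guard lets it run harmlessly when no E3SM clone is present:

```shell
# Hedged sketch of invoking the developer test suite from the top
# of an E3SM clone; paths and suite name are assumptions here.
cmd="create_test e3sm_developer"
if [ -x ./cime/scripts/create_test ]; then
  ./cime/scripts/$cmd
else
  echo "No E3SM clone found here; would run: $cmd"
fi
```

Consult the E3SM testing documentation for the machine- and compiler-specific options `create_test` accepts on your platform.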