
The Design Document page provides a description of the algorithms, the implementation, and the planned testing, including unit, verification, validation, and performance testing. Please read Step 1.3 Performance Expectations, which explains feature documentation requirements from the performance group's point of view.

Design Document


The first table in the Design Document gives an overview of this document; from this information the Design Documents Overview page is automatically created.

In the overview table below, 4. Equ means Equations and Algorithms, 5. Ver means Verification, 6. Perf means Performance, and 7. Val means Validation.

  • Equations: Document the equations that are being solved and describe the algorithms
  • Verification Plans: Define tests that will be run to show that the implementation is correct and robust. Include unit tests that cover a range of inputs, as well as benchmarks.
  • Performance Expectations: Explain the expected performance impact of this development
  • Validation Plans: Document what process-based, stand-alone component, and coupled model runs will be performed, and what metrics will be used to assess validity

Use the symbols below (copy and paste) to indicate whether each section is not started, in progress, or done: (tick) - completed, (warning) - in progress, (error) - not started.

 

Overview table for the owner and an approver of this feature

1. Description: Create a land model benchmarking capability for ACME
2. Owner: Forrest M. Hoffman (Unlicensed)
3. Created:
4. Equ: (warning)
5. Ver: (warning)
6. Perf: (warning)
7. Val: (warning)
8. Approver: William Riley (Unlicensed), Peter Thornton
9. Approved Date:

Title: Create a Land Model Benchmarking Capability for ACME

Requirements and Design

ACME Land Group

Date:  

Summary

A land model benchmarking package will be implemented, leveraging the ongoing ILAMB development in the RGCM-funded BGC Feedbacks SFA, for routine and systematic assessment of land model fidelity through comparison with observational data. Initially, a recent version of the ILAMB prototype will be installed and tested using ACME Land Model (ALM) output. New model–data comparisons will be added by the team of ACME benchmark developers for evaluating the impact of new feature implementation on the overall scientific performance of ALM in coupled and uncoupled configurations. The Workflow Team will be encouraged to adopt this evolving benchmarking package for operational use for evaluating relevant contemporary land and Earth system model simulations. A next-generation version of ILAMB, currently under development and offering parallel processing capabilities, will be used after v1 model activities are complete.

Requirements

Requirement: Install and test the ILAMB prototype code and processed observational data on rhea.ccs.ornl.gov

Date last modified:  
Contributors: Forrest M. Hoffman (Unlicensed)

A recent version of the ILAMB prototype code (nominally version 1.2.1) and the associated processed observational data will be installed on rhea.ccs.ornl.gov. This package will be tested using sample output from CLM4.0CN, CLM4.5BGC, and a corresponding offline ALM simulation.

Algorithmic Formulations

Design solution: Scoring metrics will be developed as needed for model–data comparison

Date last modified:  
Contributors: Forrest M. Hoffman (Unlicensed)

All new observational data will be accompanied by metrics for comparison with model output. These metrics will be clearly documented in the evolving .
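To illustrate the kind of scoring metric envisioned for model–data comparison, ILAMB-style benchmarking typically normalizes an error statistic by the variability of the observations and maps it into a score between 0 and 1. The sketch below is a hypothetical example of that idea only; the function names, and the choice of the observed standard deviation as the normalization, are assumptions for illustration and do not reproduce the actual ILAMB formulation.

```python
import math
from statistics import mean, pstdev

def bias_score(model, obs):
    """Illustrative bias score mapped into (0, 1].

    A score of 1.0 means the model and observed means agree exactly;
    the score decays toward 0 as the normalized bias grows.
    This is a sketch, not the actual ILAMB metric.
    """
    bias = mean(model) - mean(obs)
    rel_err = abs(bias) / pstdev(obs)  # normalize by observed variability
    return math.exp(-rel_err)

def rmse_score(model, obs):
    """Illustrative centered-RMSE score, likewise mapped into (0, 1]."""
    mm, om = mean(model), mean(obs)
    # Centered RMSE: compare anomalies so the bias is scored separately.
    crmse = math.sqrt(mean(
        ((m - mm) - (o - om)) ** 2 for m, o in zip(model, obs)
    ))
    return math.exp(-crmse / pstdev(obs))
```

In this scheme, identical model and observed series score 1.0 on both metrics, and a model with a constant offset keeps a perfect centered-RMSE score while its bias score drops below 1.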

Design and Implementation

Implementation: short-description-of-implementation-here

Date last modified:
Contributors: (add your name to this list if it does not appear)

 

This section should detail the plan for implementing the design solution for requirement XXX. In general, this section is software-centric with a focus on software implementation. Pseudo code is appropriate in this section. Links to actual source code are appropriate. Project management items, such as svn branches, timelines and staffing are also appropriate. How do we typeset pseudo code?

 

Planned Verification and Unit Testing 

Verification and Unit Testing: short-description-of-testing-here

Date last modified:  
Contributors: (add your name to this list if it does not appear)

 

How will XXX be tested? That is, how will we know when we have met requirement XXX? Will these unit tests be included in the ongoing test suite going forward?

Planned Validation Testing 

Validation Testing: short-description-of-testing-here

Date last modified:
Contributors: (add your name to this list if it does not appear)

 

How will XXX be tested? What observational or other dataset will be used? That is, how will we know when we have met requirement XXX? Will these tests be included in the ongoing test suite going forward?

Planned Performance Testing 

Performance Testing: short-description-of-testing-here

Date last modified:
Contributors: (add your name to this list if it does not appear)

 

How will XXX be tested? That is, how will we know when we have met requirement XXX? Will these tests be included in the ongoing test suite going forward?

 

 
