Ocean/Ice Group Summary & Recap

  • Identified path forward for connecting MPAS analysis & workflow via deployed web app.

  • Interest & enthusiasm around coupled modeling aimed at coastal processes (inundation, BGC, etc.)

  • Identified a path forward for fixing coupled-model problems related to solid (ice) runoff to ocean

  • Common ground between ocean / land / ice BGC & prioritized list of developments for V1 – V3P

  • Identified path forward for supporting MPAS-O GPU port to Titan (will apply to other machines as well)

  • Component / coupled-model groups need to have more discussion / interaction (e.g., for identifying / fixing problems like the Labrador Sea freezing over)

  • We need to push out the in-preparation component-model description papers and get started on science papers that are already possible with existing model capabilities / configurations (e.g., comparison / exploration of ocean-ice, CORE-forced simulations with / without ice-shelf cavities)

  • Agreement on common goals / needs for task tracking / reporting (JIRA Deep Dive Breakout)

Speed Dating Summaries

This page summarizes discussions during the speed-dating sessions with the ocean-ice team (where particular to-do items were requested, the relevant person's name has been flagged with an @).

Coupler-SE

  • discussion regarding CIME5 integration and how to organize its timing relative to other bug fixes the ocean-ice team needs to get in

  • life after Doug ... how are we doing?

  • SE side is doing ok (Jim F. filling in for Git expertise)

  • Jon W. filling in integrator role for ocean-ice team; Mark P. coming up to speed

  • discussion of wider use of Slack (e.g., project-wide vs. just within the SE team)

  • Rob J. says there is already a "channel" per group, but discussion at the level of a "help ticket" should still go through the triage hub

  • Robert Jacob will send out an ACME-all email regarding wider use of Slack

  • Andy S. brings up trying to move MPAS builds over to CMake; we are trying it out, and it has been discussed on MPAS telecons (some notes lost here due to a poor internet connection)

  • Rob J. demoed use of Slack for the rest of the group

Land

  • discussion about interfacing of ocean and land in terms of coastal processes (and Terrestrial Aquatic Interface; TAI)

  • while ACME is the model coastal processes will be built around / will serve, it's not clear that we have the staff / bandwidth within ACME to take this on

  • questions about whether or not MOSART can be used at the coastal interface or if something that more closely unifies the ocean and shallow-water flow is needed

  • there is sediment transport model work going on in MOSART

  • saltwater penetration of the water table is important to look at

  • Keith M. – much more than just sediment transport needs to be considered (Si, DOC, N, P, etc. ... all important nutrient fluxes to the ocean)

Atmosphere

  • Phil R. – discussion of surface mass balance assessment ... since the atmosphere model is starting to settle down; the model is tuned at zeroth order

  • Jeremy F. / Phil R. – discussion of how long a run we need to make an assessment; do we need a fully coupled run (because sea-ice / ocean conditions are also important)?

  • Phil R. – F-case, start w/ 5-yr runs using climatological SSTs; then move from there to 10-yr AMIP simulations

  • Phil J. to Rich N. – this is an area of possible interest to the LIWG and ice-ocean teams from CESM / ACME

  • Chris G. – there is already output from one AMIP-style run to analyze ... is there someone on the ice/ocean team to analyze it? But this output was produced w/ buggy code

  • Phil J. – high-lat biases are likely to exist regardless of the bug in the atmos. code

  • Todd R. – broader question of what our plan is for evaluating / improving high-lat, coupled-model biases

  • Chris G. – do we have tier-1 diagnostics for sfc energy balance?

  • Rich N. – advice is to look at the coupled model right up front (e.g., the Labrador Sea freezing-over problem was a function of atmos., ocean, sea-ice, and land)

  • Susannah B. – tier-1b Southern Ocean diagnostics are almost working with the workflow tool

  • Phil J. – Xuben Z. is available to help w/ analysis of the coupled model w.r.t., e.g., the diurnal cycle of SSTs and its impacts on atmos. convection, etc.

  • Chris G. – whoever on the ocean team is interested in tropical cyclones needs to make sure Chris knows what variables to save for this analysis; a new coupled-model simulation starts next week

Workflow

  • Todd R. – how do we link / leverage workflow tools that we saw demonstrated yesterday?

  • Dean W. – request for liaison with workflow from ocean-ice group; what can workflow do to help ocean-ice team?

  • Milena V. – what tool is Chris G. using?

  • Philip W. – how can we hook our analysis framework into the web-based one we saw yesterday?

  • Dean W. – CDP (the community diagnostics package) can do this (or eventually will be able to)

  • long discussion between Philip W., Milena V., and Dean W. about possibilities for interfacing MPAS analysis with workflow group's tools

  • Dean W. requests that Philip W. write down a list of requests for the workflow team to support this, along with some potential early adopters to test it out

  • Todd R. – our previous idea of having interactive IPython notebooks, which could also be exported as Python scripts to allow both interactive and batch analysis, did not exactly work out. It would be great if workflow could help us fill the gaps left from that effort (see the sketch after this list).

  • Some chatter between Charlie Z. and Todd R. about methods for compressing high-freq output for use in making movies
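
A minimal sketch of the notebook-to-script pattern Todd R. describes (an illustration, not an existing ACME workflow tool; the notebook name ocean_analysis.ipynb is hypothetical): nbconvert can export a notebook to a plain Python script, so the same analysis code can be run interactively in Jupyter or submitted as a batch job.

    # Export an analysis notebook to a plain .py script so the same code can be
    # run interactively (Jupyter) or in batch (e.g., from a job script).
    # Requires the nbformat and nbconvert packages; the filename is hypothetical.
    import nbformat
    from nbconvert import PythonExporter

    nb = nbformat.read("ocean_analysis.ipynb", as_version=4)
    script, _resources = PythonExporter().from_notebook_node(nb)

    with open("ocean_analysis.py", "w") as f:
        f.write(script)

The same conversion is available from the command line via "jupyter nbconvert --to script ocean_analysis.ipynb".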

Coupled

  • (no notes recorded for this session)

Performance

  • Todd R. to perf. team – where are the opportunities to make our models faster?

  • Phil J. – what are we targeting for hi-res ... 15to5km or 18to6km? Pat W. has no experience with 18to6km

  • Pat W. – atmos. and ice are the bottlenecks, not the ocean ... so ice perf. will decide (?)

  • Pat W. – Titan is currently the only machine where ocean-ice perf. on each node can be optimized (by allocating procs. differently for each?)

  • Mat M. – says that recent discussions indicate that we'll probably be using the 18to6km for hi-res rather than the 15to5km

  • Oz has been using a different coupling freq. than Pat W. (to what end?)

  • Phil J. – the perf. team needs to know the final configuration (resolution, coupling freq.) ASAP so they can benchmark / improve perf. for that

  • Adrian T. – thought we already decided on the coupling freq.; don't change from what we previously decided

  • Phil J. – Matt N. has been starting work on a GPU port (for MPAS-O); this will lead to multiple versions of MPAS-O ... how does the MPAS team want to handle this?

  • Todd R. response – doesn't matter too much as long as you aren't touching framework

  • Matt N. – we will not be touching the framework, but we will probably need two source codes for CPU vs. GPU ... how do we avoid having to do the port to GPUs multiple times?

  • wider discussion – what is process for merging changes if we have multiple code bases for GPUs, CPUs, MICs, etc.?

  • Todd R. / others – can we have multiple F90 files in the repo (on master) for use on different architectures? Then the MPAS team is responsible for making / controlling changes across the multiple versions of a single F90 file (?)

  • Phil J. – pushing loops down in the code may require refactoring of CVMix

  • Mark P. – requests clearer updating / posting of existing perf. on various machines to Confluence

  • Philip Jones and Patrick Worley to tidy this up and point us to it? E.g., we add a "quick link" to this that is easy to find from the perf. home page

  • Phil J. – MPAS team should anticipate possibly significant changes to code structure for GPU port

  • questions on why threading perf. improvements seen in stand-alone testing are not translating into perf. improvements in the fully coupled system (e.g., CVMix), with particular reference to the work of Abhinav; Pat W. notes this is consistent with past work as well