Add sfs as valid system (#3243)

Adds sfs as a valid option for NET.

To start, the GEFS system is generally just copied wholesale for SFS.
This includes the extract_vars job.

Other than base and resources, the SFS config files link to the GEFS
versions, just as the GEFS config files point to the GFS versions except
where changes were needed.

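For illustration, that linking could be done with something like the sketch
below (a hypothetical bash sketch, not taken from the commit itself: it
assumes a parm/config/sfs directory mirroring parm/config/gefs, with only
config.base and config.resources kept as SFS-specific copies):

    # Hypothetical sketch of the linking pattern described above.
    # Assumes parm/config/sfs mirrors parm/config/gefs, with only
    # config.base and config.resources kept as SFS-specific copies.
    set -euo pipefail
    gefs_dir="parm/config/gefs"
    sfs_dir="parm/config/sfs"

    for cfg in "${gefs_dir}"/config.*; do
        name=$(basename "${cfg}")
        case "${name}" in
            config.base|config.resources) continue ;;   # SFS has its own copies
        esac
        ln -sf "../gefs/${name}" "${sfs_dir}/${name}"   # relative link into the SFS dir
    done
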
The temporary SFS_POST option has been removed.

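With the toggle gone, the remaining system-specific choices key off the NET
variable instead, as the config.fcst and config.atmos_products changes below
show. A minimal restatement of that pattern (paths taken from the diff; the
comment values are illustrative):

    # NET is "gefs" or "sfs", so a single config file serves both systems.
    export FCSTEXEC="${NET}_model.x"    # resolves to gefs_model.x or sfs_model.x
    export paramlista="${PARMgfs}/product/${NET}.0p25.fFFF.paramlist.a.txt"
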
The existing SFS test is copied and slightly modified for a PR-level CI
test.

Resolves #2271
WalterKolczynski-NOAA authored Feb 19, 2025
1 parent 012c5ea commit 2fdc2f0
Showing 58 changed files with 2,082 additions and 357 deletions.
@@ -1,22 +1,19 @@
experiment:
system: gefs
system: sfs
mode: forecast-only

arguments:
idate: 1994050100
edate: 1994050100
pslot: {{ 'pslot' | getenv }}
app: S2SWA
app: S2S
resdetatmos: 96
resdetocean: 1.0
resensatmos: 96
resdetocean: 1
start: 'cold'
nens: 2
interval: 6
start: warm
comroot: {{ 'RUNTESTS' | getenv }}/COMROOT
expdir: {{ 'RUNTESTS' | getenv }}/EXPDIR
idate: 2020110100
edate: 2020110100
yaml: {{ HOMEgfs }}/ci/cases/yamls/gefs_replay_ci.yaml
icsdir: {{ 'ICSDIR_ROOT' | getenv }}/C96mx100/20240610
yaml: {{ HOMEgfs }}/ci/cases/yamls/sfs_defaults.yaml

skip_ci_on_hosts:
- None
8 changes: 4 additions & 4 deletions ci/cases/sfs/C96mx100_S2S.yaml
@@ -1,9 +1,9 @@
experiment:
system: gefs
system: sfs
mode: forecast-only

arguments:
idate: 1994050100
idate: 1994050100
edate: 1994050100
pslot: {{ 'pslot' | getenv }}
app: S2S
@@ -14,6 +14,6 @@ arguments:
nens: 10
comroot: {{ 'RUNTESTS' | getenv }}/COMROOT
expdir: {{ 'RUNTESTS' | getenv }}/EXPDIR
icsdir: {{ 'TOPICDIR' | getenv }}/HR4/C96mx100
yaml: {{ HOMEgfs }}/ci/cases/yamls/sfs_defaults.yaml
icsdir: {{ 'ICSDIR_ROOT' | getenv }}/C96mx100/20240610
yaml: {{ HOMEgfs }}/ci/cases/yamls/sfs_full.yaml

5 changes: 2 additions & 3 deletions ci/cases/yamls/sfs_defaults.yaml
@@ -9,19 +9,18 @@ base:
DO_AWIPS: "NO"
KEEPDATA: "NO"
DO_EXTRACTVARS: "NO"
FHMAX_GFS: 2976
FHMAX_GFS: 144
FHMAX_HF_GFS: 0
FHOUT_HF_GFS: 1
FHOUT_GFS: 24
FHOUT_OCN_GFS: 24
FHOUT_ICE_GFS: 24
FCST_BREAKPOINTS: ""
FCST_BREAKPOINTS: "48,96"
REPLAY_ICS: "NO"
USE_OCN_ENS_PERTURB_FILES: "YES"
USE_ATM_ENS_PERTURB_FILES: "YES"
HPSSARCH: "NO"
LOCALARCH: "NO"
SFS_POST: "YES"
ACCOUNT: {{ 'HPC_ACCOUNT' | getenv }}
fcst:
TYPE: "hydro"
31 changes: 31 additions & 0 deletions ci/cases/yamls/sfs_full.yaml
@@ -0,0 +1,31 @@
base:
DO_JEDIATMVAR: "NO"
DO_JEDIATMENS: "NO"
DO_JEDIOCNVAR: "NO"
DO_JEDISNOWDA: "NO"
DO_MERGENSST: "NO"
DO_BUFRSND: "NO"
DO_GEMPAK: "NO"
DO_AWIPS: "NO"
KEEPDATA: "NO"
DO_EXTRACTVARS: "NO"
FHMAX_GFS: 2976
FHMAX_HF_GFS: 0
FHOUT_HF_GFS: 1
FHOUT_GFS: 24
FHOUT_OCN_GFS: 24
FHOUT_ICE_GFS: 24
FCST_BREAKPOINTS: ""
REPLAY_ICS: "NO"
USE_OCN_ENS_PERTURB_FILES: "YES"
USE_ATM_ENS_PERTURB_FILES: "YES"
HPSSARCH: "NO"
LOCALARCH: "NO"
ACCOUNT: {{ 'HPC_ACCOUNT' | getenv }}
fcst:
TYPE: "hydro"
MONO: "mono"
reforecast: "YES"
FHZER: 24
ocn:
MOM6_INTERP_ICS: "YES"
16 changes: 12 additions & 4 deletions docs/source/clone.rst
@@ -20,11 +20,11 @@ Clone the `global-workflow` and `cd` into the `sorc` directory:

.. _build_examples:

The build_all.sh script can be used to build all required components of the global workflow. The accepted arguments is a list of systems to be built. This includes builds for GFS and GEFS forecast-only experiments, GSI and GDASApp-based DA for cycled GFS experiments. See `feature availability <hpc.html#feature-availability-by-hpc>`__ to see which system(s) are available on each supported system.
The build_all.sh script can be used to build all required components of the global workflow. The accepted arguments is a list of systems to be built. This includes builds for GFS, GEFS, and SFS forecast-only experiments, GSI and GDASApp-based DA for cycled GFS experiments. See `feature availability <hpc.html#feature-availability-by-hpc>`__ to see which system(s) are available on each supported system.

::

./build_all.sh [gfs] [gefs] [gs] [gdas] [all]
./build_all.sh [gfs] [gefs] [sfs] [gsi] [gdas] [all]

For example, to run GFS experiments with GSI DA, execute:

@@ -34,7 +34,7 @@ For example, to run GFS experiments with GSI DA, execute:

This builds the GFS, UFS-utils, GFS-utils, WW3 with PDLIB (structured wave grids), UPP, GSI, GSI-monitor, and GSI-utils executables.

For coupled cycling (include new UFSDA) execute:
For coupled cycling (using only new UFSDA) execute:

::

@@ -50,6 +50,14 @@ To run GEFS (forecast-only) execute:

This builds the GEFS, UFS-utils, GFS-utils, WW3 *without* PDLIB (unstructure wave grids), and UPP executables.

To run SFS (forecast-only) execute:

::

./build_all.sh sfs

This builds the same components as GEFS, except the UFS model is built in hydrostatic mode.

Once the building is complete, link workflow artifacts such as executables, configuration files, and scripts via

::
@@ -121,7 +129,7 @@ Under the ``/sorc`` folder is a script to build all components called ``build_al
-v:
Execute all build scripts with -v option to turn on verbose where supported

Lastly, pass to build_all.sh a list of systems to build. This includes `gfs`, `gefs`, `sfs` (not fully supported), `gsi`, `gdas`, and `all`.
Lastly, pass to build_all.sh a list of systems to build. This includes `gfs`, `gefs`, `sfs`, `gsi`, `gdas`, and `all`.

For examples of how to use this script, see :ref:`build examples <build_examples>`.

2 changes: 1 addition & 1 deletion docs/source/development.rst
@@ -79,7 +79,7 @@ Continuous Integration (CI)

The global workflow comes fitted with a suite of system tests that run various types of workflow. These tests are commonly run for pull requests before they may be merged into the develop branch. At a minimum, developers are expected to run the CI test(s) that will be impacted by their changes on at least one platform.

The commonly run tests are written in YAML format and can be found in the ``ci/cases/pr`` directory. The ``workflow/generate_workflows.sh`` tool is available to aid running these cases. See the help documentation by running ``./generate_workflows.sh -h``. The script has the capability to prepare the EXPDIR and COMROOT directories for a specified or implied suite of CI tests (see :doc:`setup` for details on these directories). The script also has options to automatically build and run all tests for a given system (i.e. GFS or GEFS and a placeholder for SFS). For instance, to build the workflow and run all of the GFS tests, one would execute
The commonly run tests are written in YAML format and can be found in the ``ci/cases/pr`` directory. The ``workflow/generate_workflows.sh`` tool is available to aid running these cases. See the help documentation by running ``./generate_workflows.sh -h``. The script has the capability to prepare the EXPDIR and COMROOT directories for a specified or implied suite of CI tests (see :doc:`setup` for details on these directories). The script also has options to automatically build and run all tests for a given system (i.e. GFS, GEFS or SFS). For instance, to build the workflow and run all of the GFS tests, one would execute

::

13 changes: 13 additions & 0 deletions docs/source/hpc.rst
@@ -60,6 +60,8 @@ The Global Workflow provides capabilities for deterministic and ensemble forecas
GFS
- Coupled
GEFS
- Coupled
SFS
- GSI
DA
- GDASApp
@@ -78,6 +80,7 @@ The Global Workflow provides capabilities for deterministic and ensemble forecas
- 1
- X
- X
- X
- X
- X
-
@@ -91,6 +94,7 @@ The Global Workflow provides capabilities for deterministic and ensemble forecas
- 1
- X
- X
- X
- X
- X
- X
@@ -104,6 +108,7 @@ The Global Workflow provides capabilities for deterministic and ensemble forecas
- 1
- X
- X
- X
- X
- X
- X
@@ -117,6 +122,7 @@ The Global Workflow provides capabilities for deterministic and ensemble forecas
- 2
- X
- X
- X
- X
- X
-
@@ -130,6 +136,7 @@ The Global Workflow provides capabilities for deterministic and ensemble forecas
- 3
- X
- X
- X
- X
- X
-
@@ -143,6 +150,7 @@ The Global Workflow provides capabilities for deterministic and ensemble forecas
- 3
- X
- X
- X
- X
- X
-
@@ -156,6 +164,7 @@ The Global Workflow provides capabilities for deterministic and ensemble forecas
- 3
- X
- X
- X
- X
-
-
@@ -169,6 +178,7 @@ The Global Workflow provides capabilities for deterministic and ensemble forecas
- 3
- X
- X
- X
-
-
-
@@ -182,6 +192,7 @@ The Global Workflow provides capabilities for deterministic and ensemble forecas
- 3
- X
- X
- X
-
-
-
@@ -195,6 +206,7 @@ The Global Workflow provides capabilities for deterministic and ensemble forecas
- 3
- X
-
-
- X
-
-
@@ -208,6 +220,7 @@ The Global Workflow provides capabilities for deterministic and ensemble forecas
- 3
-
-
-
- X
-
-
2 changes: 1 addition & 1 deletion docs/source/index.rst
@@ -3,7 +3,7 @@
Global Workflow
###############

**Global-workflow** is the end-to-end workflow designed to run global configurations of medium range weather forecasting for the UFS weather model. It supports both development and operational implementations. In its current format it supports the Global Forecast System (GFS) and the Global Ensemble Forecast System (GEFS) configurations
**Global-workflow** is the end-to-end workflow designed to run global configurations of medium range weather forecasting for the UFS weather model. It supports both development and operational implementations. In its current format it supports the Global Forecast System (GFS), Global Ensemble Forecast System (GEFS), and Subseasonal Forecast System (SFS) configurations.

======
Status
16 changes: 7 additions & 9 deletions parm/archive/gefs_arcdir.yaml.j2
@@ -24,15 +24,13 @@
{% endif %}

# Select ensstat files to copy to the arcdir
{% if RUN == "gefs" %}
{% set ensstat_files = [] %}
{% if path_exists(COMIN_ATMOS_ENSSTAT_1p00) %}
{% for fhr in range(ofst_hr, FHMAX_GFS + FHOUT_GFS, FHOUT_GFS) %}
{% do ensstat_files.append([COMIN_ATMOS_ENSSTAT_1p00 ~ "/" ~ head ~ "mean.pres_." ~
"1p00" ~ ".f" ~ '%03d'|format(fhr) ~ ".grib2",
GEFS_ARCH]) %}
{% endfor %}
{% endif %}
{% set ensstat_files = [] %}
{% if path_exists(COMIN_ATMOS_ENSSTAT_1p00) %}
{% for fhr in range(ofst_hr, FHMAX_GFS + FHOUT_GFS, FHOUT_GFS) %}
{% do ensstat_files.append([COMIN_ATMOS_ENSSTAT_1p00 ~ "/" ~ head ~ "mean.pres_." ~
"1p00" ~ ".f" ~ '%03d'|format(fhr) ~ ".grib2",
GEFS_ARCH]) %}
{% endfor %}
{% endif %}
{% set file_set = ensstat_files %}
# Actually write the yaml
13 changes: 4 additions & 9 deletions parm/config/gefs/config.atmos_products
@@ -25,14 +25,9 @@ fi
export FLXGF="NO" # Create interpolated sflux.1p00 file

# paramlist files for the different forecast hours and downsets
if [[ ${SFS_POST} == "YES" ]]; then
export post_prefix='sfs'
else
export post_prefix='gefs'
fi
export paramlista="${PARMgfs}/product/${post_prefix}.0p25.fFFF.paramlist.a.txt"
export paramlista_anl="${PARMgfs}/product/${post_prefix}.0p25.anl.paramlist.a.txt"
export paramlista_f000="${PARMgfs}/product/${post_prefix}.0p25.f000.paramlist.a.txt"
export paramlistb="${PARMgfs}/product/${post_prefix}.0p25.fFFF.paramlist.b.txt"
export paramlista="${PARMgfs}/product/${NET}.0p25.fFFF.paramlist.a.txt"
export paramlista_anl="${PARMgfs}/product/${NET}.0p25.anl.paramlist.a.txt"
export paramlista_f000="${PARMgfs}/product/${NET}.0p25.f000.paramlist.a.txt"
export paramlistb="${PARMgfs}/product/${NET}.0p25.fFFF.paramlist.b.txt"

echo "END: config.atmos_products"
1 change: 0 additions & 1 deletion parm/config/gefs/config.base
@@ -64,7 +64,6 @@ export REALTIME="YES"

# Experiment mode (cycled or forecast-only)
export MODE="@MODE@" # cycled/forecast-only
export SFS_POST="@SFS_POST@" # TODO, place holder until RUN=SFS is developed
export DO_TEST_MODE="@DO_TEST_MODE@" # option to change configuration for automated testing

####################################################
6 changes: 1 addition & 5 deletions parm/config/gefs/config.fcst
@@ -51,11 +51,7 @@ export esmf_logkind="ESMF_LOGKIND_MULTI_ON_ERROR" #Options: ESMF_LOGKIND_MULTI_O

export FORECASTSH="${SCRgfs}/exglobal_forecast.sh"
#export FORECASTSH="${SCRgfs}/exglobal_forecast.py" # Temp. while this is worked on
if [[ "${SFS_POST:-}" == "YES" ]]; then
export FCSTEXEC="sfs_model.x"
else
export FCSTEXEC="gefs_model.x"
fi
export FCSTEXEC="${NET}_model.x"

#######################################################################
# Model configuration
16 changes: 8 additions & 8 deletions parm/config/gefs/config.resources
@@ -312,14 +312,14 @@ case ${step} in
;;

"extractvars")
export walltime_gefs="00:30:00"
export ntasks_gefs=1
export threads_per_task_gefs=1
export tasks_per_node_gefs="${ntasks_gefs}"
export walltime_gfs="${walltime_gefs}"
export ntasks_gfs="${ntasks_gefs}"
export threads_per_tasks_gfs="${threads_per_task_gefs}"
export tasks_per_node_gfs="${tasks_per_node_gefs}"
export walltime="00:30:00"
export ntasks=1
export threads_per_task=1
export tasks_per_node="${ntasks}"
export walltime_gfs="${walltime}"
export ntasks_gfs="${ntasks}"
export threads_per_tasks_gfs="${threads_per_task}"
export tasks_per_node_gfs="${tasks_per_node}"
export is_exclusive=False
;;
