1.2. Technical Overview
1.2.1. Prerequisites
1.2.1.1. Minimum System Requirements
UFS applications, models, and components require a UNIX-based operating system (i.e., Linux or macOS).
Additionally, users will need:
Disk space: ~24GB (11GB for Land DA System [or 6.5GB for Land DA container], 12GB for Land DA data, and ~1GB for staging and output)
26 CPU cores (running with 13 CPU cores may be possible, but this configuration has not been tested)
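As a quick check, users on a UNIX-like system can confirm available cores and disk space with standard utilities:

    # Number of available CPU cores (26 recommended)
    nproc
    # Free disk space on the current filesystem (~24GB needed)
    df -h .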
1.2.1.2. Software Prerequisites
The Land DA System requires:
An MPI implementation
A Fortran compiler
Python
Lmod
spack-stack (v1.6.0)
jedi-bundle (Skylab v7.0)
These software prerequisites are pre-installed in the Land DA container and on other Level 1 systems (see below for details). However, users on non-Level 1 systems will need to install them.
Before using the Land DA container, users will need to install Singularity/Apptainer and an Intel MPI implementation (available for free from Intel).
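As a quick check that these container prerequisites are in place (only one of the two container runtime commands is needed, depending on which is installed):

    # Apptainer is the renamed successor to Singularity; either command suffices
    apptainer --version
    singularity --version
    # Confirm an MPI launcher is available on the PATH
    mpirun --version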
1.2.2. Supported Systems for Running Land DA
Four levels of support have been defined for UFS applications, and the Land DA System operates under this paradigm:
Level 1 (Preconfigured): Prerequisite software libraries are pre-built and available in a central location; code builds; full testing of model.
Level 2 (Configurable): Prerequisite libraries are not available in a centralized location but are expected to install successfully; code builds; full testing of model.
Level 3 (Limited-test platforms): Libraries and code build on these systems, but there is limited testing of the model.
Level 4 (Build-only platforms): Libraries and code build, but running the model is not tested.
1.2.2.1. Level 1 Systems
Preconfigured (Level 1) systems for Land DA already have the required external libraries available in a central location via spack-stack and the jedi-bundle (Skylab v7.0). Land DA is expected to build and run out-of-the-box on these systems, and users can download the Land DA code without first installing prerequisite software. With the exception of the Land DA container, users must have access to these Level 1 systems in order to use them. For the most updated information on stack locations, compilers, and MPI, users can check the build and run version files for their machine of choice.
Platform | Compiler | MPI | spack-stack Installation | jedi-bundle Installation
---|---|---|---|---
Hera | intel/2021.5.0 | impi/2021.5.1 | /scratch1/NCEPDEV/nems/role.epic/spack-stack/spack-stack-1.6.0/envs/fms-2024.01/install/modulefiles/Core | /scratch2/NAGAPE/epic/UFS_Land-DA_Dev/jedi_v7
Orion | intel/2021.9.0 | impi/2021.9.0 | /work/noaa/epic/role-epic/spack-stack/orion/spack-stack-1.6.0/envs/fms-2024.01/install/modulefiles/Core | /work/noaa/epic/UFS_Land-DA_Dev/jedi_v7_stack1.6
Hercules | intel/2021.9.0 | impi/2021.9.0 | /work/noaa/epic/role-epic/spack-stack/hercules/spack-stack-1.6.0/envs/fms-2024.01/install/modulefiles/Core | /work/noaa/epic/UFS_Land-DA_Dev/jedi_v7_hercules
Container | intel-oneapi-compilers/2021.10.0 | intel-oneapi-mpi/2021.9.0 | /opt/spack-stack/spack-stack-1.6.0/envs/unified-env/install/modulefiles/Core (inside the container) | /opt/jedi-bundle (inside the container)
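On a Level 1 system, the spack-stack installation is typically activated by adding its modulefile directory (from the table above) to the module search path and loading the compiler and MPI modules. A minimal sketch for Hera follows; the stack-* module names are assumptions based on spack-stack conventions, so users should confirm the exact names and versions in their machine's build version file:

    module use /scratch1/NCEPDEV/nems/role.epic/spack-stack/spack-stack-1.6.0/envs/fms-2024.01/install/modulefiles/Core
    # Module names below follow spack-stack conventions and may differ per machine
    module load stack-intel/2021.5.0 stack-intel-oneapi-mpi/2021.5.1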
1.2.2.2. Level 2-4 Systems
On non-Level 1 platforms, the Land DA System can be run within a container that includes the prerequisite software; otherwise, the required libraries will need to be installed as part of the Land DA build process. Once these prerequisite libraries are installed, Land DA should build and run successfully. However, users may need to perform additional troubleshooting on Level 3 or 4 systems since little or no pre-release testing has been conducted on these systems.
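For example, a containerized run on a Level 2-4 platform typically begins by pulling the Land DA container image with Singularity/Apptainer. The registry path and tag below are placeholders; users should consult the release notes for the current image name:

    # Image name/tag is a placeholder -- check the Land DA release notes for the current one
    singularity pull landda.img docker://noaaepic/ubuntu20.04-intel-landda:release-public-v1.2.0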
1.2.3. Code Repositories and Directory Structure
1.2.3.1. Hierarchical Repository Structure
The main repository for the Land DA System is named land-DA_workflow; it is available on GitHub at https://github.com/ufs-community/land-DA_workflow. This umbrella repository uses Git submodules and an app_build.sh file to pull in code from the appropriate versions of external repositories associated with the Land DA System. Table 1.2 describes the various subrepositories that form the UFS Land DA System.
Land DA Submodule Name | Repository Name | Repository Description | Authoritative Repository URL
---|---|---|---
apply_incr.fd | land-apply_jedi_incr | Contains code that applies the JEDI-generated DA increment to UFS | 
ufs_model.fd | ufs-weather-model | Repository for the UFS Weather Model (WM). This repository contains a number of subrepositories, which are documented in the WM User's Guide. | https://github.com/ufs-community/ufs-weather-model
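To retrieve the umbrella repository together with these submodules, users can clone it recursively (see also Section 1.2.3.2):

    git clone --recursive https://github.com/ufs-community/land-DA_workflow.git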
Note
The prerequisite libraries (including NCEP Libraries and external libraries) are not included in the UFS Land DA System repository. The spack-stack repository assembles these prerequisite libraries. Spack-stack has already been built on preconfigured (Level 1) platforms. However, it must be built on other systems. See the spack-stack Documentation for details on installing spack-stack.
1.2.3.2. File & Directory Structure
The land-DA_workflow repository is evolving to follow the NCEP Central Operations (NCO) WCOSS Implementation Standards. When the land-DA_workflow repository is cloned with the --recursive argument, the specific GitHub repositories described in /sorc/app_build.sh are cloned into sorc. The diagram below illustrates the file and directory structure of the Land DA System. Directories in parentheses () are only visible after the build step. Some files and directories have been removed for brevity.
land-DA_workflow
├── doc
├── (exec)
├── fix
├── jobs
├── (lib*)
├── modulefiles
├── parm
│   ├── jedi
│   ├── templates
│   │   └── template.land_analysis.yaml
│   ├── check_release_outputs.sh
│   ├── detect_platform.sh
│   ├── parm_xml_<platform>.yaml
│   └── launch_rocoto_wflow.sh
├── scripts
├── sorc
│   ├── apply_incr.fd
│   │   └── sorc
│   │       ├── apply_incr_noahmp_snow.f90
│   │       └── NoahMPdisag_module.f90
│   ├── (build)
│   ├── cmake
│   ├── (conda)
│   │   └── envs
│   │       └── land_da
│   ├── test
│   │   ├── <platform>_ctest.sh
│   │   └── run_<platform>_ctest.sh
│   ├── tile2tile_converter.fd
│   ├── ufs_model.fd
│   ├── CMakeLists.txt
│   └── app_build.sh
├── ush
│   ├── fill_jinja_template.py
│   ├── hofx_analysis_stats.py
│   ├── letkf_create_ens.py
│   └── plot_forecast_restart.py
├── versions
├── LICENSE
└── README.md
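The directories shown in parentheses above appear after the build script in sorc has been run. A minimal sketch of that step follows; app_build.sh may require platform or compiler options on some systems, so users should check its usage output:

    # From the top level of a recursively cloned checkout; options are machine-dependent
    cd land-DA_workflow/sorc
    ./app_build.sh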
Table 1.3 describes the contents of the most important Land DA subdirectories. Section 1.2.3.1 describes the Land DA System components. Users may reference the NCO Implementation Standards (p. 19) for additional details on repository structure in NCO-compliant repositories.
Directory Name | Description
---|---
doc | Repository documentation
exec | Binary executables
fix | Location of fix/static files
jobs | J-job scripts launched by Rocoto
lib | Model-specific libraries
modulefiles | Files that load the modules required for building and running the workflow
parm | Parameter files used to configure the model, physics, workflow, and various components
scripts | Scripts launched by the J-jobs
sorc | External source code used to build the Land DA System
ush | Utility scripts
versions | Contains the build and run version files for each supported platform
1.2.3.3. The UFS Land Component
The UFS Land DA System has been updated to build and run the UFS Noah-MP land component. The land component makes use of a National Unified Operational Prediction Capability (NUOPC) cap to interface with a coupled modeling system. This Noah-MP NUOPC cap is able to create an Earth System Modeling Framework (ESMF) multi-tile grid by reading in a mosaic grid file. For the domain, the Flexible Modeling System (FMS) initializes reading and writing of the cubed-sphere tiled output. Then, the Noah-MP land component reads static information and initial conditions (e.g., surface albedo) and interpolates the data to the date of the simulation. The solar zenith angle is calculated based on the time information.
1.2.3.4. Unified Workflow (UW) Tools
The Unified Workflow (UW) is a set of tools intended to unify the workflow for various UFS applications under one framework. The UW toolkit includes rocoto, template, and configuration (config) tools, and additional tools and drivers are under development. The Land DA workflow makes use of the template tool to fill in user-specified values in the configuration file. It then uses the rocoto tool to generate a workflow XML file from the configuration file; other UW tools may be incorporated into the workflow in the future. More details about UW tools can be found in the uwtools GitHub repository and in the UW Documentation.
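A minimal sketch of these two steps, assuming the uwtools CLI and reusing file names from the directory tree in Section 1.2.3.2 (the exact flags, file pairings, and names may differ by release):

    # Fill user-specified values into the workflow configuration template
    uw template render \
      --input-file parm/templates/template.land_analysis.yaml \
      --values-file parm/parm_xml_<platform>.yaml \
      --output-file land_analysis.yaml
    # Generate the Rocoto workflow XML from the rendered configuration
    uw rocoto realize \
      --input-file land_analysis.yaml \
      --output-file land_analysis.xml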