WACCM-X

The Community Atmosphere Model (CAM) extends throughout Earth’s troposphere. The Whole Atmosphere Community Climate Model (WACCM) extends CAM further into the stratosphere and thermosphere. The Whole Atmosphere Community Climate Model - eXtended (WACCM-X) extends WACCM further still, up to ~500 km altitude, and includes the ionosphere.

The overview paper for WACCM-X 2.0 (Liu et al. 2018 [1]) describes a configuration that uses the finite volume dynamical core. The paper also notes that WACCM-X is based on CAM-4 physics and uses the f19 atmospheric grid, which has a horizontal resolution of 1.9° in latitude and 2.5° in longitude.

Important

It may not be possible to compile WACCM-X with the spectral element dycore. The safest approach is to first build a test case of the model with the finite volume dycore.

Tutorial

The WACCM-X tutorial demonstrates how to build a WACCM-X case.

cd /glade/work/johnsonb/cesm2_2_0/cime/scripts
./create_newcase --res f19_f19 --compset FXHIST --case /glade/work/johnsonb/cases/f.e20.FXHIST.f19_f19.001 --mach cheyenne --project $DARES_PROJECT --run-unsupported
cd /glade/work/johnsonb/cases/f.e20.FXHIST.f19_f19.001
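
From the case directory, the standard CIME workflow applies: set up, build, and (if desired) submit the case. A minimal sketch (the attempts below run case.setup and case.build explicitly):

./case.setup
./case.build
./case.submit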

Note

Liu et al. (2018) note that the time steps for WACCM-X and CAM differ significantly. For example, CAM’s time step on the f19_f19 grid is 30 minutes, while WACCM-X’s is 5 minutes. Lauritzen et al. (2018) [2] note that the time step for the roughly 2° spectral element grid, ne16np4, is also 30 minutes.
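
In a CESM case, the atmosphere time step is controlled through the XML variable ATM_NCPL (the number of atmosphere coupling intervals per day); under CAM’s usual convention that the physics time step equals 86400/ATM_NCPL seconds, ATM_NCPL=48 corresponds to a 30-minute step and ATM_NCPL=288 to a 5-minute step. It can be checked from the case directory created above:

cd /glade/work/johnsonb/cases/f.e20.FXHIST.f19_f19.001
./xmlquery ATM_NCPL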

First attempt at building FXHIST for the ne16 grid

cd /glade/work/johnsonb/cesm2_2_0/cime/scripts
./create_newcase --res ne16_g17 --compset FXHIST --case /glade/work/johnsonb/cases/f.e20.FXHIST.ne16_g17.001 --mach cheyenne --project $DARES_PROJECT --run-unsupported
cd /glade/work/johnsonb/cases/f.e20.FXHIST.ne16_g17.001
./case.setup
./case.build

This results in the following error:

Error

ERROR: Command: '/glade/work/johnsonb/cesm2_2_0/components/cam/bld/configure -s -fc_type intel -dyn se -hgrid ne16np4 -cpl mct -usr_src /glade/work/johnsonb/cases/f.e20.FXHIST.ne16_g17.001/SourceMods/src.cam -spmd -nosmp -ocn docn -phys cam6 -waccmx -ionosphere wxie -chem waccm_ma_mam4' failed with error 'ERROR: Ionosphere is only available for FV dycore' from dir '/glade/work/johnsonb/cases/f.e20.FXHIST.ne16_g17.001/Buildconf/camconf'

This limitation is also reflected in ACOM’s geospace roadmap.
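
Most of the configure options shown in the error (for example -waccmx, -ionosphere wxie, and -chem waccm_ma_mam4) come from the case’s CAM_CONFIG_OPTS XML setting, while -dyn se follows from the chosen atmosphere grid. The setting can be inspected from the case directory:

cd /glade/work/johnsonb/cases/f.e20.FXHIST.ne16_g17.001
./xmlquery CAM_CONFIG_OPTS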

Second attempt at building FXHIST for the ne16 grid

This CAM pull request suggests that the ionosphere limitation was fixed in a more recent version of CAM intended for CESM2.3. Check out a newer release of CESM and try again.

cd /glade/work/johnsonb
git clone https://github.com/ESCOMP/CESM cesm2_3_0
cd cesm2_3_0
git checkout cesm2_3_beta01
./manage_externals/checkout_externals
[ ... ]
cd cime/scripts
./create_newcase --res ne16_g17 --compset FXHIST --case /glade/work/johnsonb/cases/f.e20.FXHIST.ne16_g17.002 --mach cheyenne --project $DARES_PROJECT --run-unsupported
cd /glade/work/johnsonb/cases/f.e20.FXHIST.ne16_g17.002
./case.setup

Error

ERROR:  Ionosphere is only available for FV dycore

Try again by checking out a newer tag.

cd /glade/work/johnsonb
rm -rf cesm2_3_0
git clone https://github.com/ESCOMP/CESM cesm2_3_0
cd cesm2_3_0
git checkout cesm2_3_beta09
./manage_externals/checkout_externals
[ ... ]
cd cime/scripts
./create_newcase --res ne16_g17 --compset FXHIST --case /glade/work/johnsonb/cases/f.e20.FXHIST.ne16_g17.003 --mach cheyenne --project $DARES_PROJECT --run-unsupported

Error

SyntaxError: invalid syntax

Try again by checking out a slightly older tag.

git clone https://github.com/ESCOMP/CESM cesm2_3_0
cd cesm2_3_0
git checkout cesm2_3_beta09
./manage_externals/checkout_externals
[ ... ]
cd cime/scripts
./create_newcase --res ne16_g17 --compset FXHIST --case /glade/work/johnsonb/cases/f.e20.FXHIST.ne16_g17.004 --mach cheyenne --project $DARES_PROJECT --run-unsupported
ERROR: Python 3, minor version 6 is required, you have 3.4
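# 'py37' is assumed to be an existing conda environment that provides Python >= 3.6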
source activate py37
cd /glade/work/johnsonb/cases/f.e20.FXHIST.ne16_g17.004
./case.setup
./case.build

Error

ERROR: Command /glade/work/johnsonb/cesm2_3_0/components/clm/bld/build-namelist failed rc=255 out= err=ERROR : CLM build-namelist::CLMBuildNamelist::add_default() : No default value found for flanduse_timeseries. Are defaults provided for this resolution and land mask?

Well, this is progress.

The next step is to triage which beta releases of cesm2_3_0 provide the most plausible path toward compilation.

Failed attempts until success with the ne30 grid

This might be simple to fix. According to this CGD BB post, it could merely be that there is a missing timeseries file that CLM needs.

Create a stock FHIST case and see how this is specified.

cd /glade/work/johnsonb/cesm2_1_3/cime/scripts
export CASEROOT='/glade/work/johnsonb/cases/f.e213.FHIST.f09_g17.001'
./create_newcase --res f09_g17 --compset FHIST --case $CASEROOT --mach cheyenne --project $DARES_PROJECT --run-unsupported
cd $CASEROOT
./case.setup
./preview_namelists
grep -Rl flanduse_timeseries ./
./Buildconf/clmconf/lnd_in
./Buildconf/clm.input_data_list
./CaseDocs/lnd_in
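
To see which dataset the working f09 configuration resolves to, the value can be read straight out of the generated namelist (same case directory as above):

grep flanduse_timeseries ./CaseDocs/lnd_in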

There is no lnd_in file for the ne16_g17 cases. I attempted to set up a case with the ne16_g17 grid and the FHIST compset (instead of FXHIST) and ran into the same error. However, it was possible to build the namelist for a case with the ne30_g17 grid and the FHIST compset.
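
As an aside, if a landuse timeseries dataset for the ne16 grid did exist, it could in principle be supplied through user_nl_clm in the case directory. The snippet below is only a hypothetical sketch with a placeholder path; no such file was located here.

# Hypothetical workaround (untested): point CLM at an existing landuse timeseries file.
# The path below is a placeholder, not a real dataset.
echo "flanduse_timeseries = '/path/to/landuse.timeseries_ne16np4_hist.nc'" >> user_nl_clm

Returning to the ne30_g17 FHIST case, which did build its namelist: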

cd /glade/work/johnsonb/cesm2_2_0/cime/scripts
export CASEROOT='/glade/work/johnsonb/cases/f.e220.FHIST.ne30_g17.001'
./create_newcase --res ne30_g17 --compset FHIST --case $CASEROOT --mach cheyenne --project $DARES_PROJECT --run-unsupported
cd $CASEROOT
./case.setup
./preview_namelists
grep -Rl flanduse_timeseries ./
./Buildconf/clmconf/lnd_in
./Buildconf/clm.input_data_list
./CaseDocs/lnd_in

Important

The key thing to realize here is that most of the spectral element dycore work is done on the ne30 grid (approximately 1° horizontal resolution), while most of the WACCM-X work is done on the f19 grid (approximately 2° horizontal resolution, the finite volume analog of the ne16 spectral element grid). The question now is: can a case be built using the ne30_g17 grid and the FXHIST compset?

cd /glade/work/johnsonb/git/cesm2_3_0_beta09/cime/scripts
export CASEROOT='/glade/work/johnsonb/cases/f.e230b9.FXHIST.ne30_g17.001'
./create_newcase --res ne30_g17 --compset FXHIST --case $CASEROOT --mach cheyenne --project $DARES_PROJECT --run-unsupported
cd $CASEROOT
./case.setup
./preview_namelists
grep -Rl flanduse_timeseries ./
./Buildconf/clmconf/lnd_in
./Buildconf/clm.input_data_list
./CaseDocs/lnd_in
./case.build
MODEL BUILD HAS FINISHED SUCCESSFULLY

Note

Hooray! However, I don’t know where the preprocessed source files are contained.

There is a list of the source files in /glade/scratch/johnsonb/f.e230b9.FXHIST.ne30_g17.001/bld/atm/obj/Srcfiles but I don’t know where the files actually are.
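
The list itself can be browsed directly to confirm which source files went into the build (using the Srcfiles path above):

head /glade/scratch/johnsonb/f.e230b9.FXHIST.ne30_g17.001/bld/atm/obj/Srcfiles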

For example, one of the files is cam_history.F90:

cd /glade/scratch/johnsonb/f.e230b9.FXHIST.ne30_g17.001
find . -name cam_history.F90
[ Returns nothing ]
cd /glade/work/johnsonb/cases/f.e230b9.FXHIST.ne30_g17.001
find . -name cam_history.F90
[ Returns nothing ]

Getting the compiler to save the preprocessed files

While there is a build_scripts directory at cime/src/build_scripts, each of the scripts in that subdirectory imports CIME.buildlib, which lives in cime/scripts/lib/CIME/buildlib.py.

buildlib.py

This Python script contains three functions: parse_input, build_cime_component_lib, and run_gmake. The last function actually invokes gmake to build a component executable. The most tractable path forward seems to be getting these functions to preprocess the source files and save the result.

Editing buildlib.py to print the commands within run_gmake:

vim /glade/work/johnsonb/git/cesm2_3_0_beta09/cime/scripts/lib/CIME/buildlib.py
102     print('BKJ inserted this: ', cmd)
103     stat, out, err = run_cmd(cmd, combine_output=True)

Appending -E to the compiler flags

Helen’s suggestion at the 2022-08-30 standup was to append -E as a compiler flag in /glade/work/johnsonb/git/cesm2_3_0_beta09/cime/config/cesm/machines/config_compilers.xml.

903 <compiler MACH="cheyenne" COMPILER="intel">
904   <CFLAGS>
905     <append> -qopt-report -xCORE_AVX2 -no-fma -E</append>
906   </CFLAGS>
907   <FFLAGS>
908     <append> -qopt-report -xCORE_AVX2 -no-fma -E</append>
909   </FFLAGS>
[ ... ]
917 </compiler>

I tried this with both cesm2_3_0_beta09 and cesm2_3_0_beta02, and it doesn’t work, most likely because -E stops compilation after the preprocessing stage and writes the output to stdout, so no object files are ever produced:

Error

ERROR: /glade/work/johnsonb/git/cesm2_3_0_beta02/cime/src/build_scripts/buildlib.gptl FAILED, cat /glade/scratch/johnsonb/f.e230b2.FXHIST.ne30_g17.001/bld/gptl.bldlog.220831-140809

Appending -save-temps to the compiler flags

This page within the iFort guide suggests that the -save-temps compile flag will save the preprocessed files.

vim /glade/work/johnsonb/git/cesm2_3_0_beta09/cime/config/cesm/machines/config_compilers.xml

903 <compiler MACH="cheyenne" COMPILER="intel">
904   <CFLAGS>
905     <append> -qopt-report -xCORE_AVX2 -no-fma -save-temps</append>
906   </CFLAGS>
907   <FFLAGS>
908     <append> -qopt-report -xCORE_AVX2 -no-fma -save-temps</append>
909   </FFLAGS>
[ ... ]
917 </compiler>
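
As a quick standalone illustration of what the flag does (hello.F90 is a hypothetical file, and this is not part of the CESM build): compiling a free-form Fortran file with the Intel compiler and -save-temps leaves the preprocessed source next to the object file with an .i90 extension.

# Hypothetical standalone example, outside of CESM
ifort -c -save-temps hello.F90
ls hello.i90 hello.o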

Then build the case:

cd /glade/work/johnsonb/git/cesm2_3_0_beta09/cime/scripts
export CASEROOT='/glade/work/johnsonb/cases/f.e230b9.FXHIST.ne30_g17.004'
./create_newcase --res ne30_g17 --compset FXHIST --case $CASEROOT --mach cheyenne --project $DARES_PROJECT --run-unsupported
cd $CASEROOT
./case.setup
./case.build
[ ... ]
MODEL BUILD HAS FINISHED SUCCESSFULLY
cd /glade/scratch/johnsonb/f.e230b9.FXHIST.ne30_g17.004/bld/atm/obj
ls *.i90
# This lists all of the preprocessed files.

References

[1] Liu, H.-L., and Coauthors, 2018: Development and Validation of the Whole Atmosphere Community Climate Model With Thermosphere and Ionosphere Extension (WACCM-X 2.0). Journal of Advances in Modeling Earth Systems, 10, 381–402, https://doi.org/10.1002/2017MS001232.

[2] Lauritzen, P. H., and Coauthors, 2018: NCAR Release of CAM-SE in CESM2.0: A Reformulation of the Spectral Element Dynamical Core in Dry-Mass Vertical Coordinates With Comprehensive Treatment of Condensates and Energy. Journal of Advances in Modeling Earth Systems, 10, 1537–1570, https://doi.org/10.1029/2017MS001257.