General information
SPS stands for Surface Prediction System. It was developed by ECCC. It executes only the surface routines from GEM, driving them with the following external driving data:
- Surface pressure (P0)
- Surface temperature (TT) (at 2 m but preferably on the lowest predictive model level)
- Surface humidity (specific (HU) or relative (HR)) (at 2 m but preferably on the lowest predictive model level)
- Surface u- and v-winds (UU and VV) (at 10 m but preferably on the lowest predictive model level)
- Surface downward shortwave radiation (N4)
- Surface downward longwave radiation (AD)
Like GEM, it also needs the following lower boundary conditions:
- Sea surface temperature in Kelvin (TM)
- Sea ice fraction (LG)
Unlike GEM, SPS does not offer the possibility to write and read restart files. Therefore, at the end of each job, all fields needed to start another simulation need to be written out. The next job will then use the output of the previous job as initial conditions. See below under 'outcfg.out' which fields these are.
SPS 6.1.1 uses the same physics version as GEM 5.1.1.
I suggest you create yourself a directory under your home for your source code, scripts and config files, for example:
mkdir -p ~/SPS/v_6.1.1
cd ~/SPS/v_6.1.1
mkdir Abs Configs Scripts
Scripts
The name of the scripts directory:
~/SPS/v_6.1.1/Scripts
is mandatory!
In the above directory you need to have the following scripts:
SPS_launch
SPS_prepare_job.sh
SPS_link_pilots.sh
set_RUNMOD.sh
r.run_in_parallel
sps.sh
o.set_array.dot
task_setup.dot
task_setup.py
task_setup_cachegen
You can copy the scripts from the following directory:
Executable
The following is a recipe to create your own executable:
a) Create directory for executable
To create an executable, first create yourself a directory for the source code under your home, preferably under ~/SPS/v_6.1.1/Abs, and change into it, for example:
mkdir -p ~/SPS/v_6.1.1/Abs/My_Abs
cd ~/SPS/v_6.1.1/Abs/My_Abs
b) Get the basename of abs-directory
Get the basename (last part of your directory path) of your abs-directory for later use:
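One way to set it, assuming you are still inside your abs-directory:
base_dir=$(basename $(pwd))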
'base_dir' should now be set to 'My_Abs'. You can verify it with:
echo $base_dir
c) Clone the SPS Git repository you want to start from, for example:
Original version from RPN:
git clone ~winger/SPS/v_6.1.1/Abs/SPS_UQAM_development/sps
Modified UQAM version including FLake & CLASS:
git clone ~winger/SPS/v_6.1.1/Abs/SPS_UQAM_development_FLake_CLASS/sps
Then change into the 'sps' directory that came with the clone:
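cd sps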
d) Create working directories
So that the object files, libraries and executables will not be under your home, create links for the directories 'build' and 'work'. Pick a place under your data space where you want to keep these files, something like:
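work_space=/my/data/space   # hypothetical path; replace with your own data space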
Create the directories and the links (using ${base_dir} from above):
mkdir -p ${work_space}/${base_dir}/build
mkdir -p ${work_space}/${base_dir}/work
ln -s ${work_space}/${base_dir}/build
ln -s ${work_space}/${base_dir}/work
e) Acquire a compiler
Only needed once per window/terminal. Same as for GEM 5.1.1.
The alias 'sps611' should work as well. But better use it only once per window, because each time it is used it will add a directory to your $PATH.
f) Set the variable 'abs_dir'
Set the variable 'abs_dir' to the name of your executable directory, including the 'sps' directory (needed in step h):
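For example, if you are still inside the cloned 'sps' directory:
abs_dir=$(pwd)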
g) Change into the 'build' directory:
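cd build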
h) Create make files and dependencies
Only needed once at the beginning, and whenever you add any new routines, or add include files or module "use" lines to an existing routine.
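Assuming the build system is the same CMake-based one as in GEM 5.1.1 (an assumption; check your SPS build instructions if it differs), this would be:
cmake ${abs_dir}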
i) Compile and create executable
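make mainspsdm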
You can also use n cores in parallel by adding '-jn'; for example, to use 4 cores:
make -j4 mainspsdm
Config files
You need the following config files:
configexp.cfg
sps.cfg
outcfg.out
dyn_input_table
physics_input_table
CLASS_input_table # Only needed when running CLASS
rpnstd_units.dict
rpnstd_units.rdict
sps.dict
configexp.cfg
Example file with variable explanations:
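# Wall time per job in seconds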
SPS_walltime=86400
# Topology: MPI(x) x MPI(y) x OpenMP; same as in GEM but, unlike for GEM, OpenMP is a little more efficient.
# SPS_topo=1x1x1
SPS_topo=4x2x2
# Experiment name; unlike in GEM, the YYYYMM will get added to the full name automatically, so there is no need to append _YYYYMM.
SPS_exp=SPS_GLHBQC_0.0225d_1350x1080
# Base directory of executable; same as in GEM (but different executable)
SPS_bin=/home/winger/SPS/v_6.1.1/Abs/SPS_UQAM_development_FLake_CLASS/sps
# Start date of simulation; same as in GEM
SPS_startdate="2008 01 01 00"
# End date of simulation; same as in GEM
SPS_enddate="2008 02 02 00"
# Number of months to run per job; max 12 (recommended for speed); same as in GEM
SPS_interval=12
# Ozone climatology; same file as for GEM
SPS_CONST=${AFSISIO}/datafiles/constants/ozoclim_phy45
# General climatology file; same file as for GEM
SPS_climato=${MODEL_DATA}/Climatology/MHEEP/clim_gemdm320_1080x540_v2_de_francois_withiceline_i8_glorys1v1.0_corr_EMIB
# Geophysical field file directory; almost the same file as for GEM but
# - 'ME' needs to be renamed to 'MF' and
# - 'ME' of the driving data should be added and called 'MFBR'
SPS_geophy=${MODEL_DATA}/Example_data/SPS_example_data/Geophys_GLHBQC_0.0225d_1350x1080_USGS
# Initial condition file; similar to the file for GEM, but instead of atmospheric fields the SPS upper driving data are needed
SPS_anal=${MODEL_DATA}/Example_data/SPS_example_data/anal_NAM11_CLASS_20080101
# Directory containing upper SPS driving data. There needs to be one file per month
SPS_pil_dir=${MODEL_DATA}/Example_data/SPS_example_data/INREP
# Directory containing SST & SIC, same as for GEM
SPS_SST_dir=${MODEL_DATA}/SST_SeaIce_degK/ERA5
# Temporary base input directory; same as for GEM
SPS_inrep=/dev/shm/${USER}/SPS_inrep
# Temporary base output directory; same as for GEM
SPS_outrep=/BIG3/winger/EXECDIR/stage
# Final archive directory
SPS_archdir=/BIG3/winger/SPS/Output/${SPS_exp}
outcfg.out
SPS does not have the possibility to write and continue from restart files. Therefore, at the end of each job, all fields needed to start another simulation need to be written out; the next job will then use the output of the previous job as initial conditions. To make outputting these fields at the end of a job easier, steps=1 and steps=2 are now hard coded:
steps=2 will automatically get set to the last timestep of the current job.
ISBA
The following is a list of mandatory initial condition fields for ISBA:
SD : VD=snow depth
DN : VD=relative snow density
I0 : VD=surface and soil temperatures
I1 : VD=soil volumetric water contents
I2 : VD=soil volumetric ice contents
I3 : VD=water retained on the vegetation
I4 : VD=water in the snow pack
I5 : VD=snow mass
I6 : VD=albedo of snow
I7 : VD=sea ice temperature
I8 : VD=sea ice thickness
I9 : VD=glacier temperature
FLake
The following is a list of mandatory initial condition fields for FLake:
CLASS
When running CLASS, in addition to the ISBA fields above one also needs to output the following fields:
sps.cfg
Driving with fields on staggered lowest predictive model levels
1) Lvl_typ_S = 'HS' (HS = hybrid staggered; HU = hybrid uniform, i.e. all variables on the same level)
2) Lvl_list = MOMENTUM LEVEL ONLY; SPS will deduce the thermo level by itself
3) ip1a=-1
ip1at=-1
For (3): by setting these to -1, you let SPS figure out the ip1 values using the standard encoding. (Note that I do not know what the "standard" is; I am under the impression the ip1 encoding has changed with various GEM versions, but perhaps it has remained constant for HS levels.)
If the ip1 encoding is not standard, you can override the ip1 values using these two keys. In all cases, standard or not, you can choose to specify them.
The actual values of the lowest prognostic level and the ip1s depend on the configuration of the forcing model; they will vary from one GEM system to another.
Example:
Lvl_typ_S = 'HS'
Lvl_ptop_8 = 1000.0
Lvl_list = 0.997502
Lvl_rcoef = 3., 15.
@sps_cfgs
ip1a = -1
Driving with fields on diagnostic levels
Example:
Lvl_typ_S = 'HU'
Lvl_ptop_8 = 1000.0
Lvl_list = -1
@sps_cfgs
zta = 2.0
zua = 10.0
ip1a = 93423264
Launching an SPS simulation
1) First you need to set the SPS environment
You can do this with the alias:
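sps611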
But better use it only once per window(!) because each time this alias is executed it adds a directory to your $PATH.
2) Go into the directory containing your config files:
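For example (hypothetical directory name):
cd ~/SPS/v_6.1.1/Configs/My_Config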
3) Launch the simulation
To launch the simulation you just have to execute the command:
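SPS_launch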
4) Check progress
You can check the progress of an SPS simulation the same way as for GEM by executing 'u.tail' on the model listing under ~/listings/${TRUE_HOST}:
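For example (hypothetical listing name; adjust the experiment name and date to your run):
u.tail ~/listings/${TRUE_HOST}/SPS_GLHBQC_0.0225d_1350x1080_200801*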
Note that model listings will not - yet - get archived at the end of a job, as is done in GEM!
Restart / Continue a simulation
The easiest way to continue a simulation is probably by using 'SPS_launch' and specifying the start of the job. For example:
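SPS_launch "YYYY MM DD hh"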
Where "YYYY MM DD hh" is the start date and hour of the current job.
Or, if the executable crashed and the input directory structures are still there, you can just resubmit the model job. You can find the command to resubmit it at the end of the listing of the previous job. Search for 'SPS_model_job.sh'. You should find something like:
soumet /home/winger/SPS/v_6.1.1/Configs/Test/GLHBQC_0.0225d_135x108_monthy/SPS_model_job.sh -listing /home/winger/listings/alea -jn SPS_GLHBQC_0.0225d_monthly_v05_135x108_200901-200902 -t 864000 -cpus 4x4x1 -mpi