Running WRF-SFIRE with real data in the WRFx system
wrf-fire
A coupled weather-fire forecasting model built on top of Weather Research and Forecasting (WRF).
Requirements and environment
Install required libraries
- General requirements:
- C-shell
- Traditional UNIX utilities: zip, tar, make, etc.
- WRF-SFIRE requirements:
- Fortran and C compilers (Intel recommended)
- MPI libraries (same compiler)
- NetCDF libraries (same compiler)
- WPS requirements:
- zlib compression library (zlib)
- PNG reference library (libpng)
- JasPer compression library
- libtiff and geotiff libraries
See also https://www2.mmm.ucar.edu/wrf/users/prepare_for_compilation.html.
Set environment
Set the paths to the specific libraries installed:
setenv NETCDF /path/to/netcdf
setenv JASPERLIB /path/to/jasper/lib
setenv JASPERINC /path/to/jasper/include
setenv LIBTIFF /path/to/libtiff
setenv GEOTIFF /path/to/geotiff
setenv WRFIO_NCD_LARGE_FILE_SUPPORT 1
Then add all the library folders to your LD_LIBRARY_PATH:
setenv LD_LIBRARY_PATH /path/to/netcdf/lib:/path/to/jasper/lib:/path/to/libtiff/lib:/path/to/geotiff/lib:$LD_LIBRARY_PATH
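Before running the configure steps below, it can save time to verify that the environment actually points at existing installations. A minimal sketch (the variable names follow the setenv commands above; which ones are mandatory depends on your setup):

```python
import os

# Environment variables used by the WRF-SFIRE and WPS builds (from the
# setenv commands above); adjust the list to your own setup.
REQUIRED = ["NETCDF", "JASPERLIB", "JASPERINC", "LIBTIFF", "GEOTIFF"]

def missing_vars(env=os.environ):
    """Return required variables that are unset or do not point to a directory."""
    return [v for v in REQUIRED if not env.get(v) or not os.path.isdir(env[v])]

if __name__ == "__main__":
    bad = missing_vars()
    print("Missing or invalid:", ", ".join(bad) if bad else "none")
```

If any variable is reported, fix it before ./configure, since the configure scripts silently disable features whose libraries they cannot find.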
Installation
Clone github repositories
Clone wrf-fire github repository
git clone https://github.com/openwfm/wrf-fire
Configure WRF-SFIRE
cd wrf-fire/wrfv2_fire
./configure
Choose options 7 (Intel dmpar) and 1 (basic nesting), if available.
Compile WRF-SFIRE
Compile em_real
./compile em_real >& compile_em_real.log &
grep Error compile_em_real.log
If there are any compilation errors, compile em_fire:
./compile em_fire >& compile_em_fire.log &
grep Error compile_em_fire.log
If any of the previous steps fails:
./clean -a
./configure
Add -nostdinc at the end of the CPP flag in configure.wrf, and repeat the compilation. If this does not fix the compilation, look for issues in your environment.
Configure WPS
cd ../WPS
./configure
Choose option 2 (serial), if available.
Compile WPS
./compile >& compile_wps.log &
ls -l *.exe
The listing should contain geogrid.exe, metgrid.exe, and ungrib.exe. If it does not:
./clean -a
./configure
Add -nostdinc at the end of the CPP flag in configure.wps, and repeat the compilation. If this does not fix the compilation, look for issues in your environment.
Get static data
cd ../..
wget http://math.ucdenver.edu/~jmandel/tmp/WPS-GEOG.tbz
tar xvfj WPS-GEOG.tbz
wrfxpy
WRF-SFIRE forecasting and data assimilation in python using an HPC environment.
Requirements and environment
Install Anaconda distribution
Download and install the Python 3 Anaconda Python distribution for your platform. We recommend an installation into the user's home directory.
wget https://repo.continuum.io/archive/Anaconda3-2019.10-Linux-x86_64.sh
chmod +x Anaconda3-2019.10-Linux-x86_64.sh
./Anaconda3-2019.10-Linux-x86_64.sh
Install necessary packages
We recommend creating a dedicated environment. Install the prerequisites:
conda update -n base -c defaults conda
conda create -n wrfxpy python=3 netcdf4 pyproj paramiko dill scikit-learn scikit-image h5py psutil proj4 pytz
conda activate wrfxpy
conda install -c conda-forge simplekml pygrib f90nml pyhdf xmltodict basemap
pip install MesoPy python-cmr
Note that conda and pip are package managers available in the Anaconda Python distribution.
Set environment
If you created the wrfxpy environment as shown above
setenv PROJ_LIB "$HOME/anaconda3/envs/wrfxpy/share/proj"
If not
setenv PROJ_LIB "$HOME/anaconda3/share/proj"
Installation
Clone github repository
git clone https://github.com/openwfm/wrfxpy
General configuration
An etc/conf.json file must be created with the keys discussed below. A template file etc/conf.json.initial is provided as a starting point.
cd wrfxpy
cp etc/conf.json.initial etc/conf.json
Configure the system directories, WPS/WRF-SFIRE locations, and workspace locations by editing the following keys in etc/conf.json:
"wps_install_path": "/path/to/WPS"
"wrf_install_path": "/path/to/WRF"
"sys_install_path": "/path/to/wrfxpy"
"wps_geog_path": "/path/to/WPS-GEOG"
"wget": "/path/to/wget"
Note that all these paths result from the previous steps of this wiki, except the wget path, which needs to be specified. If no particular version of wget is preferred, use the system default found with:
which wget
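To catch typos early, the configuration can be sanity-checked before running any jobs. The sketch below is only an illustration (wrfxpy reads the file itself); the required-key list comes from the keys above:

```python
import json

# Keys wrfxpy expects in etc/conf.json, per the list above.
REQUIRED_KEYS = ["wps_install_path", "wrf_install_path", "sys_install_path",
                 "wps_geog_path", "wget"]

def check_conf(text):
    """Parse a conf.json string and return any missing required keys."""
    conf = json.loads(text)  # raises ValueError on malformed JSON, e.g. a missing quote
    return [k for k in REQUIRED_KEYS if k not in conf]

sample = """{
  "wps_install_path": "/path/to/WPS",
  "wrf_install_path": "/path/to/WRF",
  "sys_install_path": "/path/to/wrfxpy",
  "wps_geog_path": "/path/to/WPS-GEOG",
  "wget": "/usr/bin/wget"
}"""
print("Missing keys:", check_conf(sample))
```

Running it against your real etc/conf.json (e.g. `check_conf(open("etc/conf.json").read())`) also catches JSON syntax errors such as an unquoted path.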
Cluster configuration
Next, wrfxpy needs to know how jobs are submitted on your cluster. Create an entry for your cluster in etc/clusters.json; here we use speedy as an example:
{
  "speedy" : {
    "qsub_cmd" : "qsub",
    "qdel_cmd" : "qdel",
    "qstat_cmd" : "qstat",
    "qstat_arg" : "",
    "qsub_delimiter" : ".",
    "qsub_job_num_index" : 0,
    "qsub_script" : "etc/qsub/speedy.sub"
  }
}
The file etc/qsub/speedy.sub should then contain a submission script template that makes use of the following variables, supplied by wrfxpy based on the job configuration:
- %(nodes)d - the number of nodes requested
- %(ppn)d - the number of processors per node requested
- %(wall_time_hrs)d - the number of hours requested
- %(exec_path)s - the path to the wrf.exe that should be executed
- %(cwd)s - the job working directory
- %(task_id)s - a task id that can be used to identify the job
- %(np)d - the total number of processes requested, equals nodes x ppn
Note that not all keys need to be used, as shown in the speedy example:
#$ -S /bin/bash
#$ -N %(task_id)s
#$ -wd %(cwd)s
#$ -l h_rt=%(wall_time_hrs)d:00:00
#$ -pe mpich %(np)d
mpirun_rsh -np %(np)d -hostfile $TMPDIR/machines %(exec_path)s
The script template should be derived from a working submission script.
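The placeholders follow Python's %-style named formatting, which wrfxpy uses to expand the template from the job configuration. A sketch of that expansion with the speedy template (the job values here are made up for illustration):

```python
# The qsub script template with Python %-style named placeholders,
# as in etc/qsub/speedy.sub above.
template = """#$ -S /bin/bash
#$ -N %(task_id)s
#$ -wd %(cwd)s
#$ -l h_rt=%(wall_time_hrs)d:00:00
#$ -pe mpich %(np)d
mpirun_rsh -np %(np)d -hostfile $TMPDIR/machines %(exec_path)s"""

job = {                       # example values, not a real job configuration
    "task_id": "wfc-test",
    "cwd": "/scratch/wfc-test",
    "wall_time_hrs": 3,
    "np": 24,
    "exec_path": "/scratch/wfc-test/wrf/wrf.exe",
}

# wrfxpy performs essentially this substitution when writing the job script
script = template % job
print(script)
```

Note that unused keys in the job dictionary are simply ignored, which is why the template does not need to reference every variable.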
Note: wrfxpy already includes configurations for colibri, gross, kingspeak, and cheyenne.
Tokens configuration
When running wrfxpy, some data sources require a token created for the user in order to be accessed and downloaded. For instance, to run the fuel moisture model, one needs a token from a valid MesoWest user to download data automatically. Likewise, downloading satellite data requires a token from an Earthdata user. All of these can be specified by creating the file etc/tokens.json from the template etc/tokens.json.initial, which contains:
{ "mesowest" : "token-from-mesowest", "appkey" : "token-from-earthdata" }
So, if any of these capabilities are required, create a token on the corresponding page, run
cp etc/tokens.json.initial etc/tokens.json
and include your previously created token.
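A common mistake is copying the template and forgetting to replace one of the placeholder values, which later surfaces as an opaque authentication failure. A small sketch of such a check (illustrative only; wrfxpy itself does not ship this helper):

```python
import json

# Placeholder values from etc/tokens.json.initial; if any survive into
# etc/tokens.json, the corresponding downloads will fail to authenticate.
PLACEHOLDERS = {"token-from-mesowest", "token-from-earthdata"}

def unset_tokens(text):
    """Return keys in a tokens.json string whose values are still placeholders."""
    tokens = json.loads(text)
    return sorted(k for k, v in tokens.items() if v in PLACEHOLDERS)

example = '{ "mesowest" : "token-from-mesowest", "appkey" : "abc123" }'
print("Tokens still to fill in:", unset_tokens(example))
```

Run it against your real file with `unset_tokens(open("etc/tokens.json").read())` after editing.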
Test
Simple forecast
At this point, one should be able to run wrfxpy with a simple example:
conda activate wrfxpy ./simple_forecast.sh
Press Enter at each prompt to accept the default values, until the queuing-system prompt, where we select the cluster configured earlier (speedy in this example).
This will generate a job file under jobs/experiment.json (or whatever experiment name we chose).
Then, we can run our first forecast with
./forecast.sh jobs/experiment.json >& logs/experiment.log &
This should generate the experiment in the path specified in the etc/conf.json file, under a folder named after the experiment. The file logs/experiment.log should show the whole process step by step without any errors.
Possible errors
wrf.exe fails
Depending on the cluster, wrfxpy could fail when it tries to execute ./wrf.exe. This happens on systems that do not allow executing an MPI binary from the command line. We do not run real.exe by mpirun because mpirun may not be allowed on the head node. In that case, one needs to provide an installation of WRF-SFIRE compiled in serial mode in order to run real.exe serially. To do so, repeat the previous steps using the serial version of WRF-SFIRE:
cd ..
git clone https://github.com/openwfm/wrf-fire wrf-fire-serial
cd wrf-fire-serial/wrfv2_fire
./configure
Choose options 5 (Intel serial) and 0 (no nesting).
./compile em_real >& compile_em_real.log &
grep Error compile_em_real.log
Again, if any of the previous steps fails:
./clean -a
./configure
Add -nostdinc at the end of the CPP flag, and repeat the compilation. If this does not fix the compilation, look for issues in your environment.
Note: This time we only need to compile em_real, because we only need real.exe. However, if you want to compare serial and parallel runs for any reason, you can also compile em_fire the same way.
Then, we need to add this path in etc/conf.json file in wrfxpy, so
cd ../wrfxpy
and add to the etc/conf.json file the key
"wrf_serial_install_path": "/path/to/WRF/serial"
This should solve the problem; if not, check the log files from the previous compilations.
wrfxweb
Web-based visualization system for imagery generated by wrfxpy.
Account creation
Create a ~/.ssh directory (if you do not already have one):
mkdir ~/.ssh
cd ~/.ssh
Create an id_rsa key (if you do not already have one) by running
ssh-keygen
and following all the steps (you can accept the defaults, so just press Enter at each prompt).
Send an email to Jan Mandel (jan.mandel@gmail.com) asking for the creation of an account on the demo server, providing:
- Purpose of your request (including information about you)
- User id you would like (user_id)
- Short user id you would like (short_user_id)
- Public key (~/.ssh/id_rsa.pub file previously created)
After that, you will receive an answer from Jan, and you will be able to ssh into the demo server without a password (entering only the passphrase of your id_rsa key, if you set one).
Installation
Clone github repository
Clone wrfxweb in the demo server
ssh user_id@demo.openwfm.org
git clone https://github.com/openwfm/wrfxweb.git
Configuration
Change directory and copy the template to create a new etc/conf.json:
cd wrfxweb
cp etc/conf.json.template etc/conf.json
Configure the following key in etc/conf.json:
"url_root": "http://demo.openwfm.org/short_user_id"
The next steps are performed in the wrfxpy installation of your choice (set up in the previous section).
Configure the following keys in etc/conf.json of that wrfxpy installation:
"shuttle_ssh_key": "/path/to/id_rsa"
"shuttle_remote_user": "user_id"
"shuttle_remote_host": "demo.openwfm.org"
"shuttle_remote_root": "/path/to/remote/storage/directory"
"shuttle_lock_path": "/tmp/short_user_id"
The "shuttle_remote_root" key is usually defined as "/home/user_id/wrfxweb/fdds/simulations". So, everything should be ready to send post-processing simulations into the visualization server.
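Since all of the shuttle keys derive from the account details, they can be generated mechanically. A sketch under the usual conventions above (the account names here are hypothetical):

```python
# Build the shuttle-related etc/conf.json entries for a demo-server account,
# following the conventions described above ("shuttle_remote_root" under the
# account's wrfxweb tree, lock path under /tmp named by the short user id).
def shuttle_conf(user_id, short_user_id, key_path="~/.ssh/id_rsa"):
    """Return the shuttle_* keys for a given demo-server account."""
    return {
        "shuttle_ssh_key": key_path,
        "shuttle_remote_user": user_id,
        "shuttle_remote_host": "demo.openwfm.org",
        "shuttle_remote_root": "/home/%s/wrfxweb/fdds/simulations" % user_id,
        "shuttle_lock_path": "/tmp/%s" % short_user_id,
    }

# hypothetical account "jdoe" with short id "jd"
print(shuttle_conf("jdoe", "jd"))
```

Merge the returned dictionary into your existing etc/conf.json rather than replacing the file, so the WPS/WRF paths configured earlier are preserved.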
Test
Finally, one can repeat the previous test, but when simple forecast asks
Send variables to visualization server? [default=no]
answer yes.
Then, you should see the post-processed time steps of your simulation appearing in real time on http://demo.openwfm.org under your short_user_id.
wrfxctrl
A website that enables users to submit jobs to the wrfxpy framework for fire simulation.
Requirements and environment
Installation
Clone github repository
Clone wrfxctrl in your cluster
git clone https://github.com/openwfm/wrfxctrl.git
Configuration
Change directory and copy template to create new etc/conf.json
cd wrfxctrl
cp etc/conf-template.json etc/conf.json
Configure the following keys in etc/conf.json:
"host" : "127.1.2.3",
"port" : "5050",
"root" : "/short_user_id/",
"wrfxweb_url" : "http://demo.openwfm.org/short_user_id/",
"wrfxpy_path" : "/path/to/wrfxpy",
"jobs_path" : "/path/to/jobs",
"logs_path" : "/path/to/logs",
"sims_path" : "/path/to/sims"
Notes:
- Entries "host" and "port" are only examples; for security reasons, you should choose different values of your own.
- It is recommended to remove the entries "jobs_path", "logs_path", and "sims_path"; by default, they point to the wrfxctrl directories wrfxctrl/jobs, wrfxctrl/logs, and wrfxctrl/simulations.
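The default behavior described in the last note can be sketched as follows (illustrative only; wrfxctrl applies its own defaults internally):

```python
import os

# If "jobs_path", "logs_path", and "sims_path" are absent from etc/conf.json,
# the defaults fall inside the wrfxctrl tree, roughly like this.
DEFAULTS = {"jobs_path": "jobs", "logs_path": "logs", "sims_path": "simulations"}

def resolve_paths(conf, wrfxctrl_root):
    """Fill in missing path keys with the in-tree defaults."""
    out = dict(conf)
    for key, subdir in DEFAULTS.items():
        out.setdefault(key, os.path.join(wrfxctrl_root, subdir))
    return out

# hypothetical install location; keys already present in conf are kept as-is
print(resolve_paths({"host": "127.1.2.3", "port": "5050"}, "/opt/wrfxctrl"))
```

This is why removing the three keys is safe: explicit values always win, and missing ones fall back to the wrfxctrl directories.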