Modeling performed by the UW Tsunami Modeling Group, http://depts.washington.edu/ptha/index.html
In the description of files and directories below, the string LOC
is sometimes used to refer to one of these locations.
This README and all of the code mentioned below are archived in the tar file `2021_SJdF_L1_XL1_AKmaxWA_code.tar.gz`, available by request from the Washington Geological Survey.
The `input_files`, `topofiles`, and `dtopofiles` directories are archived in the tar file `2021_SJdF_L1_XL1_AKmaxWA_data.tar.gz`, available by request from the Washington Geological Survey; these large data files can also be obtained by other means described below.
The GeoClaw simulation results from running these codes are archived in the tar file `2021_SJdF_L1_XL1_AKmaxWA_results.tar.gz`, also available by request from the Washington Geological Survey.
`topo`
: scripts and notebooks to manipulate DEMs and topo files

`topo/topofiles`
: topofiles to use in job runs

`dtopo`
: scripts and notebooks to manipulate dtopo files

`dtopo/dtopofiles`
: dtopofiles to use in job runs

`info`
: tide gauge locations, other general info

`common_python`
: common Python code for this project

`common_fortran`
: common Fortran code for this project

`Runs`
: job runs for each location / event

Within `Runs` are directories for each fgmax location, and within each of these there are subdirectories for each event: `CSZ_XL1`, `CSZ_L1`, and `AK`.
The `topo/topofiles` directory contains topo files and also `.kml` and `.png` files that show the areas covered by each.
Note: These large files are not included in the tar file but can be found at http://depts.washington.edu/ptha/topo/, along with the scripts used to create some of them.
topo/topofiles/etopo1_-163_-122_38_63.asc
topo/topofiles/NWOuterCoast_2s_mhw_lapush.asc
topo/topofiles/SJdF_2sec_new.asc
topo/topofiles/PT_2sec_new.asc
topo/topofiles/SJdF13_crop_new.asc
topo/topofiles/PT13_crop_new.asc
The bash script `topo/topofiles/wget_topofiles.sh` can be used to download them all.
The following notebooks are also included:
`topo/make_topo_files.ipynb`
Used old 1/3" DEMs when first testing, before new DEMs were available. Not needed but included for reference.
`topo/merge_SJdF_DEMs.ipynb`
Used to coarsen and merge the new 1/9" DEMs to create the new 1/3" and 2" SJdF and PT topofiles listed above. The old SJdF and PT 1/3" DEMs were used in regions not covered by the new DEMs.
`topo/DiscoBay_topo.ipynb`
Used to explore topography near the terminus of Discovery Bay. Not needed but included for reference.
`topo/make_input_files_LOC.ipynb`
One for each fgmax region `LOC`. These notebooks create some of the input files described below. Rendered html versions are also included that show the expected output and some plots.
The `dtopo/dtopofiles` directory contains dtopo files used to specify the sea floor displacement that generates the tsunami.
Note: These large files are not included in the tar file but can be found at http://depts.washington.edu/ptha/dtopo/, along with the scripts used to create some of them.
dtopo/dtopofiles/CSZ_L1-extended-pmel.tt3
dtopo/dtopofiles/CSZ_XL1-extended-pmel.tt3
dtopo/dtopofiles/AKmaxWA.tt3
The bash script `dtopo/dtopofiles/wget_dtopofiles.sh` can be used to download them all.
Other large input files needed for GeoClaw runs are found in `topo/input_files`.
Note: These large files are not included in the tar file but can be found in `2021_SJdF_L1_XL1_AKmaxWA_data.tar.gz`, or generated with the Python code that is included.
The fgmax points for each region are specified by the files:

`fgmax_pts_LOC.data` (one for each fgmax region `LOC`)

Ruled rectangles used in guiding adaptive mesh refinement are also found in `topo/input_files`, in these files indicating refinement to 1/3" around fgmax regions:

`RuledRectangle_LOC.data` (one for each fgmax region `LOC`)

Additional ruled rectangles are used to indicate refinement to a variable level in a larger area around each fgmax region, to at least 6" resolution but allowing 2" resolution as the wave approached:

`RuledRectangle_6sec_2sec_LOC.data` (one for each fgmax region `LOC`)

The `input_files` directory also contains:

netCDF files containing the arrays indicating the `fgmax_pts` and DEM values. After the code is run, each file is used as the basis for a netCDF file that also contains the model results:

`input_LOC.nc` (one for each fgmax region `LOC`)

kmz files containing png images showing the fgmax points selected, separately for onshore and offshore points. These files are not needed for the GeoClaw simulations but are useful for visualizing the topography and fgmax points on Google Earth:

`fgmax_kmlfiles_LOC/fgmax_topo_LOC.kmz` (one for each fgmax region `LOC`)

These files are created by the notebooks:

`topo/make_input_files_LOC.ipynb` (one for each fgmax region `LOC`)

Several figures (and png files) are also created. These are made in the `topo` directory and then moved to `topo/input_files` for safekeeping.
Rendered `.html` versions of the notebooks are also included showing the expected output.
A directory for an fgmax region (location) has a name like `Runs/Chito_Sekiu`. This directory contains a subdirectory for each event, such as `CSZ_L1`.

Within an event directory, such as `Runs/Chito_Sekiu/CSZ_L1`, there are the following files:
`Makefile`
: Note that `EXE` might be set to `$(WA_EMD_2021_SHARED)/xgeoclaw` so that a common executable can be used without recompiling in each event directory. Make sure this executable exists; `make .exe` should create it if necessary. Make sure `$CLAW` points to Clawpack Version 5.8.0 to reproduce the environment used for these runs.
`setrun.py`
: This generally does not need to change, unless you want to modify how often checkpoint files are created, or something else not set in `params.py`.
`params.py`
: Most of the parameters that change from one location or event to another are collected in `params.py`. This is read in by `setrun`, and also by `setplot` and by the `process_fgmax` notebook or script. If you change something in `params.py`, remake the data via `make data` before running the code again, or use `make run` to do `make data` followed by `make output`.
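For orientation, a hypothetical excerpt of such a `params.py` might look like the following sketch. The parameter names match those described below, but every value shown is illustrative, not one used in the actual runs:

```python
# Hypothetical excerpt of a params.py; all values are illustrative only.
loc = 'Chito_Sekiu'
event = 'CSZ_L1'

# run control:
tfinal = 4*3600.                # final time in seconds
num_output_times = 48           # number of time frames written

# computational domain, shifted left 1/6" from multiples of 0.01 degree:
lower = [-129.0 - 1./21600., 45.0 - 1./21600.]
upper = [-122.0 - 1./21600., 50.0 - 1./21600.]
num_cells = [42, 30]            # coarsest (Level 1) grid

amr_levels_max = 6
refinement_ratios = [2, 3, 2, 6, 6]

wave_tolerance = 0.1            # meters
variable_sea_level = True       # True for CSZ events, False for AKmaxWA
force_dry_file_used = False     # False for all runs in this project
```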
Note: In addition to event directories for each earthquake source, there is also a directory `make_B0` for each location. This is set up to do a short run with no source in order to save values of the topography `B0` before co-seismic subsidence at each fgmax point, which is used in post-processing the fgmax results and producing plots and `.nc` files. The results are processed by `topo/make_B0.py` and stored in `topo/input_files/LOC_B0.txt` for each fgmax region `LOC`.
If copying these files to a new directory to create a different event at this location, the following parameters in `params.py` are particularly important to modify:
- `loc` and `event` should be set for this location / event.
- `fgmax_extent` is used in `setplot` for zoomed views.
- `restart`: set to `True` to do a restart and set `restart_file` to point to the desired file to start from. Note that the files `_output/fort.tck*` give the times of each checkpoint file, while the data needed to restart is in the corresponding `_output/fort.chk*` file.
- `tfinal` and `num_output_times` control when the job ends (in seconds) and how many time frame files are produced along the way. If doing a restart, adjust these to the new (later) final time and desired total number of output times (including those time frames already generated).
- `lower` and `upper` arrays specify the computational domain. Note that these are shifted left 1/6 arcsecond from values that should be chosen as multiples of 0.01 degree, in order to ensure that the finest computational grids at 1/3 arcsecond have cell centers aligned with the DEM grid points.
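To see why this shift works, here is a quick check with exact rational arithmetic. The edge value is made up, and the assumption is that DEM nodes lie at integer multiples of 1/3 arcsecond:

```python
from fractions import Fraction

third_arcsec = Fraction(1, 3 * 3600)   # 1/3 arcsecond = 1/10800 degree
shift = third_arcsec / 2               # the 1/6 arcsecond shift

# hypothetical domain edge chosen as a multiple of 0.01 degree:
x_edge = Fraction(-12470, 100)         # -124.70 degrees
lower = x_edge - shift                 # edge shifted left by 1/6 arcsecond

# cell centers of a finest-level (1/3") grid starting at `lower`:
centers = [lower + (i + Fraction(1, 2)) * third_arcsec for i in range(5)]

# each center lands exactly on a multiple of 1/3", i.e. on a DEM node:
print(all(c % third_arcsec == 0 for c in centers))   # True
```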
- `num_cells` sets the number of grid cells at the coarsest level (Level 1) over the computational domain. Usually far-field events (like Alaska) have a larger computational domain and a coarser Level 1 grid than near-field events.
- `amr_levels_max` and `refinement_ratios` control the maximum number of AMR levels and the refinement factor from each level to the next.
- `topofiles` is a list of the topo files used for this run.
- `dtopofiles` is a list containing a single entry for the dtopo file used for this run (the earthquake source). It should be consistent with how `event` is set and with the name of the directory you are working in.
- `regions` is a list of AMR flagging regions in the old style. For this project this is the empty list; all regions were specified with the new `flagregions` style.
- `flagregions` is a list of AMR flagging regions in the new style, i.e. each is an object of class `clawpack.amrclaw.data.FlagRegion`. Note that for flagregions with `flagregion.spatial_region_type = 2`, `flagregion.spatial_region_file` points to a file that specifies a `RuledRectangle` that must be created to cover the desired spatial region.
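As a hedged sketch (attribute names follow the Clawpack v5.8.0 documentation; the name, levels, and times are illustrative, not the project's actual settings), such a flagregion might be set up in `setrun.py` as:

```python
import os
from clawpack.amrclaw.data import FlagRegion

# Illustrative flagregion forcing refinement over a RuledRectangle;
# assumes this code runs inside setrun(), where rundata is available.
flagregion = FlagRegion(num_dim=2)
flagregion.name = 'Region_fgmax_LOC'
flagregion.minlevel = 6                 # illustrative level numbers
flagregion.maxlevel = 6
flagregion.t1 = 0.
flagregion.t2 = 1e9
flagregion.spatial_region_type = 2      # region given by a RuledRectangle file
flagregion.spatial_region_file = os.path.abspath(
    'topo/input_files/RuledRectangle_LOC.data')
rundata.flagregiondata.flagregions.append(flagregion)
```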
- `wave_tolerance`: the tolerance used to determine whether to flag a cell for refinement, if it is not forced to be refined or restricted from further refinement by one of the regions or flagregions. A cell is flagged for refinement if `abs(eta) > wave_tolerance`, where `eta = h + B` is the surface elevation computed from water depth `h` and GeoClaw topography `B`. This may need to be adjusted, e.g. made smaller if the tsunami is small in the region of interest.
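The flagging criterion itself is simple; a minimal illustration with made-up values:

```python
def flag_cell(h, B, wave_tolerance):
    """Flag a cell for refinement if its surface deviates from sea level."""
    eta = h + B        # surface elevation from water depth h and topography B
    return abs(eta) > wave_tolerance

# a deep-ocean cell carrying a 0.3 m wave is flagged; an undisturbed one is not:
print(flag_cell(h=4000.3, B=-4000.0, wave_tolerance=0.1))   # True
print(flag_cell(h=4000.0, B=-4000.0, wave_tolerance=0.1))   # False
```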
- `fgmax_grids`: a list of objects (just one for each run) of class `clawpack.geoclaw.fgmax_tools.FGmaxGrid` describing the fgmax parameters for this run. `fg.xy_fname` should be set to the absolute path of a file in the form of a topotype 3 topography file, with values 1/0 indicating whether a point on this rectangular grid is an fgmax point or not. This file is created by the `make_input_files` notebook or script and is assumed to be in `topo/input_files`. Other fgmax parameters may need to be adjusted:
  - `fg.min_level_check` should generally be set to the finest level. The values at fgmax points are only monitored on this level (or finer).
  - `fg.tstart_max`: the time to start monitoring fgmax values, which should generally be a bit after the finest level is first created in the fgmax region (e.g. set to `t_start + 20`). Note that cells are flagged for refinement at the next regrid time after `t_start`, but this may not be exactly at time `t_start`.
  - `fg.dt_check`: how often to update the running maxima stored at the fgmax points. This was set to 3 seconds.
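Put together, the fgmax setup in `setrun.py` might look like this sketch. Attribute names follow `clawpack.geoclaw.fgmax_tools` in v5.8.0; the level number, file path, and `t_start` are placeholders, not the project's actual values:

```python
from clawpack.geoclaw import fgmax_tools

# Illustrative fgmax grid; assumes this runs inside setrun(), where rundata
# is available and t_start is as discussed in the text above.
fg = fgmax_tools.FGmaxGrid()
fg.point_style = 4             # fgmax points given by a topotype-3 mask file
fg.xy_fname = '/full/path/to/topo/input_files/fgmax_pts_LOC.data'
fg.min_level_check = 6         # monitor only on the finest level (or finer)
fg.tstart_max = t_start + 20.  # start a bit after the finest level appears
fg.tend_max = 1e10             # monitor to the end of the run
fg.dt_check = 3.               # update running maxima every 3 seconds
rundata.fgmax_data.fgmax_grids.append(fg)
```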
- `gauges`: a list of gauges to monitor, each in the form of a list `[gaugeno, x, y, t1, t2]`. Gauges are read in from `info/tide_gauge_locations.csv`, and those found to lie in the region specified by `fgmax_extent` are automatically included.
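The selection step could be sketched as follows. This is a hypothetical illustration: the column names of `info/tide_gauge_locations.csv` and the sample data are assumptions, not taken from the project files:

```python
import csv
import io

def gauges_in_extent(csv_text, fgmax_extent, t1=0., t2=1e9):
    """Return [gaugeno, x, y, t1, t2] lists for gauges inside
    fgmax_extent = [x1, x2, y1, y2].  Column names are assumed."""
    x1, x2, y1, y2 = fgmax_extent
    gauges = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        x, y = float(row['longitude']), float(row['latitude'])
        if x1 <= x <= x2 and y1 <= y <= y2:
            gauges.append([int(row['gaugeno']), x, y, t1, t2])
    return gauges

# made-up gauge list: only the first gauge lies inside the extent
sample = """gaugeno,longitude,latitude
901,-124.30,48.25
902,-123.10,47.00
"""
print(gauges_in_extent(sample, [-124.7, -124.2, 48.2, 48.4]))
```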
- `variable_sea_level`: set to `True` if the earthquake deformation affects the study location (i.e. for the CSZ events but not for the AKmaxWA event). This controls how the finest grids are initialized, filling with water only up to sea level as adjusted by the dtopo file.
- `force_dry_file_used`: set to `False` for all runs in this project, since there were no areas with land below MHW that should be initialized as dry because they are behind dikes or levees.
No custom Fortran code was used for this project, only the standard Clawpack v5.8.0 code.
Some other parameters in `params.py` are only used in post-processing; see comments in these files in each `Runs/LOC/EVENT` directory.
Other postprocessing scripts are found in the directory `common_python`.
After running GeoClaw in each directory `Runs/LOC/EVENT`, the results are in `Runs/LOC/EVENT/_output` by default. This directory contains time frame results at the times listed in the `fort.t*` files, and also the fgmax results, in the file:
`_output/fgmax0001.txt`
This file lists only the fgmax points, one per line. (Note this is a new format, different from that used in GeoClaw before v5.7.0.)
The files can be post-processed using the following files, which are the same for all locations/events and found in `common_python`:
`setplot_common.py`
: For making plots of the time frames after running the code via `make plots` (or interactively with `Iplotclaw`).
`process_fgmax.py`
: After running the code, this reads in the fgmax results (from the specified `outdir`) and produces plots of the maximum flow depth and speed. It also creates a netCDF file with all of the results as arrays on the topo grid used in creating the input netCDF file described above (copying e.g. the `input_Chito_Sekiu.nc` file and adding the output). The resulting file has a name like `Chito_Sekiu_results.nc`. `process_fgmax` also creates kml files for displaying the fgmax results on Google Earth.
The files created by `process_fgmax` (if `save_figs == True`) will be copied into `_plots/fgmax_plots/` so that they will be linked properly from `_plots/_PlotIndex.html` if you also run `make plots` to create the time frame plots and index in `_plots`. You can run `make plots` either before or after `process_fgmax`.
`process_gauges.py` produces multiple plots from each gauge result and also re-samples from the original output times to equally spaced times (5-second increments), storing the results as both a netCDF file and a csv file. It also creates a tar file of all the resulting files.
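The re-sampling step can be sketched with `numpy.interp`. The linear interpolation and the sample series are assumptions for illustration, not necessarily what `process_gauges.py` does internally:

```python
import numpy as np

def resample_gauge(t, eta, dt=5.0):
    """Interpolate a gauge time series onto uniform times with spacing dt."""
    t_uniform = np.arange(t[0], t[-1] + dt/2, dt)
    return t_uniform, np.interp(t_uniform, t, eta)

# illustrative, unevenly spaced gauge output:
t = np.array([0., 2., 7., 11., 20.])
eta = np.array([0., 0.2, 1.0, 0.6, 0.1])
t5, eta5 = resample_gauge(t, eta)
print(t5)   # [ 0.  5. 10. 15. 20.]
```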
`nc_tools.py` contains functions for generating the `.nc` files for the fgmax results.
`fix_gauge_restart_problems.py` was used to eliminate overlapping times when a run terminated due to time limits and had to be restarted. In this case the gauge output was repeated from the last checkpoint time to the time the original run failed.
`collect_gauges.py`, `collect_plots.py`, `collect_results.py`: these scripts were used to collect results from all loc/event directories for file transfer and archiving.
Some of the computations were performed using computing time provided by the CU-CSDMS High-Performance Computing Cluster. The directory `cu_cluster` contains slurm scripts for compiling, running the code, making plots, and executing the post-processing scripts.
- Modify the `Makefile` to set `SCRATCH` properly.
- Submit a job to run GeoClaw:
  `sbatch path-to/cu_cluster/run_geoclaw24.sbatch # using 24 threads`
- If doing a restart, use the full path to the scratch directory for the checkpoint file.
`_plots` should now exist with time frame plots, along with the results of the post-processing. If not, one of the other scripts can be used:
cu_cluster/make_plots.sbatch
cu_cluster/process_fgmax.sbatch
cu_cluster/process_gauges.sbatch
We used a bash script like `cu_cluster/rsync_plots_to_homer.sh`, executed from the event directory (after fixing the path and identifier), to transfer plots and output files to the UW web server for displaying plots and facilitating downloading of output files.