COMPUTER PROGRAMS IN SEISMOLOGY - TUTORIALS
Basic Seismology
Synthetics
- Synthetics for a Spherical Earth FEB 06, 2008
- Using Synthetics to Evaluate Station Performance MAR 12, 2008
- Use of hudson96 for high-frequency teleseismic P-wave synthetics MAR 26, 2008
- Teleseismic and Regional Green's Functions NOV 19, 2008
- Comparison of Randall and Herrmann Green's functions
- Synthetics for exploration November 1, 2011
- Surface-waves in laterally varying media Corrected July 7, 2017, with updated scripts
- Further tests of Surface-waves in laterally varying media - horizontal gradient NEW May 2, 2023
- Acoustic waves from source in crust and atmosphere March 25, 2012
- Finite Fault Synthetics - Teleseismic P and S October 30, 2012
- Static Deformation Finite Fault August 7, 2013
- Unified finite fault tutorial to produce teleseismic P, regional waveforms and regional static deformation January 5, 2016
- Strain and stress synthetics using modal summation and wavenumber integration NEW June 8, 2021 (updated August 31, 2021)
- Comparison of Green's functions from Cerveny and wavenumber integration codes NEW May 2, 2023
Surface Waves
Receiver Functions
Earth Structure
Sources
Seismic Data and Instrumentation
Velocity Models
Simulations
The purpose of the following is to demonstrate how the tools of
Computer Programs in Seismology can be used to simulate observed
data, so that the problems inherent in real data sets are better
appreciated.
Data Downloads
Introduction
Computer Programs in Seismology is a documented package of
well-tested algorithms for certain types of seismological study.
However, possession of the programs is just the first step in
research. Equally important is the use of these programs to prepare
and analyze data sets, which is difficult for any researcher until
one becomes comfortable with data analysis.
The purpose of these Tutorials is to provide proven procedures for
data analysis in areas of common interest. I hope to continue adding
new topics to this list.
Some of the Tutorials are tests with synthetic data, which serve to
highlight the limitations of any analysis technique; these also
serve to validate the compiled programs on a given computer. Other
tutorials provide real data sets for consideration.
Considerable use is made of shell scripts (any modern book on LINUX
will have a section on the BASH shell). The BASH shell is a
programming language in its own right and is the default shell on
CYGWIN, LINUX and MacOS-X. A simple example of a bash script, with
comments, is the following.
Assume that we have a file named DOIT.
Also assume that this is an executable shell script, e.g.,
rbh> ls -l DOIT
-rwxr-xr-x 1 rbh rbh 3619 2007-01-04 08:16 DOIT [The x means that this is an executable]
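If the x permission is missing, it can be set with the standard chmod command, e.g.,
rbh> chmod +x DOIT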
Assume that the contents of this file are the following:
#!/bin/sh
#####
# A line starting with an initial # is a comment.
# I use comments a lot to document scripts so that I know what I did
# when I look at them later
#####
#####
# This script will get ray parameters for a given source depth and epicentral distance
# in degrees from the program udtdd
#####
#####
# define event depth as a shell variable - note no spaces are permitted
#####
EVDP=100
#####
# loop over epicentral distances
#####
for GCARC in 30 40 50 60 70 80 90
do
    RAYP=`udtdd -GCARC ${GCARC} -EVDP ${EVDP}`
    #####
    # the desired ray parameter is placed in the shell variable RAYP
    # we can use this value later. Note that the command syntax is -GCARC distance_degrees
    # The ${GCARC} places the value from the for loop into the proper position
    #####
    #####
    # now list the results using the echo command
    #####
    echo ${GCARC} ${EVDP} ${RAYP}
done
I can now run the script as follows:
rbh> DOIT
30 100 0.0787601843
40 100 0.0742743611
50 100 0.0685746446
60 100 0.0613518916
70 100 0.0545108728
80 100 0.0479682162
90 100 0.0420032926
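The script output can also be captured and post-processed with other
standard tools. The following sketch (an extension, not part of the
original tutorial) assumes that the three columns are distance
(degrees), depth (km) and ray parameter (s/km), and uses awk to
multiply the third column by 111.195 km/degree so that the ray
parameter is expressed in s/degree:
rbh> DOIT > rayp.txt
rbh> awk '{ printf "%s %s %.4f\n", $1, $2, $3 * 111.195 }' rayp.txt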
Use shell scripts, and put comments in them.
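One further note on the script above: the backtick form of command
substitution has an equivalent POSIX form, $( ... ), which nests more
cleanly; either form works here, e.g.,
RAYP=$(udtdd -GCARC ${GCARC} -EVDP ${EVDP})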
Last changed December 31, 2017