Don’t Study Large Earthquakes with Mathematica’s EarthquakeData

Mathematica contains information on earthquakes that is accessible using the built-in EarthquakeData function. The function includes useful search approaches that allow users to select earthquakes by magnitude, depth, date, etc. The geographic selection tools are particularly useful – you can select earthquakes by geographic regions, polygons, distance from a reference location, etc.

Unfortunately, something is wrong with the data. The earthquake information is not what most earthquake seismologists would consider accurate. I have not combed through the entire data set, but the Wolfram data seem to have a problem with at least earthquake magnitudes. Although magnitude is the most commonly quoted measure of earthquake size, it is not an easy quantity for nonspecialists to understand and package for general use.

The development of earthquake magnitude scales has a complex history because observational seismology developed before a firm theoretical basis for faulting and earthquakes was established. Many magnitudes may be associated with a large earthquake, and not all are of equal value or even accurately reflect the size of the earthquake. Seismologists continue to measure and use a suite of magnitudes because that is how we can compare old and new earthquakes.

The data for large earthquakes returned by the EarthquakeData command are inaccurate. I emailed Wolfram several years ago to describe problems I saw on Wolfram Alpha, and they responded, but serious, obvious problems remain. This is not a big deal for me: I use Mathematica to analyze and display earthquake data often, but I never use Mathematica’s data. Still, two things concern me: (1) do all the data included with Mathematica suffer from such substantial errors (I don’t think so, but I don’t know the details)? (2) I worry about students exploring earthquakes on their own who perform calculations with Mathematica’s data and who are then disappointed and discouraged that the effort was wasted because the input was flawed.

In the following, I evaluate some of the EarthquakeData information for large earthquakes. Problems appear quickly.

A simple search for a list of great earthquakes

Great earthquakes are earthquakes with magnitudes greater than or equal to 8.0. Mathematica makes searching for earthquakes with a minimum magnitude convenient. The search returns a list of earthquake entities that contain properties related to each earthquake in the list. Since we know much more about recent earthquakes than historical events, some properties for the older events may be missing.

Here is an example query to get a list of all great earthquakes in the Wolfram database.

(* Select earthquakes with magnitudes greater than or equal to 8.0 *)
eqs = EarthquakeData[All, 8.0];
Print["Returned " <> ToString[Length[eqs]] <> " earthquake entities."]


Manipulating Elevation/Bathymetry Profiles with Mathematica

Introduction

One of the earliest posts that I made on this site described how to use Mathematica to create elevation/bathymetry profiles for the Earth at a specified latitude. I use those profiles to help students in my 100-level non-science-majors course on earthquakes and society become familiar with Earth’s surface topography and its relationship to plate tectonic features. We are now studying tsunami, and one of the activities that we complete is an estimate of the travel time from an earthquake in the eastern Atlantic to a set of cities along the east coast of the United States. The source is located near the assumed source region of the 1755 Lisbon Earthquake.

We compute the rough speed of the tsunami, \(c\), using the simple formula \(c = \sqrt{d \cdot g}\), where \(d\) is the mean depth of the water along the great-circle arc connecting the source region and the point of interest, and \(g\) is the acceleration of gravity (about \(10~m/s^2\)). In large classes I provide a map with the depth of the ocean contoured, including great-circle arcs connecting the source and the locations of interest, and students estimate the mean depth along each path. They generally do OK, but estimating the mean depth is a challenge (a worthwhile one). In my smaller courses, which are taught in computer laboratories, we can do the analysis using Mathematica notebooks. Students can then plot the great-circle arcs, look at the profile, estimate the mean depth by eye, and then compute it.
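For a quick sense of the numbers, here is a minimal Python sketch of the arithmetic (my addition; the depth and distance are assumed values for illustration, not the classroom notebook):

import math

g = 10.0        # gravitational acceleration, m/s^2
d = 4000.0      # assumed mean water depth along the path, m
path = 5.0e6    # assumed source-to-city distance, m

c = math.sqrt(d * g)          # tsunami speed, m/s
hours = path / c / 3600.0     # travel time along the path, hours
print("speed = %.0f m/s, travel time = %.1f hours" % (c, hours))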

The Code

Here’s the code listing to extract the elevation/bathymetry along an arc connecting points 0 and 1, specified by their latitudes and longitudes (e.g., lat1, lon1).

(* city *)
lat0 = 40.66;
lon0 = -73.94;
(* eq *)
lat1 = 36;
lon1 = -10;
(* get the great-circle arc and the distance along the arc *)
gcarc = GeoPath[{{lat0, lon0}, {lat1, lon1}}, "GreatCircle"];
gcarcDistance = GeoDistance[{{lat0, lon0}, {lat1, lon1}}, UnitSystem -> "Metric"]
(**)
p0 = Framed[GeoGraphics[{Red, AbsoluteThickness[4], gcarc}, 
     GeoRange -> {{0, 70}, {-120, 20}},
     GeoProjection -> "CylindricalEquidistant", GeoZoomLevel -> 6, 
     GeoBackground -> "ReliefMap", ImageSize -> Full]];
(* extract the elevations along the great-circle arc *)
profile = GeoElevationData[gcarc, Automatic, "GeoPosition", GeoZoomLevel -> 4];
(* extract the pts = {lat,lon,elevation} from the results *)
pts = profile[[1]][[1]];
depths = #[[3]] & /@ pts;
distances =  QuantityMagnitude[GeoDistance[{pts[[1]][[1;;2]], #[[1;;2]]}, 
             UnitSystem -> "Metric"]] & /@ pts;
avg = Mean[depths];
pts = Transpose[{distances, depths}];
(* plot the elevations along the line and a dashed line for sea level *)
label =  "Mean depth along the path: " <> ToString[Round[-avg, 5]] <> " m";
(**)
p1 = ListPlot[pts, Joined -> True, 
     Frame -> {True, True, False, False}, Axes -> False, 
     Filling -> Bottom, ImageSize -> Full,
     PlotLabel -> Style[label, 18], AspectRatio -> 0.4, BaseStyle -> 14,
     Prolog -> {Blue, Dashed,
     Line[{{distances[[1]], 0}, {distances[[-1]], 0}}], Black, 
     Line[{{distances[[1]], avg}, {distances[[-1]], avg}}]}];
(**)
Column[{p0, p1}] // Print

The Output

The graphics that result are shown below – the dashed blue line is sea level, the dashed black line is the mean value. A minor goal of the computation is to have students see the relationship between the mean depth and the profile.

Searching the USGS Earthquake Catalog with Mathematica

This note documents a simple in-class activity that I use in my GEOSC 109H earthquakes and society class. In class I describe the Gutenberg-Richter Relation that represents the distribution of earthquakes as a function of magnitude (a nice paper-based activity is available from SCEC). I show the students a plot from the GCMT catalog in class; in this activity, they use Mathematica to download data for a few randomly chosen years and see whether the pattern I showed in class holds up.

The Mathematica notebook begins with a few utility functions that allow students to use web services to acquire seismicity data and a few functions to list some events (which is partly a check to see that things are working). Below is a listing of the search script (an example usage is provided below).

comcatRectangularSearchPath[start_, end_, minlat_, maxlat_, minlon_, maxlon_] := Module[{},
  "http://earthquake.usgs.gov/fdsnws/event/1/query?format=csv&starttime=" <> 
   DateString[start, "ISODateTime"] <> "&endtime=" <> 
   DateString[end, "ISODateTime"] <> 
   "&minmagnitude=4.0&minlatitude=" <> 
   ToString[Round[minlat, 0.001]] <>
   "&maxlatitude=" <> 
   ToString[Round[maxlat, 0.001]] <>
   "&minlongitude=" <> 
   ToString[Round[minlon, 0.001]] <> "&maxlongitude=" <> 
   ToString[Round[maxlon, 0.001]] <> "&orderby=time-asc"
  ]

comcatRectangularSearch[start_, end_, minlat_: - 90, maxlat_: 90, 
  minlon_: - 180, maxlon_: 180] := Module[{path},
   path = comcatRectangularSearchPath[start, end, minlat, maxlat, minlon, maxlon];
  {path, Import[path, "CSV"]}
  ]

The scripts are simplistic, and to avoid errors for some of the more active years, I limit the magnitude range of the analysis to events greater than or equal to magnitude 4.0. The full notebook also includes a function to list the largest events in the search results.
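For readers working outside Mathematica, here is a minimal Python sketch of the same kind of search (my translation, not the notebook code); it performs an equivalent query for one year and lists the largest events in the results:

import csv
import urllib.request

# Equivalent rectangular search against the USGS fdsnws service (M >= 4.0)
url = ("https://earthquake.usgs.gov/fdsnws/event/1/query?format=csv"
       "&starttime=2000-01-01T00:00:00&endtime=2001-01-01T00:00:00"
       "&minmagnitude=4.0&orderby=time-asc")
with urllib.request.urlopen(url) as f:
    rows = list(csv.DictReader(f.read().decode("utf-8").splitlines()))

# List the five largest events in the search results
rows.sort(key=lambda r: float(r["mag"] or 0), reverse=True)
for r in rows[:5]:
    print(r["time"], r["mag"], r["place"])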


A Day-of-Year Calculator with Python

Many seismic data are stored using a numerical day of the year in place of a month and day. This is efficient for data storage, but it leads to the need to convert back and forth between common month-day and day-of-year values. Over the years, I have seen printed tables of values that some seismologists kept nearby to do the conversion. Now, of course, we work at computers much of the day and generally have an interactive terminal available, so I created a simple Python utility to perform the conversion.

Modern Python has extensive packages to deal with time quantities, but I use a simple, lightweight package that I wrote to handle the conversions (back in the mid-90s, when Python had many fewer packages and almost everyone was using Perl).

The date to day-of-year conversion is relatively straightforward, so there’s not much code to write – here is what I use.

doy.py

#!/usr/bin/python
import sys
import cja_pytutil   # my lightweight date package (called below)
#
if len(sys.argv) == 4:
    # month day year -> day of year
    month = int(sys.argv[1])
    day   = int(sys.argv[2])
    year  = int(sys.argv[3])
    print "   %02d/%02d/%04d is day %03d of the year." \
          % (month, day, year, cja_pytutil.doy(year, month, day))
elif len(sys.argv) == 3:
    # day-of-year year -> month/day/year
    doy  = int(sys.argv[1])
    year = int(sys.argv[2])
    output = cja_pytutil.monthdate(year, doy)
    print "   Day %03d of %4d is %02d/%02d/%04d" \
          % (doy, year, output[0], output[1], year)
else:
    print " "
    print "  Usage: doy.py month day year"
    print "      or"
    print "  Usage: doy.py doy year"
    print " "


Seismometer Responses with Mathematica

Seismometers, Poles, & Zeros

Although it is not an everyday calculation, seismologists often benefit from a reminder of the instrument response curves (amplitude responses). A seismometer response is generally described as the ratio of two polynomials in the Laplace Transform variable, s. The polynomials are described by their roots, which we call zeros (for the numerator polynomial) and poles (for the denominator polynomial). Generally, when an instrument is calibrated, the calibration information is communicated by the poles, the zeros, and an overall gain. To compare instruments with different gains, we also often choose a normalization frequency for display.
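In equation form, with zeros \(z_i\), poles \(p_j\), and overall gain \(k\), the response is

$$H(s)=k\,\frac{\prod_i\left(s-z_i\right)}{\prod_j\left(s-p_j\right)}~.$$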

In the following I present some Mathematica scripts that I use to examine the poles, zeros, and instrument response amplitude spectrum. Slight modifications can introduce the phase information as well, but to keep things short, I leave that out of the description. We’ll look at four sets of instruments: Trillium, Streckeisen, and Guralp broadband seismometers, and then some short-period instruments that are part of the IRIS Consortium instrument pool.

Nanometrics Trillium Sensor Responses

I obtained the poles and zeros for the instrument responses from the PASSCAL web site (linked in the code comments below) and normalized the responses to a frequency of 1 Hz.

Here is the code to set up three sets of poles and zeros as Mathematica lists – corresponding to the Trillium 120 PA, 240 (2nd Gen), and the Compact Trillium sensors.

(* 
  Poles and zeros for the Trillium 120PA from 
  http://www.passcal.nmt.edu/webfm_send/1970
*)
Zn120 = {0, 0, -90.0, -160.7, -3108};
Pn120 = {-0.03852 + 0.03658 I, -0.03852 - 0.03658 I, -178.,
         -135 + 160 I, -135 - 160 I, -671 + 1154 I, -671 - 1154 I};

(* 
  Poles and zeros for the Trillium 240 (2nd Gen) from 
  http://www.passcal.nmt.edu/webfm_send/1970 
*)
Zn240 = {0, 0, -90., -164.2, -3203.};
Pn240 = {-0.01813 + 0.01803 I, -0.01813 - 0.01803 I, -124.9 + 0 I,
         -197.5 + 256.1 I, -197.5 - 256.1 I, -569 + 1150 I, -569 - 1150 I};
(* 
  Poles and zeros for the Compact Trillium from 
  http://www.passcal.nmt.edu/webfm_send/1970 
*)
Znct = {0, 0, -434.1};
Pnct = {-0.036910 + 0.037120 I, -0.036910 - 0.037120 I, -371.2, -373.9 + 475.5 I,
   -373.9 - 475.5 I, -588.4 + 1508. I, -588.4 - 1508. I};

I can compute the three instrument responses using the following Mathematica functions. I could write one function and call it with each set of poles and zeros, but the repetition is not much extra work for an example. The second line in each function normalizes the response to unit amplitude at a frequency of one Hertz.

(* Instrument responses normalized at one Hz *)
Element[f, Reals];
s = I 2 \[Pi] f;
tr120resp[f_] = (Times @@ (s - Zn120)/Times @@ (s - Pn120)) / 
                (Abs[Times @@ (s - Zn120)/ Times @@ (s - Pn120)] /. f -> 1.0);

tr240resp[f_] = (Times @@ (s - Zn240)/Times @@ (s - Pn240)) / 
                (Abs[Times @@ (s - Zn240)/Times @@ (s - Pn240)] /. f -> 1.0);

ctresp[f_] = (Times @@ (s - Znct)/Times @@ (s - Pnct))/
             (Abs[Times @@ (s - Znct)/ Times @@ (s - Pnct)] /. f -> 1.0);
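The same calculation is easy to reproduce outside Mathematica. Here is a minimal numpy sketch (my addition), evaluating the normalized amplitude response for the Trillium 120PA poles and zeros listed above:

import numpy as np

# Trillium 120PA zeros and poles (from the PASSCAL values above)
zeros = np.array([0, 0, -90.0, -160.7, -3108])
poles = np.array([-0.03852 + 0.03658j, -0.03852 - 0.03658j, -178.0,
                  -135 + 160j, -135 - 160j, -671 + 1154j, -671 - 1154j])

def response(f):
    """Pole-zero response at frequency f (Hz), with s = i*2*pi*f."""
    s = 2j * np.pi * f
    return np.prod(s - zeros) / np.prod(s - poles)

# Normalize to unit amplitude at 1 Hz, then evaluate at a few frequencies
norm = abs(response(1.0))
for f in (0.01, 0.1, 1.0, 10.0):
    print("f = %6.2f Hz, |H| = %.4f" % (f, abs(response(f)) / norm))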

The variable \( s = i \omega \) is the Laplace Transform variable; the angular frequency \( \omega = 2 \pi f \) has units of radians per second. I can display the instrument response amplitude spectrum using

p1 = LogLogPlot[{Abs[tr240resp[f]], Abs[tr120resp[f]], Abs[ctresp[f]]}, {f, 0.001, 1000 },
   Filling -> Axis, ImageSize -> 512,
   Axes -> False, Frame -> True,
   BaseStyle -> {14, FontFamily -> "Helvetica"},
   FrameLabel -> {"Frequency (Hz)", "Normalized Amplitude"},
   PlotLabel -> "Trillium Sensors" ,
   LabelStyle -> {14, FontFamily -> "Helvetica"},
   PlotLegends -> LineLegend[{"TR240", "TR120", "Compact"}]];
p1 // Print

Here is the plot.

Nanometrics Trillium seismometer response curves (amplitude).


Parsing USGS QuakeML files with Python

The USGS provides earthquake lists in a number of formats, including kml for Google Earth and easy-to-read comma-separated-value (csv) files that are simple to parse with any spreadsheet, matlab, Mathematica, and many unix tools. Somewhat more information is available in the JSON formatted files, which are the preferred format for downloading the data feeds. The XML formatted data, stored in the community-standard quakeml dialect, contain the most information. XML is not an easily human-readable format, but it is compatible with the database-centric workflows often used in modern earthquake monitoring activities. Accessing the data directly from the databases makes translation errors less likely (or at least makes the data consistent).

My goal in these notes is not to explain the QuakeML (QML) format (I couldn’t if I wanted to). You can read about QML at https://quake.ethz.ch/quakeml/; it’s a complex beast of a data format that can be used to store many of the different types of information used in earthquake analyses. Information items range from seismic event hypocenters and origin times (which are collectively called origins) to individual seismic wave arrival time and amplitude measurements. QML contains many schemas and allows much of the seismic data to be encoded as they are measured. For example, an arrival time pick can be created and stored before the earthquake location associated with that arrival is estimated. The same is true for the seismic wave amplitudes. Storing the information as it is measured is more robust and more conducive to near-real-time monitoring processing flows. This helps produce quick locations and magnitudes, which in turn lead to quick earthquake alerts.

At the analysis end of the flow, the information organization can be less convenient than many old formats. For example, information that a seismologist might like to list together may be split apart in the QML file. An arrival time pick and/or measured wave amplitude may be stored as part of the event object, but a travel time residual must be associated with an event origin object. And a single event may have multiple origins – in the simplest case, one origin could be the hypocenter location estimated from body-wave arrival times, and another could be a rupture centroid estimated using a moment-tensor inversion of long-period waveforms. These files can be complicated. Another issue with USGS QML is that sometimes you get more information, other times you get less. For example, a recent earthquake file might include information on the station, network, channel, and location code for the waveform used for a measurement, but a file for an event from a few years earlier may provide just the station ID. Changing content can have you pulling your hair out when trying to extract information.

QuakeML can also simply be confusing; look at this encoding of an arrival time pick:

.
.
.
        <arrival publicID="quakeml:us.anss.org/arrival/200081v8/us_3395_bpid-2001275574">
          <pickID>quakeml:us.anss.org/pick/200081v8/us_3395_bpid-2001275574</pickID>
          <phase>Pn</phase>
          <azimuth>226.495</azimuth>
          <distance>13.8121</distance>
          <takeoffAngle>
            <value>71.6</value>
          </takeoffAngle>
          <timeResidual>1.30</timeResidual>
          <timeWeight>0.6300</timeWeight>
          <creationInfo>
            <agencyID>us</agencyID>
            <agencyURI>smi:anss.org/metadata/agencyid/us</agencyURI>
            <author>manual</author>
            <creationTime>2016-12-17T10:59:33Z</creationTime>
          </creationInfo>
        </arrival>
.
.
.

Look carefully and you will see that the azimuth, distance, and time items all hold their values directly, but the takeoffAngle item has a sublevel <value> element to store its value. I don’t know why. Variability in data storage practice adds flexibility, but it also makes codes hard to maintain.
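To make the bookkeeping concrete, here is a minimal Python sketch (my illustration, not the full parser from this post) that pulls a few fields from each arrival element; it strips XML namespaces so it works whether or not the file declares them, and the filename is a placeholder:

import xml.etree.ElementTree as ET

def strip_ns(tag):
    """Drop the '{namespace}' prefix ElementTree attaches to tag names."""
    return tag.rsplit('}', 1)[-1]

tree = ET.parse("event.xml")  # placeholder filename
for elem in tree.iter():
    if strip_ns(elem.tag) != "arrival":
        continue
    # Assumes every field shown in the snippet above is present
    fields = {strip_ns(child.tag): child for child in elem}
    phase = fields["phase"].text
    distance = float(fields["distance"].text)
    residual = float(fields["timeResidual"].text)
    # takeoffAngle keeps its number one level down, in a <value> element
    angle = float(fields["takeoffAngle"].find("*").text)
    print(phase, distance, angle, residual)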


Converting Between Moment Magnitude and Seismic Moment with Python

Introduction

The historical tradition of using magnitude to communicate the size of an earthquake is deeply entrenched, both within seismology and among the public. So it makes sense to have a way of mapping seismic moment into a magnitude. Kanamori (1977) developed just such a conversion, which we call moment magnitude and represent as \(M_W\). The relationship was defined in CGS units as:

$$M_W=\frac{2}{3} \log _{10}\left(M_0\right)-10.7~,$$

where \(M_0\) is the moment in dyne-cm. To convert a moment from dyne-cm to N-m (newton meters), divide by \(10^7\). We can invert the equation to convert \(M_W\) to seismic moment:

$$M_0=10^{1.5 \left(M_W+10.7\right)}~.$$
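For example, an \(M_W = 6.0\) earthquake corresponds to

$$M_0=10^{1.5\,(6.0+10.7)}=10^{25.05}\approx 1.1\times 10^{25}~\text{dyne-cm}\approx 1.1\times 10^{18}~\text{N-m}~.$$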

Converting back and forth between the two measures is a common task, and long ago I wrote a simple command-line-driven Fortran code to perform the conversion. In this note I describe two Python scripts that will do the job. Like the original Fortran, both use command-line arguments, and part of the reason for this note is to provide an example of using command-line arguments in Python. I use the convenient argparse package to handle all the details.

Python Code

The first version shown here takes one or more numeric values on the command line and infers the direction of the conversion from each value: anything less than or equal to 12 is treated as an \(M_W\), and anything larger is treated as an \(M_0\) in dyne-cm. Here is the code:

#!/usr/local/anaconda/bin/python
#
#====================================================================
#   MAIN PROGRAM
#====================================================================
if __name__ == '__main__':
    #
    import argparse, math
    #
    parser = argparse.ArgumentParser( \
        description='Convert between seismic moment and moment magnitude.')
    #
    parser.add_argument('floats', metavar='Value', type=float, nargs='+',\
        help='A seismic moment in dyne cm or an Mw value.')
    #
    args = parser.parse_args()
    #
    for value in args.floats:
        if value <= 12:
            print "Mw = %.2f, M0 = %9.3e dyne cm" % (value,math.pow(10,1.5*(value+10.7)))
        else:
            print "M0 = %9.3e dyne cm, Mw = %.2f" % (value,0.66667*math.log10(value)-10.7)
#
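Assuming the script is saved as mwm0.py (a placeholder name), a session looks like this:

python mwm0.py 7.0 1.2e27
Mw = 7.00, M0 = 3.548e+26 dyne cm
M0 = 1.200e+27 dyne cm, Mw = 7.35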


Quick Lists of Recent Large Earthquakes with Python

Introduction

I am often interested in a quick summary of recent global seismic activity. A number of good web sites and apps can provide such information, but I often want a simple, quick summary. I wrote the following Python script to execute from a command line (as in the OS X Terminal app) and to provide basic information on recent large earthquake activity. I place the script in ~/Dropbox/bin so I can run it from all my Macs. I have also installed it on my iPads using the Pythonista app.

The script does nothing fancy; it simply performs a few searches on the U.S. Geological Survey’s earthquake feeds and summarizes the information in three categories – great (M ≥ 8) earthquake occurrence in the last three years, major and great (M ≥ 7) earthquake activity in the last year, and strong, major, and great (M ≥ 6) earthquakes in the last 30 days.
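Here is a rough Python sketch of the idea (my reconstruction, not the original script, and querying the USGS fdsnws event service rather than whatever feed the original uses):

import json
import urllib.request
from datetime import datetime, timedelta, timezone

SERVICE = "https://earthquake.usgs.gov/fdsnws/event/1/query"

def events_since(days, minmag):
    """Return GeoJSON features for events of at least minmag in the last days."""
    start = (datetime.now(timezone.utc) - timedelta(days=days)).strftime("%Y-%m-%dT%H:%M:%S")
    url = "%s?format=geojson&starttime=%s&minmagnitude=%.1f" % (SERVICE, start, minmag)
    with urllib.request.urlopen(url) as f:
        return json.load(f)["features"]

for days, minmag, label in ((3 * 365, 8.0, "Great (M >= 8), last three years"),
                            (365, 7.0, "Major and great (M >= 7), last year"),
                            (30, 6.0, "Strong and larger (M >= 6), last 30 days")):
    events = events_since(days, minmag)
    print("%s: %d events" % (label, len(events)))
    for ev in events[:5]:
        p = ev["properties"]
        print("  M%.1f  %s" % (p["mag"], p["place"]))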
