Earthquake magnitudes


How Are Earthquake Magnitudes Measured?
The magnitude of most earthquakes is measured on the Richter scale, developed by Charles F. Richter in 1935. The Richter magnitude is calculated from the amplitude of the largest seismic wave recorded for the earthquake, regardless of which type of wave was the strongest.

Richter magnitudes are based on a logarithmic scale (base 10). This means that for each whole number you go up on the Richter scale, the amplitude of the ground motion recorded by a seismograph goes up ten times. Using this scale, a magnitude 5 earthquake produces ten times the level of ground shaking of a magnitude 4 earthquake (and releases about 32 times as much energy). To give you an idea how these numbers add up, think of it in terms of the energy released by explosives: a magnitude 1 earthquake releases about as much energy as blowing up 6 ounces of TNT, while a magnitude 8 earthquake releases as much energy as detonating 6 million tons of TNT. Pretty impressive, huh? Fortunately, most of the earthquakes that occur each year are magnitude 2.5 or less, too small to be felt by most people.
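To see how these numbers work out, here is a minimal Python sketch (the language is an assumption; the page itself has no code). It uses the factor-of-10 amplitude rule and the roughly factor-of-32 (10^1.5) energy rule described above, and then scales the page's 6-ounce TNT figure for magnitude 1 up to magnitude 8.

 import math
 
 def amplitude_ratio(m_low, m_high):
     """Recorded ground-motion amplitude grows by a factor of 10 per whole magnitude unit."""
     return 10 ** (m_high - m_low)
 
 def energy_ratio(m_low, m_high):
     """Released energy grows by roughly 10**1.5 (about 32) per whole magnitude unit."""
     return 10 ** (1.5 * (m_high - m_low))
 
 OUNCES_PER_TON = 2000 * 16  # short ton of TNT, in ounces
 
 if __name__ == "__main__":
     # A magnitude 5 quake compared with a magnitude 4 quake:
     print(amplitude_ratio(4, 5))        # 10.0 -> ten times the ground shaking
     print(round(energy_ratio(4, 5)))    # 32   -> about 32 times the energy
 
     # Scaling the article's TNT comparison from magnitude 1 (6 ounces of TNT)
     # up to magnitude 8:
     tnt_tons = 6 * energy_ratio(1, 8) / OUNCES_PER_TON
     print(f"{tnt_tons:,.0f} tons of TNT")  # on the order of 6 million tons

Running the sketch reproduces the figures in the paragraph: the 6 ounces at magnitude 1, multiplied by the energy ratio across seven magnitude units, comes out to roughly 6 million tons of TNT at magnitude 8.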