An earthquake that measures a 6.0 on the Richter scale would be how much stronger than one that measures as a 3.0?
And the answer: 1,000 times as strong.
Because the Richter scale is logarithmic, each whole-number increase on the scale represents a tenfold increase in the measured strength of the quake. So an earthquake that measures 6.0 on the Richter scale would be 10 times as strong as one that measures 5.0, and 100,000 times as strong as one that measures 1.0.
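The arithmetic behind that answer is just powers of ten. Here's a tiny Python sketch (the function name is ours, purely for illustration) that computes the strength ratio between any two magnitudes:

```python
def relative_strength(m1, m2):
    """Ratio of measured strength between two Richter magnitudes.

    Each whole-number step on the scale is a tenfold increase,
    so the ratio is 10 raised to the magnitude difference.
    """
    return 10 ** (m1 - m2)

print(relative_strength(6.0, 3.0))  # three steps -> 1,000 times as strong
print(relative_strength(6.0, 5.0))  # one step -> 10 times as strong
print(relative_strength(6.0, 1.0))  # five steps -> 100,000 times as strong
```

A magnitude difference of 3.0 gives 10³ = 1,000, which is exactly the answer above.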
The Richter scale, developed by (you guessed it) Charles Richter, has been the primary scale for measuring earthquakes since its creation in the 1930s. Unlike earlier earthquake scales, the Richter scale uses "magnitude" to express the strength of a given earthquake. Here, magnitude reflects how much the Earth's crust shifts during an earthquake.
The magnitude of an earthquake is determined by Richter's logarithmic formula. A seismograph – the instrument that detects the earthquake – records the amplitude of the seismic waves, which translates into the energy released by the quake. Today, the Richter scale is often used alongside other measurement scales, such as the Mercalli scale, to gather more information about the severity and experience of any given earthquake.
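To see how a logarithmic formula turns wave amplitude into magnitude, here's a simplified sketch in Python. It assumes a textbook form of the calculation, magnitude = log₁₀ of the recorded amplitude relative to a tiny reference amplitude; real seismology also corrects for the distance between the seismograph and the epicenter, which this toy version ignores:

```python
import math

def richter_magnitude(amplitude_mm, reference_mm=0.001):
    """Simplified Richter (local) magnitude.

    Takes the base-10 logarithm of the recorded wave amplitude
    relative to a reference amplitude (a barely-detectable quake).
    Omits the distance correction used in practice.
    """
    return math.log10(amplitude_mm / reference_mm)

# An amplitude 1,000 times larger reads three magnitudes higher:
print(richter_magnitude(1.0))     # about magnitude 3
print(richter_magnitude(1000.0))  # about magnitude 6
```

Because the formula is logarithmic, multiplying the amplitude by 1,000 only adds 3 to the magnitude, which is why a 6.0 quake is so much stronger than a 3.0 despite the small-looking difference in the numbers.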
In any given year, millions of earthquakes occur that are too weak to be noticed or recorded. The National Earthquake Information Center estimates that around 20,000 earthquakes per year are strong enough to be recorded, and almost 80% of those quakes occur along the rim of the Pacific Ocean known as the "Ring of Fire" (check out this past AskQOTD article to learn more about this fiery ring).
Did you know?
An earthquake under the ocean can cause a tsunami in which waves of water travel outward in all directions. The waves can even reach speeds up to 600 miles per hour (the speed of a jet!).