You are off by about a factor of 1,000.
Each whole-number increase in magnitude corresponds to a 10^1.5 increase in energy released. The difference between 1994 Northridge and this one is 2.1 magnitude units, so roughly a 10^(1.5 × 2.1) ≈ 10^3 difference in energy.
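The arithmetic behind that estimate can be checked in a couple of lines (a quick sketch of the 10^1.5 rule, nothing earthquake-specific):

```python
# Energy scales as 10^(1.5 * magnitude), so the ratio between two
# quakes depends only on the difference in their magnitudes.
def energy_ratio(delta_magnitude):
    """Energy ratio between two quakes differing by delta_magnitude."""
    return 10 ** (1.5 * delta_magnitude)

print(energy_ratio(1.0))   # one full magnitude step: ~31.6x the energy
print(energy_ratio(2.1))   # a 2.1-magnitude gap: ~1,400x, i.e. roughly 10^3
```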
I thought that it was a log10 scale, so each increment of 1 on the scale is a 10-fold power increase, not a 10^1.5-fold.
That's the difference between what the scale measures (shaking amplitude) and the energy released.
The Richter Scale is a logarithmic scale based on shaking measurements (think of the old pen-on-drum seismograph!): each whole-number step is a 10-fold (10^1) increase in measured amplitude.
The Moment Magnitude Scale (the modern replacement for the Richter Scale) is based on the energy released. Geological organisations reporting on an earthquake will usually show this as "M <number>" or "Mw <number>".
Richter works well for small-to-medium earthquakes, but it's not accurate for really large or distant ones.
The energy released in an earthquake grows exponentially with magnitude, not linearly.
EDIT: The Moment Magnitude Scale is where the "10^1.5" figure comes from. https://en.wikipedia.org/wiki/Moment_magnitude_scale
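For reference, the standard Hanks–Kanamori moment magnitude definition (with seismic moment M0 in newton-metres) makes the 10^1.5 factor explicit; a minimal sketch, assuming M0 is already known:

```python
import math

def moment_magnitude(m0_newton_metres):
    """Hanks-Kanamori moment magnitude: Mw = (2/3) * (log10(M0) - 9.1)."""
    return (2.0 / 3.0) * (math.log10(m0_newton_metres) - 9.1)

m0 = 1e18  # an arbitrary seismic moment in N*m, for illustration only
# Multiplying the moment by 10^1.5 adds one magnitude unit (~1.0):
print(moment_magnitude(m0 * 10 ** 1.5) - moment_magnitude(m0))
```

The 2/3 in the definition is exactly what turns a 10^1.5 ratio in moment (and hence in energy) into a single step on the magnitude scale.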
I agree that it is not intuitive.
AFAIK, it was done that way to maintain rough congruence with historical seismic magnitude scales, which were more qualitative in nature. Modern scales are far more scientific and quantitative, but you can roughly retcon the historical numbers onto them if you set the exponential energy step to 10^1.5.
It's a log10 scale measuring amplitude.