The Gutenberg-Richter law, which relates earthquake frequency to magnitude via a power law, remains a cornerstone of seismology. It captures the scaling of brittle failure from microfracturing to plate-boundary rupture. However, earthquake catalogs are inherently discrete, and this discreteness is often neglected in modeling and hazard estimation.
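The frequency-magnitude scaling can be sketched in its standard cumulative form, log10 N(>=M) = a - b*M. The a and b values below are hypothetical, chosen only for illustration (b near 1 is typical of global catalogs):

```python
def gr_rate(m, a=8.0, b=1.0):
    """Annual number of events with magnitude >= m under the
    Gutenberg-Richter relation log10 N = a - b*m.
    The default a, b are illustrative, not fitted values."""
    return 10.0 ** (a - b * m)

# Each unit increase in magnitude reduces the expected count tenfold:
ratio = gr_rate(5.0) / gr_rate(6.0)  # -> 10.0
```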
Here, we construct idealized earthquake populations as finite sets of discrete events drawn from a bounded power-law distribution, defined by upper and lower magnitude limits and a fixed total number of events. This formulation introduces a well-defined event density and direct control over seismic moment release, enabling a more physically grounded and reproducible basis for synthetic catalog generation and hazard analysis.
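One way such a bounded, finite population can be generated is inverse-CDF sampling from a doubly truncated Gutenberg-Richter (exponential-in-magnitude) distribution. This is a minimal sketch, not the paper's implementation; the magnitude limits, b-value, and event count are illustrative assumptions:

```python
import math
import random

def sample_bounded_gr(n_events, m_min=4.0, m_max=9.0, b=1.0, seed=0):
    """Draw a finite catalog of n_events magnitudes from a doubly
    truncated Gutenberg-Richter distribution via inverse-CDF sampling.
    All parameter defaults are hypothetical, for illustration only."""
    rng = random.Random(seed)
    beta = b * math.log(10.0)  # convert base-10 b-value to natural-log rate
    span = 1.0 - math.exp(-beta * (m_max - m_min))
    mags = []
    for _ in range(n_events):
        u = rng.random()
        # invert the truncated exponential CDF
        mags.append(m_min - math.log(1.0 - u * span) / beta)
    return mags

catalog = sample_bounded_gr(10_000)
# Every magnitude lies within [m_min, m_max], so both the event density
# and the total seismic moment release are finite by construction.
```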
We extend this with a three-segment, piecewise power-law model constrained by a Bayesian prior on the largest observed earthquake. The steep upper segment, covering only Mw > 9 events, is interpreted as representing the largest possible ruptures—likely limited to subduction megathrusts. The meaning of the intermediate and lower segments remains open, potentially reflecting geometric constraints, energy saturation, or deeper tectonic structure.
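A three-segment magnitude-frequency curve of this kind can be sketched as a piecewise power law with one b-value per segment, continuous at the break magnitudes. The break points and b-values below are hypothetical placeholders, not the paper's fitted or Bayesian-constrained values:

```python
def piecewise_gr_rate(m, breaks=(7.0, 9.0), a=8.0, bs=(1.0, 1.5, 2.5)):
    """Cumulative annual rate N(>= m) under a hypothetical three-segment
    Gutenberg-Richter model: one b-value per segment, with intercepts
    chosen so the rate curve is continuous at each break magnitude.
    The steepest (third) segment covers the largest ruptures.
    All numeric values are illustrative, not fitted to any catalog."""
    m1, m2 = breaks
    b1, b2, b3 = bs
    if m <= m1:
        return 10.0 ** (a - b1 * m)
    a2 = a + (b2 - b1) * m1   # match log10-rate at the first break
    if m <= m2:
        return 10.0 ** (a2 - b2 * m)
    a3 = a2 + (b3 - b2) * m2  # match log10-rate at the second break
    return 10.0 ** (a3 - b3 * m)
```

The steeper slope above the upper break suppresses the predicted rate of the very largest events relative to a single-segment extrapolation, which is the sense in which the segmented form tightens recurrence estimates.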
Even in the absence of full physical interpretation, the segmented model offers a tighter constraint on recurrence rates, particularly for large, rare events. This has direct implications for seismic hazard modeling and estimates of maximum credible earthquake size, making the framework both conceptually and practically valuable.
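For hazard purposes, a constrained annual rate translates directly into an exceedance probability over an exposure window under the standard Poisson assumption, P = 1 - exp(-rate * T). The rate and window below are illustrative:

```python
import math

def exceedance_probability(annual_rate, years):
    """Probability of at least one exceedance in a time window,
    assuming a Poisson process with the given annual rate
    (the standard time-independent hazard formula)."""
    return 1.0 - math.exp(-annual_rate * years)

# e.g. a hypothetical 1-in-500-year event over a 50-year window:
p = exceedance_probability(1 / 500, 50)  # about 0.095
```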