We present a measure that quantifies the degree of deviation of the magnitude-frequency distribution of earthquakes from the Gutenberg-Richter (GR) law. The magnitude-frequency distribution of earthquakes has long been considered to obey the GR law; actual distributions, however, often deviate considerably from it. Let $n_i$, $N$, $M_0$ and $b$ be the number of events with magnitude between $M_i$ and $M_i + \Delta M$, the total number of earthquakes, the lower magnitude limit, and the $b$ value of the GR formula, respectively. The measure of deviation is then defined using the Kullback-Leibler mean information as $C = \sum_i p(X_i)\,\ln\!\left[\,p(X_i)/q(X_i)\,\right]$, with $X_i = M_i - M_0$, where $p(X_i) = n_i/N$ and $q(X_i) = b \ln 10 \, \exp(-b \ln 10 \, X_i)$. The function $q(X_i)$ denotes the GR distribution, so this measure gives the degree of deviation of the observed magnitude distribution from the GR law. When the magnitude distribution fits the GR formula more closely, the $C$ value becomes smaller, and it is zero if the distribution obeys the GR law exactly. Conversely, if the distribution deviates greatly from the GR law, the $C$ index takes a large value. We have investigated the temporal variation of $C$ values before and after large earthquakes and during earthquake swarms in Japan. The results show considerable variations of $C$ values before and after large earthquakes. Prior to large earthquakes, $C$ values become small, indicating that the magnitude distributions fit the GR law better. The $C$ indices change drastically after mainshocks. During earthquake swarms the variation of $C$ values is rather large, but the $C$ values are smaller than those for ordinary earthquakes. In some swarms the $C$ index reaches a maximum at the time of greatest activity. This measure is a useful tool for observing characteristic properties of seismic activity and may be used to detect precursory phenomena of large earthquakes.
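As an illustration of how the $C$ index might be computed from a magnitude catalogue, the following minimal Python sketch bins magnitudes above $M_0$, forms $p(X_i) = n_i/N$, and compares it with the GR reference $q(X_i)$. The function name `c_index`, the default bin width, and the scaling of the continuous GR density by the bin width (so that $C \approx 0$ for a catalogue that exactly follows the GR law) are assumptions for this sketch, not part of the original formulation.

```python
import numpy as np

def c_index(magnitudes, m0, b, dm=0.1):
    """Kullback-Leibler-style deviation of a magnitude distribution from the GR law.

    magnitudes : event magnitudes (assumed complete above m0)
    m0         : lower magnitude limit M_0
    b          : GR b value (e.g. from a separate maximum-likelihood fit)
    dm         : magnitude bin width (assumed value, not from the source)
    """
    mags = np.asarray(magnitudes, dtype=float)
    x = mags[mags >= m0] - m0              # X_i = M_i - M_0

    # Observed distribution p(X_i) = n_i / N over bins of width dm
    edges = np.arange(0.0, x.max() + dm, dm)
    n_i, _ = np.histogram(x, bins=edges)
    N = n_i.sum()
    p = n_i / N

    # GR reference q(X_i) = b ln10 exp(-b ln10 X_i), evaluated at bin centres;
    # multiplying by dm turns the density into a per-bin probability (assumption)
    centres = edges[:-1] + dm / 2.0
    beta = b * np.log(10.0)
    q = beta * np.exp(-beta * centres) * dm

    # C = sum_i p ln(p / q); empty bins (p = 0) contribute nothing
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))
```

Applying such a function to a moving window of consecutive events would give one way to trace the temporal variation of $C$ described above, though the windowing scheme used in the study is not specified in this abstract.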