SUMMARY Microseismic monitoring is an important technique for obtaining detailed knowledge of in-situ fracture size and orientation during stimulation, which helps maximize fluid flow through the rock volume and optimize production. Furthermore, because the frequency of earthquake magnitudes empirically follows a power law (i.e. the Gutenberg–Richter relation), accurate microseismic event magnitude distributions are potentially crucial for seismic risk management. In this study, we analyse microseismicity observed during four hydraulic fracture treatments of the legacy 1997 Cotton Valley experiment at the Carthage gas field of East Texas, where fractures were activated at the base of the sand–shale Upper Cotton Valley formation. We perform waveform cross-correlation to detect clusters of similar events, measure relative amplitudes from aligned waveform pairs using principal component analysis, and then compute precise relative magnitudes. The new magnitudes significantly reduce the deviations between magnitude differences and relative amplitudes of event pairs, which in turn reduces the magnitude differences between clusters located at different depths. This reduction suggests that some attenuation-related biases can be effectively mitigated with relative magnitude measurements. We apply the maximum likelihood method to characterize the magnitude–frequency distributions and quantify the seismogenic index of each cluster. Statistical analyses with the new magnitudes indicate that fractures more favourably oriented for shear failure have lower b-values and higher seismogenic indices than fractures subparallel to the maximum horizontal principal stress orientation, implying a higher potential for relatively large earthquakes on the favourably oriented fractures.