Metrics, and increasingly altmetrics, are a pervasive aspect of academic life. A proliferation of available digital tools has placed greater emphasis on the quantification of the ‘performance’ of individual journals. Although metrics and altmetrics are justified in terms of increased accountability and transparency, there are significant inequities in the ways they are deployed. Key among these is the unsuitability of many popular metrics for assessing publications in the humanities and social sciences, as the data, algorithms and systems which support them cater to the authorship and citation practices of the science, technology, engineering and mathematics (STEM) disciplines. These issues are amplified for journals in the sociology of sport, which publish research by humanities and social science scholars whose work is quantified according to the standards of the health science departments in which they frequently work. In this discussion, we critically examine how common forms of metrics and altmetrics, including those produced by Web of Science, Scopus, Google Scholar, and Altmetric.com, are applied to sociology of sport journals. We analyse and critique how different metric algorithms produce variable measures of performance for each of the journals in the field, and reveal how other information available in these databases can augment our understanding of the sociology of sport publishing ecology. Far from advocating for the value of metrics and altmetrics, our analysis is intended to arm scholars and journals with the information required to critically navigate the entanglement of metrics and altmetrics with neoliberalism, audit culture and digital technologies in universities.