Quantum atomic sensors have shown great promise for vacuum metrology. Specifically, the density of gas particles in a vacuum can be determined by measuring the collision rate between the particles and an ensemble of sensor atoms. This requires preparing the sensor atoms in a particular quantum state, observing the rate of change of that state, and using the total collision rate coefficient for state-changing collisions to convert that rate into a corresponding density. The total collision rate coefficient can be determined by various methods, including quantum scattering calculations using a computed interaction potential for the collision pair, measurements of the post-collision sensor-atom momentum recoil distribution, and empirical measurements of the collision rate at a known density. Observed discrepancies among the results of these methods call into question their accuracy. To investigate this, we study the ratio of collision rate measurements of co-located sensor atoms, 87Rb and 6Li, exposed to natural-abundance samples of H2, He, N2, Ne, Ar, Kr, and Xe gases. This method does not require knowledge of the test gas density and is therefore free of the systematic errors inherent in efforts to introduce the test gas at a known density. Our results differ systematically, at the level of 3% to 4%, from recent theoretical predictions and experimental measurements. This work demonstrates a model-free method for transferring the primacy of one atomic standard to another sensor atom and highlights the utility of sensor-atom cross-calibration experiments for checking the validity of direct measurements and theoretical predictions.
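The measurement principle summarized above can be sketched in two relations (an illustrative sketch; the symbols $n$, $\Gamma$, and $K$ are chosen here for clarity and are not necessarily the paper's notation):

```latex
% Test-gas density n inferred from the measured state-changing
% collision rate \Gamma and the total collision rate coefficient K:
n = \frac{\Gamma}{K}

% For two co-located sensor species (e.g., 87Rb and 6Li) exposed to
% the same gas, the unknown density n cancels in the ratio of rates:
\frac{\Gamma_{\mathrm{Rb}}}{\Gamma_{\mathrm{Li}}}
  = \frac{n\,K_{\mathrm{Rb}}}{n\,K_{\mathrm{Li}}}
  = \frac{K_{\mathrm{Rb}}}{K_{\mathrm{Li}}}
```

The second relation is why the ratio method is free of systematic errors in setting the test gas density: any common density (or density drift) divides out, leaving a direct comparison of the two species' rate coefficients.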