We formally prove the equivalence between Assembly Theory (AT) and Shannon Entropy through a method grounded in the principles of statistical compression and belonging to the LZ family of popular compression algorithms. Such lossless compression algorithms, which underlie file formats such as ZIP and PNG, have been shown to empirically reproduce the results that AT considers its cornerstone. The same results had also been reported before AT, through the successful application of other complexity measures in the areas AT covers, such as separating organic from non-organic molecules and studying selection and evolution. We demonstrate that the assembly index is equivalent to the size of a minimal context-free grammar. The statistical compressibility of such a method is bounded by Shannon Entropy, as is that of equivalent traditional LZ compression schemes such as LZ77 and LZW. We also demonstrate that AT, and the algorithms supporting its pathway complexity, assembly index, and assembly number, define compression schemes and methods that are subsumed into algorithmic information theory. We conclude that neither the assembly index nor the assembly number leads to an explanation or quantification of biases in generative (physical or biological) processes, including those brought about by (abiotic or biotic) selection and evolution, that could not have been arrived at using Shannon Entropy, or that has not already been reported using classical information theory or algorithmic complexity.
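As a minimal illustrative sketch (not the authors' construction), the following Python snippet shows the kind of dictionary-based reuse counting the abstract alludes to: an LZ78-style parse counts the distinct phrases needed to rebuild a string from previously seen substrings, a quantity whose growth is bounded by the string's Shannon Entropy, analogously to how an assembly-index-like measure counts reused building blocks. All names and example strings are illustrative assumptions, not part of the paper.

```python
from math import log2
from collections import Counter


def lz78_phrase_count(s: str) -> int:
    """Number of phrases in the LZ78 parse of s.

    The parse extends the current phrase while it has been seen before,
    then records it as a new dictionary entry: a simple measure of how
    much the string can be rebuilt from reused parts.
    """
    phrases = set()
    current = ""
    count = 0
    for ch in s:
        current += ch
        if current not in phrases:
            phrases.add(current)
            count += 1
            current = ""
    if current:  # trailing partial phrase
        count += 1
    return count


def empirical_entropy(s: str) -> float:
    """Per-symbol empirical Shannon entropy of s, in bits."""
    freqs = Counter(s)
    n = len(s)
    return -sum((c / n) * log2(c / n) for c in freqs.values())


if __name__ == "__main__":
    repetitive = "ABAB" * 16          # highly reusable: few phrases, low entropy
    varied = "ABRACADABRAXYZQWJK"     # less reusable: more phrases per symbol
    for s in (repetitive, varied):
        print(len(s), lz78_phrase_count(s), round(empirical_entropy(s), 3))
```

Under these assumptions, the repetitive string yields far fewer LZ78 phrases per symbol than the varied one, mirroring the entropy ordering of the two strings.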