Abstract
A distance function between two random variables or vectors was proposed in 2003 in a Ph.D. dissertation. Initially called a probability metric, it is now known as the "Łukaszyk-Karmowski metric" or LK-metric and has been successfully applied in various fields of science and technology. It does not satisfy the identity of indiscernibles (Leibniz's law) axiom of a metric, an ontological axiom also invalidated by the ugly duckling theorem. This note addresses two false claims made in a preprint: that the LK-metric is the same as the mean absolute difference, and that it is ill-defined. The fallacy of the first claim is straightforward: the mean absolute difference is defined solely for independent and identically distributed random variables, contrary to the LK-metric. Thus, if one considers E|X-X|, then the random variable X must be independent of itself, which implies a degenerate probability distribution and E|X-X|=0. If X has a degenerate probability distribution, then Y, identically distributed with X, also has a degenerate probability distribution and E|X-X|=0=E|X-Y|, invalidating the second claim.
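The distinction the abstract draws can be illustrated numerically. The following is a minimal Monte Carlo sketch, not taken from the note itself, assuming the LK-metric between X and Y is approximated by E|X-Y| with the two variables sampled independently from their respective distributions; all parameter choices (standard normal marginals, the constant 3.0 for the degenerate case, sample size) are illustrative assumptions.

```python
# Minimal sketch (assumed setup, not from the note): LK-metric approximated by
# E|X - Y| with X and Y drawn independently from their marginal distributions.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Two independent, identically distributed (non-degenerate) standard normals.
x = rng.normal(loc=0.0, scale=1.0, size=n)
y = rng.normal(loc=0.0, scale=1.0, size=n)

# LK-metric estimate for independent X, Y: positive even though X and Y share
# the same distribution, so identity of indiscernibles fails
# (analytically E|X - Y| = 2/sqrt(pi) ~ 1.128 here).
print("LK-metric, i.i.d. non-degenerate X, Y:", np.mean(np.abs(x - y)))

# "E|X - X|" with X literally paired with itself is a dependent pairing and is
# trivially zero; it is not the LK-metric of X and an independent copy of X.
print("E|X - X|, X paired with itself:       ", np.mean(np.abs(x - x)))

# If X is degenerate (a constant), any Y identically distributed with X is the
# same constant, and E|X - Y| = 0 = E|X - X|, as the note argues.
x_const = np.full(n, 3.0)
y_const = np.full(n, 3.0)
print("Degenerate case, E|X - Y|:            ", np.mean(np.abs(x_const - y_const)))
```

The first printed value is strictly positive while the last two are zero, which matches the abstract's point: only under a degenerate distribution do E|X-X| and E|X-Y| coincide at zero.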