Abstract

Benford’s Law is an interesting and unexpected empirical phenomenon: if we take a large list of numbers from real data, the first digits of these numbers follow a certain non-uniform distribution. This law is actively used in economics and finance to check that the data in financial reports are real and have not been improperly modified by the reporting company. The first challenge is that cheaters know about this law and make sure that their modified data satisfy it. The second challenge related to this law is that recently, another application of this law has been discovered, namely, an application to deep learning, one of the most effective and most promising machine learning techniques. It turns out that the neurons’ weights obey this law only at the difficult-to-detect stage when the fit is optimal, i.e., when further attempts to fit will lead to undesirable over-fitting. In this paper, we provide a possible solution to both challenges: we show how to use this law to make financial cheating practically impossible, and we provide a qualitative explanation for the effectiveness of Benford’s Law in machine learning.

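The specific non-uniform distribution Benford’s Law predicts is P(d) = log10(1 + 1/d) for the leading digit d in {1, ..., 9}. The following sketch is purely illustrative and not taken from the paper: it compares the observed first-digit frequencies of a data set with this prediction using a chi-square goodness-of-fit test; the data source and the significance threshold alpha are assumptions made for the example.

```python
# Illustrative sketch (not from the paper): test whether a data set's
# first-digit distribution is consistent with Benford's Law,
# P(d) = log10(1 + 1/d) for d = 1..9.
import math
from collections import Counter

from scipy.stats import chisquare  # chi-square goodness-of-fit test


def first_digit(x: float) -> int:
    """Return the leading (most significant) decimal digit of a nonzero number."""
    s = f"{abs(x):e}"  # scientific notation, e.g. '3.140000e+02'
    return int(s[0])


def benford_check(data, alpha=0.05):
    """Chi-square test of observed first digits against Benford's Law.

    `alpha` is an assumed significance threshold, not a value from the paper.
    Returns (p_value, consistent_with_benford).
    """
    digits = [first_digit(x) for x in data if x != 0]
    counts = Counter(digits)
    n = len(digits)

    observed = [counts.get(d, 0) for d in range(1, 10)]
    # Expected counts sum exactly to n, since the Benford probabilities sum to 1.
    expected = [n * math.log10(1 + 1 / d) for d in range(1, 10)]

    _, p_value = chisquare(observed, f_exp=expected)
    return p_value, p_value >= alpha


if __name__ == "__main__":
    # Powers of 2 are a classic example of data that follows Benford's Law.
    sample = [2.0 ** k for k in range(1, 200)]
    p, ok = benford_check(sample)
    print(f"p-value = {p:.3f}, consistent with Benford's Law: {ok}")
```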