Abstract

The primary objective of this paper is to revisit a number of empirical modelling activities that are often characterized as data mining, in an attempt to distinguish between the problematic and the non-problematic cases. The key to this distinction is provided by the notion of error-statistical severity. It is argued that unwarranted data mining activities often arise because of inherent weaknesses in the Traditional Textbook (TT) methodology. Using the Probabilistic Reduction (PR) approach to empirical modelling, it is argued that the unwarranted cases of data mining can often be avoided by dealing directly with the weaknesses of the TT approach. Moreover, certain empirical modelling activities, such as diagnostic testing and data snooping, constitute legitimate procedures in the context of the PR approach.

