Abstract

Under volatile market conditions, manufacturing companies that achieve high adherence to promised delivery dates possess a considerable advantage over their direct competitors. Due to dynamically changing production circumstances, however, this logistical target is hard to reach. Excellent production planning and control (PPC) processes are a key prerequisite for managing the inevitable turbulences that occur in every production system and prevent detailed production schedules from being executed as planned. An often overlooked cause of deviations between the originally planned and the actually realized production program is the inadequate quality of master and transaction data. PPC processes such as detailed scheduling, production control and production monitoring rely heavily on large volumes of data in order to update short-term production plans, derive ad-hoc control interventions and monitor resource efficiency as well as production job statuses. Typical techniques for dealing with inadequate data quality in a business context include implementing integrity constraints in databases and defining organization-wide data quality processes. Evidently, these classic approaches do not prevent manufacturing companies from having to cope with inadequate data quality in PPC processes. This paper presents a new approach for mitigating the negative effects of deficient data quality in production control: data mining (DM) algorithms are adapted to estimate probable values for typical inconsistencies in data relevant for production control. The algorithms are tested on a real-world data set of a typical mid-sized manufacturing company, and their capability of retrieving the correct values is quantified.
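The abstract does not specify which DM algorithms are adapted, but the core idea of estimating a probable value for an inconsistent or missing field from similar historical records can be illustrated with a minimal nearest-neighbour sketch. The toy order records and attribute names below are hypothetical, not taken from the paper's data set, and the majority-vote scheme is a generic stand-in for the paper's actual algorithms:

```python
from collections import Counter

# Hypothetical production-order records (illustrative toy data, not the
# paper's data set): (part_family, machine_group, lot_size_class, setup_time_class).
records = [
    ("gear", "milling", "small", "short"),
    ("gear", "milling", "small", "short"),
    ("gear", "turning", "large", "long"),
    ("shaft", "turning", "large", "long"),
    ("shaft", "turning", "small", "long"),
]

def estimate_missing(record, data, missing_idx, k=3):
    """Estimate a missing attribute by majority vote among the k historical
    records that agree on the most remaining attributes (a simple
    nearest-neighbour scheme, standing in for the paper's DM algorithms)."""
    def similarity(row):
        # Count matching attributes, ignoring the missing position itself.
        return sum(1 for i, v in enumerate(record)
                   if v is not None and i != missing_idx and row[i] == v)
    neighbours = sorted(data, key=similarity, reverse=True)[:k]
    votes = Counter(row[missing_idx] for row in neighbours)
    return votes.most_common(1)[0][0]

# A new order whose setup_time_class field is missing (None): the two most
# similar historical orders both have a "short" setup, so that value wins.
incomplete = ("gear", "milling", "small", None)
print(estimate_missing(incomplete, records, missing_idx=3))  # → short
```

In a real PPC setting the candidate records would come from the ERP/MES transaction history, and the vote could be weighted by similarity or record age; the quantification reported in the paper measures how often such estimates recover the correct value.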
