Abstract

Industrial facilities collect large volumes of data, store them according to prescribed protocols, and then interpret them for process decision-making. Several sources and types of error contaminate these data for various reasons, but especially because they come from unreliable or unpredictable instruments. Data (or signal) processing corrects measurement errors to improve fidelity. Here, we highlight decision-making applications and signal processing methods. To fully appreciate the state of the art, we interviewed plant data experts and software developers in the pulp and paper industry to examine how they apply signal processing methods in the context of decision-making, including the value of process data, how these data are used, and the major barriers that prevent plants from using data. Process experts clean data thoroughly, but with basic approaches compared with the advanced techniques available in the recent literature. The interviews demonstrate that decisions in industry are based primarily on steady-state process operating data. The challenges and barriers that prevent the use of process data to their full potential relate to resource limitations (people, time, and money), an entrenched culture, and limited access to recent technology. In practice, experts consider, implicitly or explicitly, data that represent the process operating under steady-state conditions. A plant model that represents steady-state operations is easier to interpret, can be presented in a form that plant operators can use, and thereby better enables decision-making.
