Real-time human-centered assistance in industrial processes depends on the individual history of a worker's activities in the work system and therefore requires adequate methods for tracking the worker's actions. Most research on human activity recognition focuses on recognizing actions from video data using computer vision methods. Digital equipment, standardized machine data interfaces, and smart wearable devices extend the possibilities for describing the current state of the work system. Petri nets have already been applied to human activity recognition; however, existing approaches do not address detecting actions in real time. This paper proposes a Petri net architecture that enables hierarchical, description-based human activity recognition in industrial work processes. We present a Partitioned Colored Petri Net, an extension of the colored Petri net formalism that infers activities from state transitions of the work system in real time. In a case study, we demonstrate the Petri net's application in an error-based learning system that visualizes error consequences in augmented reality using experimentable digital twins.