The reconfigurable manufacturing system is considered a promising next-generation manufacturing paradigm. However, limited equipment and complex product processes introduce tightly coupled scheduling subproblems, including resource allocation, batch processing, and worker cooperation. Meanwhile, dynamic events bring uncertainty, and traditional scheduling methods struggle to obtain good solutions quickly. To this end, this paper proposes a multi-agent deep reinforcement learning (DRL) based method for the dynamic reconfigurable shop scheduling problem considering batch processing and worker cooperation, with the objective of minimizing the total tardiness cost. Specifically, a dual-agent DRL-based scheduling framework is first designed. Then, a multi-agent DRL-based training algorithm is developed, in which two high-quality end-to-end action spaces are designed using rule adjustment, and an estimated-tardiness-cost-driven reward function is proposed for the order-level scheduling problem. Moreover, a multi-resource allocation heuristic is designed for the reasonable assignment of equipment and workers, and a batch processing rule is designed to determine the action of the manufacturing cell based on the workshop state. Finally, a strategy is proposed for handling new order arrivals, equipment breakdowns, and job reworks. Experimental results on 140 instances show that the proposed method outperforms scheduling rules, genetic programming, and two popular DRL-based methods, and can effectively handle various disturbance events. Furthermore, a real-world assembly and debugging workshop case is studied, showing that the proposed method is applicable to solving complex reconfigurable shop scheduling problems.
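To make the reward-shaping idea mentioned above concrete, the following minimal Python sketch illustrates one way an estimated-tardiness-cost-driven reward could be computed, where the reward for an action is the reduction in estimated total tardiness cost between consecutive decision points. The class and field names (Order, estimated_completion, unit_tardiness_cost) are illustrative assumptions and do not reproduce the paper's actual implementation.

```python
# Hypothetical sketch of an estimated-tardiness-cost-driven reward signal.
# All names here are illustrative assumptions, not the paper's implementation.
from dataclasses import dataclass
from typing import List


@dataclass
class Order:
    due_date: float              # contractual due date of the order
    unit_tardiness_cost: float   # penalty per unit time of tardiness
    estimated_completion: float  # completion time estimated from the current schedule


def estimated_tardiness_cost(orders: List[Order]) -> float:
    """Total estimated tardiness cost over all unfinished orders."""
    return sum(
        o.unit_tardiness_cost * max(0.0, o.estimated_completion - o.due_date)
        for o in orders
    )


def reward(orders_before: List[Order], orders_after: List[Order]) -> float:
    """Reward = reduction in estimated tardiness cost caused by the last action."""
    return estimated_tardiness_cost(orders_before) - estimated_tardiness_cost(orders_after)
```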