Concurrency control is a crucial aspect of multi-threaded application development, directly impacting performance and resource utilization. This research project investigates the impact of various concurrency control mechanisms on the performance of multi-threaded applications, focusing specifically on mutexes, semaphores, and lock-free data structures implemented using queues. Conducted in a Python environment within Google Colab, the study assesses performance metrics such as execution time, CPU usage, and memory usage. Each mechanism's performance was evaluated through repeated trials, and the results were aggregated and averaged to ensure reliability. By systematically evaluating these concurrency control mechanisms, the project seeks to understand how each approach affects application efficiency in a multi-threaded context. The findings reveal that the queue mechanism provided the fastest execution time, the semaphore mechanism exhibited the lowest CPU usage, and the mutex mechanism offered a balance of speed and CPU efficiency. Memory usage was similar for the mutex and semaphore mechanisms, while the queue mechanism differed significantly from the other two. These results indicate that the choice of concurrency control mechanism should be guided by the specific requirements of the application, considering factors such as workload characteristics and system architecture.
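As a rough illustration of the kind of comparison described above, the following minimal sketch times a mutex (threading.Lock), a binary semaphore (threading.Semaphore), and a thread-safe queue (queue.Queue) on a simple shared-counter workload over repeated trials. This is not the study's actual benchmark: the thread count, operations per thread, trial count, and workload are assumptions, and CPU and memory measurements (e.g., via psutil) are omitted for brevity.

```python
# Illustrative sketch only (assumed workload and parameters, not the study's code).
import threading
import queue
import time
import statistics

N_THREADS = 4          # assumed thread count
OPS_PER_THREAD = 50_000  # assumed workload size
TRIALS = 5             # assumed number of repeated trials


def _run_threads(target):
    """Start N_THREADS threads running `target` and wait for all to finish."""
    threads = [threading.Thread(target=target) for _ in range(N_THREADS)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()


def run_with_mutex():
    counter = 0
    lock = threading.Lock()

    def worker():
        nonlocal counter
        for _ in range(OPS_PER_THREAD):
            with lock:           # mutual exclusion around the shared counter
                counter += 1

    _run_threads(worker)
    return counter


def run_with_semaphore():
    counter = 0
    sem = threading.Semaphore(1)  # binary semaphore used like a mutex

    def worker():
        nonlocal counter
        for _ in range(OPS_PER_THREAD):
            with sem:
                counter += 1

    _run_threads(worker)
    return counter


def run_with_queue():
    q = queue.Queue()             # internally synchronized FIFO

    def worker():
        for _ in range(OPS_PER_THREAD):
            q.put(1)

    _run_threads(worker)
    return q.qsize()


def benchmark(name, fn):
    """Time `fn` over TRIALS runs and report the mean execution time."""
    times = []
    for _ in range(TRIALS):
        start = time.perf_counter()
        fn()
        times.append(time.perf_counter() - start)
    print(f"{name:>9s}: mean {statistics.mean(times):.3f}s over {TRIALS} trials")


if __name__ == "__main__":
    for name, fn in [("mutex", run_with_mutex),
                     ("semaphore", run_with_semaphore),
                     ("queue", run_with_queue)]:
        benchmark(name, fn)
```

In a setup like this, each mechanism's timings would be averaged across trials, mirroring the aggregation described in the abstract; absolute numbers will depend heavily on the Python runtime and host system.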