Tool condition monitoring has emerged as an essential approach in the machining industry to reduce costs and enhance productivity. In the era of Industry 4.0, tool wear can be monitored using various Internet of Things sensors, such as those recording acoustic emission, vibration, temperature, and pressure. The acquired signals can be processed by extracting numeric features or by converting them into images. In this study, experimental milling campaigns with different input machine parameters were conducted, leading to the creation of a dataset based on tool wear measurements and acoustic emission recordings. We then investigated a deep convolutional neural network approach to classify image representations of the recorded acoustic signals, aiming to predict the tool wear state. Since the dataset was imbalanced, a class-weights approach was used to improve model performance. An accuracy exceeding 90% was reached, demonstrating the model's strong potential for predicting the tool wear state.
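The class-weights strategy mentioned above can be illustrated with a minimal sketch. The labels and class proportions here are hypothetical (the actual dataset is not reproduced); the sketch assumes the common "balanced" heuristic, in which each class weight is inversely proportional to its frequency, and the resulting mapping would then be supplied to the training loop (e.g. via the `class_weight` argument of Keras's `Model.fit`).

```python
import numpy as np
from sklearn.utils.class_weight import compute_class_weight

# Hypothetical imbalanced tool-wear labels: 0 = acceptable wear (majority),
# 1 = excessive wear (minority). Real label counts come from the experiments.
y = np.array([0] * 90 + [1] * 10)

classes = np.unique(y)
# "balanced" assigns n_samples / (n_classes * count_c) to class c,
# so the under-represented class gets a proportionally larger weight
# in the loss function.
weights = compute_class_weight(class_weight="balanced", classes=classes, y=y)
class_weight = {int(c): float(w) for c, w in zip(classes, weights)}

# With 90/10 labels this yields roughly {0: 0.556, 1: 5.0}.
# In Keras the mapping would be passed to training as, e.g.:
#   model.fit(X_train, y_train, class_weight=class_weight, ...)
```

This simply re-scales each sample's contribution to the loss; it does not change the data, which makes it a lightweight alternative to resampling for moderately imbalanced datasets.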