The presence of weeds is a common and persistent problem in crop cultivation, affecting both yield and overall agricultural productivity. Conventional solutions include chemical pesticides, mulching, and mechanical weeding performed by agricultural implements or human operators. Although effective, these techniques have several drawbacks, including soil and water pollution, unfavorable cost-effectiveness, and physical stress for operators. In recent years, novel robotic solutions have been proposed to overcome these limitations and to move towards more sustainable approaches to weeding. This work presents a mixed-autonomy robotic weeding system based on a fully integrated three-axis platform and a vision system mounted on a mobile rover. The rover’s motion is remotely controlled by a human operator, while weed identification and removal are performed autonomously by the robotic system. Once in position, an RGB-D camera captures the portion of the field to be treated. The acquired spatial, color, and depth information is used by a pre-trained Deep Neural Network to classify soil, the main crop, and the weeds to be removed. Each target is then analyzed by a second RGB-D camera, mounted on the gripper, to confirm the classification before removal. In the proposed approach, weeds are defined as all plants not classified as the main crop, which is known a priori. The performance of the integrated robotic system has been tested in the laboratory as well as in open-field and greenhouse conditions. The system was also tested under different lighting and shadowing conditions to evaluate the performance of the Deep Neural Network. Results show that plant identification accuracy (for both crop and weeds) exceeds 95%, rising to 98% when additional information, such as the intra-row spacing, is provided. Correct identification of weeds alone remains above 97%, ensuring effective removal of weeds (up to 85%) with negligible crop damage (below 5%).
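
The weed-by-exclusion rule with two-camera confirmation described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the class names, confidence threshold, and `Detection`/`is_weed` identifiers are all assumptions introduced for the example.

```python
# Sketch of the decision logic: a target is a weed if it is a plant
# and is NOT the main crop (known a priori), confirmed by both cameras.
# All names and thresholds below are illustrative assumptions.

from dataclasses import dataclass

CROP_CLASS = "maize"  # the main crop, known a priori (assumed example)


@dataclass
class Detection:
    label: str         # class predicted by the DNN: "soil", the crop, or another plant
    confidence: float  # score of the prediction, in [0, 1]


def is_weed(primary: Detection, confirm: Detection, min_conf: float = 0.5) -> bool:
    """Treat a target as a weed only if both the field-level camera and the
    gripper-mounted camera see a sufficiently confident non-crop plant."""
    def plant_not_crop(d: Detection) -> bool:
        return (d.label != "soil"
                and d.label != CROP_CLASS
                and d.confidence >= min_conf)
    return plant_not_crop(primary) and plant_not_crop(confirm)


# A confidently detected non-crop plant, confirmed by the second camera:
print(is_weed(Detection("amaranth", 0.92), Detection("amaranth", 0.88)))  # True
# The main crop is never targeted for removal:
print(is_weed(Detection(CROP_CLASS, 0.97), Detection(CROP_CLASS, 0.95)))  # False
```

The second-camera check mirrors the confirmation step in the pipeline: a target is removed only when both classifications agree, which is one plausible way to keep crop damage low.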