Semi-structured greenhouse environments often feature repetitive and weak textures, which naturally pose challenges to high-precision vision-based positioning techniques. This paper proposes a precise visual positioning method for agricultural mobile robots in greenhouses that improves positioning accuracy by discriminatively minimizing fiducial marker reprojection errors. First, fiducial markers are used to enhance environmental features, and the marker-based visual positioning task is formulated as a Perspective-n-Point (PnP) problem. The projection constraints of the keypoints and the pose constraints of the coordinate systems provide the theoretical basis for robot positioning. Second, a reprojection error minimization approach is proposed that takes the markers' distance and image noise into account. Because far-away markers are more prone to observation errors than those close to the robot, the improved PnP algorithm with distance weighting achieves higher positioning accuracy. Synthetic and field experiments were carried out to evaluate the performance of the proposed method. Synthetic experiments show that the rotation error and translation error of the proposed method are less than 0.7° and 0.5%, respectively, within a range of 12 m. The mean absolute error and root mean square error of the field dynamic positioning experiments are 8.57 cm and 8.59 cm, respectively. The experimental results show that the proposed method significantly outperforms traditional methods in dealing with distance-related noise at keypoints.
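The core idea of a distance-weighted PnP refinement can be sketched as follows. This is a minimal illustration, not the paper's implementation: the inverse-depth weight `1/z` is an assumed weighting scheme standing in for whatever discriminative weights the authors derive, and `weighted_pnp` with its coarse initial guess `x0` is a hypothetical helper built on SciPy's generic least-squares solver.

```python
import numpy as np
from scipy.optimize import least_squares

def rodrigues(rvec):
    # Axis-angle vector -> rotation matrix (Rodrigues' formula).
    theta = np.linalg.norm(rvec)
    if theta < 1e-12:
        return np.eye(3)
    k = rvec / theta
    K = np.array([[0, -k[2], k[1]],
                  [k[2], 0, -k[0]],
                  [-k[1], k[0], 0]])
    return np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)

def project(points3d, rvec, tvec, K):
    # Pinhole projection of 3D world points into the image.
    Pc = (rodrigues(rvec) @ points3d.T).T + tvec
    uv = (K @ (Pc / Pc[:, 2:3]).T).T
    return uv[:, :2]

def weighted_pnp(points3d, points2d, K, x0):
    # Distance-weighted reprojection-error minimization: residuals of
    # far-away keypoints (large depth z) are down-weighted by 1/z,
    # reflecting their larger expected observation noise.
    # NOTE: the 1/z weight is an illustrative assumption.
    def residuals(x):
        rvec, tvec = x[:3], x[3:]
        Pc = (rodrigues(rvec) @ points3d.T).T + tvec
        w = 1.0 / np.clip(Pc[:, 2], 1e-6, None)
        r = project(points3d, rvec, tvec, K) - points2d
        return (w[:, None] * r).ravel()
    return least_squares(residuals, x0).x
```

In practice the coarse pose `x0` would come from a closed-form PnP solver (e.g. EPnP), with the weighted refinement above applied afterwards.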