Abstract To address the problem of incomplete and inaccurate environment perception during simultaneous localization and mapping (SLAM) for autonomous vehicles, this paper proposes a SLAM system that fuses LiDAR and a depth camera. A relatively inexpensive single-line LiDAR, a depth camera, and an odometer are mounted on the robot, a SLAM framework is designed, and the LiDAR localization data are used to assist visual map building in a loosely coupled fusion. Experiments demonstrate that, compared with single-sensor mapping, the method yields richer and more accurate environmental information in the map, improves the driverless vehicle's perception of its environment, and provides a better basis for path planning.
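The loosely coupled fusion mentioned above can be sketched as follows: the LiDAR pose estimate is taken as given and used to place the depth camera's points into the global map frame. The function names, 2D pose format, and point layout here are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def pose_to_matrix(x, y, yaw):
    """Build a 2D homogeneous transform (3x3) from a LiDAR pose (x, y, yaw)."""
    c, s = np.cos(yaw), np.sin(yaw)
    return np.array([[c, -s, x],
                     [s,  c, y],
                     [0,  0, 1.0]])

def fuse_scan(global_map, camera_points, lidar_pose):
    """Loosely coupled step: trust the LiDAR pose as-is and use it to
    transform the depth camera's (N, 2) points into the map frame."""
    T = pose_to_matrix(*lidar_pose)
    pts_h = np.hstack([camera_points, np.ones((len(camera_points), 1))])
    world = (T @ pts_h.T).T[:, :2]
    return np.vstack([global_map, world]) if len(global_map) else world

# Usage: robot at (1, 0) rotated 90 degrees sees a point 1 m ahead of it.
gmap = np.empty((0, 2))
gmap = fuse_scan(gmap, np.array([[1.0, 0.0]]), (1.0, 0.0, np.pi / 2))
```

In a loosely coupled design each sensor's estimator runs independently and only their outputs are combined, which keeps the system simple at the cost of ignoring cross-sensor error correlations.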