Image-guided surgical navigation systems are widely regarded as the benchmark for computer-assisted surgical robotic platforms, yet intraoperative image drift and mismatch remain a persistent challenge that can significantly degrade the accuracy and precision of surgical procedures. Further research and development are therefore needed to mitigate this issue and improve the overall performance of these platforms. Our primary objective is to improve the precision of image-guided puncture navigation by developing a navigation system based on computed tomography (CT) and structured light imaging (SLI). We also aim to quantify and visualize intraoperative image drift and mismatch in real time and to provide feedback to surgeons, so that surgical procedures can be executed with accuracy and reliability. A CT-SLI guided orthopedic navigation puncture system was developed. Polymer bandages are used to pressurize, plasticize, immobilize, and toughen the surface of a specimen for surgical operations. Preoperative CT images of the specimen are acquired, a 3D navigation map is reconstructed, and a puncture path is planned accordingly. During surgery, an SLI module captures and reconstructs the 3D surfaces of both the specimen and a guiding tube for the puncture needle. The SLI-reconstructed 3D surface of the specimen is matched to the CT navigation map via a two-step point cloud registration, while the SLI-reconstructed 3D surface of the guiding tube is fitted with a cylindrical model, which is in turn aligned with the planned puncture path. The system was tested and evaluated on 20 formalin-soaked lower-limb cadaver specimens preserved at a local hospital. The proposed method achieved RMS image registration errors of 0.576 ± 0.146 mm between the preoperative CT and intraoperative SLI surface models and 0.407 ± 0.234 mm between the preoperative and postoperative CT surface models. In addition, the specimen surface and skeletal drifts between the preoperative and postoperative scans were 0.033 ± 0.272 mm and 0.235 ± 0.197 mm, respectively. These results indicate that the proposed method is effective in reducing intraoperative image drift and mismatch. The system also visualizes intraoperative image drift and mismatch and provides real-time visual feedback to surgeons.
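The two-step registration between the SLI surface and the CT navigation map is not detailed above; purely as an illustration, the sketch below shows one common coarse-to-fine pipeline (FPFH-feature RANSAC alignment followed by point-to-plane ICP) using the Open3D library. The voxel size, distance thresholds, and RANSAC settings are assumptions for the sketch, not parameters taken from the described system.

```python
# Illustrative two-step (coarse-to-fine) point cloud registration sketch.
# Assumes Open3D >= 0.13; voxel size and thresholds are placeholders,
# not values from the paper.
import open3d as o3d


def preprocess(pcd, voxel):
    """Downsample, estimate normals, and compute FPFH features."""
    down = pcd.voxel_down_sample(voxel)
    down.estimate_normals(
        o3d.geometry.KDTreeSearchParamHybrid(radius=voxel * 2.0, max_nn=30))
    fpfh = o3d.pipelines.registration.compute_fpfh_feature(
        down, o3d.geometry.KDTreeSearchParamHybrid(radius=voxel * 5.0, max_nn=100))
    return down, fpfh


def register_sli_to_ct(sli_pcd, ct_pcd, voxel=1.0):
    """Coarse global alignment (RANSAC on FPFH matches) refined by point-to-plane ICP."""
    sli_down, sli_fpfh = preprocess(sli_pcd, voxel)
    ct_down, ct_fpfh = preprocess(ct_pcd, voxel)

    # Step 1: coarse registration from feature correspondences.
    coarse = o3d.pipelines.registration.registration_ransac_based_on_feature_matching(
        sli_down, ct_down, sli_fpfh, ct_fpfh, True,
        voxel * 1.5,
        o3d.pipelines.registration.TransformationEstimationPointToPoint(False),
        3,
        [o3d.pipelines.registration.CorrespondenceCheckerBasedOnDistance(voxel * 1.5)],
        o3d.pipelines.registration.RANSACConvergenceCriteria(100000, 0.999))

    # Step 2: fine registration with point-to-plane ICP, seeded by the coarse result.
    ct_pcd.estimate_normals(
        o3d.geometry.KDTreeSearchParamHybrid(radius=voxel * 2.0, max_nn=30))
    fine = o3d.pipelines.registration.registration_icp(
        sli_pcd, ct_pcd, voxel * 0.5, coarse.transformation,
        o3d.pipelines.registration.TransformationEstimationPointToPlane())
    return fine.transformation, fine.inlier_rmse
```

The returned inlier RMSE corresponds in spirit to the RMS registration errors reported above, although the actual evaluation protocol of the system may differ.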
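Similarly, the sketch below is a minimal stand-in for fitting a cylindrical model to the SLI-reconstructed guiding tube and comparing its axis to the planned puncture path. It assumes the tube point cloud is already segmented and uses a simple PCA-based axis estimate; the system's actual fitting and alignment procedure is not specified here.

```python
# Minimal sketch: PCA-based cylinder axis estimate for the guiding tube and its
# deviation from a planned puncture path. Inputs are illustrative assumptions.
import numpy as np


def fit_cylinder_axis(points):
    """Return (centroid, unit axis direction, radius) of a roughly cylindrical cloud.

    The dominant principal component approximates the cylinder axis; the mean
    radial distance of the points to that axis approximates the radius.
    """
    centroid = points.mean(axis=0)
    centered = points - centroid
    # Principal axis = right singular vector with the largest singular value.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    axis = vt[0] / np.linalg.norm(vt[0])
    # Radial component: remove the axial projection, keep the perpendicular part.
    radial = centered - np.outer(centered @ axis, axis)
    radius = np.linalg.norm(radial, axis=1).mean()
    return centroid, axis, radius


def path_deviation(centroid, axis, path_point, path_dir):
    """Angle (degrees) between tube axis and planned path, and lateral offset."""
    path_dir = path_dir / np.linalg.norm(path_dir)
    cos_a = np.clip(abs(axis @ path_dir), -1.0, 1.0)
    angle_deg = np.degrees(np.arccos(cos_a))
    # Perpendicular distance from the tube centroid to the planned path line.
    offset = np.linalg.norm(np.cross(centroid - path_point, path_dir))
    return angle_deg, offset
```

In use, the angular and lateral deviations could be displayed to the surgeon as the guiding tube is adjusted toward the planned path, which is one way such real-time visual feedback can be realized.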