Virtual surgical training is crucial for enhancing minimally invasive surgical skills. Traditional geometric reconstruction methods based on medical CT/MRI images often lack color information, which is typically supplied through pseudo-coloring or artistic rendering. To reconstruct both the geometric shape and the appearance of organs simultaneously, we propose a novel organ model reconstruction network called Endoscope-NeSRF. This network jointly leverages neural radiance fields and a Signed Distance Function (SDF) to reconstruct a textured geometric model of the organ of interest from multi-view photometric images acquired by an endoscope. Prior knowledge of the inverse correlation between the light-source-to-object distance and the radiance improves the physical fidelity of the reconstructed organ. A dilated mask further refines the appearance and geometry at the organ's edges. We also propose a highlight-adaptive optimization strategy to remove specular highlights caused by the light source during acquisition, preventing highlight-affected regions from being reconstructed as white. Finally, real-time realistic rendering of the organ model is achieved by combining inverse rendering with Bidirectional Reflectance Distribution Function (BRDF) rendering. Experimental results show that our method closely matches Instant-NGP in appearance reconstruction, outperforms other state-of-the-art methods, and achieves the best geometric reconstruction. Our method yields a detailed geometric model with a realistic appearance, providing a convincing visual experience for virtual surgical simulation, which is important for medical training.
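The abstract does not specify the exact form of the distance–radiance prior; one common choice is inverse-square attenuation from a point light co-located with the endoscope tip. The sketch below is a minimal illustration of that assumption (all function and variable names are hypothetical, not from the paper):

```python
import numpy as np

def attenuated_radiance(radiance, light_pos, surface_pts, eps=1e-6):
    """Scale predicted radiance by an inverse-square falloff from a point light.

    Hypothetical sketch of the distance-radiance prior: farther surface
    points receive less illumination from the endoscope light source.

    radiance:    (N, 3) RGB values predicted by the radiance field
    light_pos:   (3,)   position of the endoscope light source
    surface_pts: (N, 3) 3D points on the organ surface
    """
    # squared distance from the light source to each surface point
    d2 = np.sum((surface_pts - light_pos) ** 2, axis=-1, keepdims=True)
    return radiance / (d2 + eps)

# toy check: doubling the distance quarters the received radiance
r = np.ones((2, 3))
pts = np.array([[1.0, 0.0, 0.0], [2.0, 0.0, 0.0]])
out = attenuated_radiance(r, np.zeros(3), pts)
```

In a full pipeline this attenuation term would be folded into the volume-rendering loss so that the network learns albedo decoupled from lighting; the exact formulation in Endoscope-NeSRF may differ.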