Alongside significant advances in computing technology, augmented reality-based navigation is being actively investigated for clinical applications. In this light, we developed novel object tracking and depth realization technologies to apply augmented reality-based neuronavigation to brain surgery. We developed real-time inside-out tracking based on visual-inertial odometry and a visual-inertial simultaneous localization and mapping algorithm. A cube quick response (QR) marker and depth data obtained from a light detection and ranging (LiDAR) sensor are used for continuous tracking. For depth realization, order-independent transparency, clipping, and annotation and measurement functions were developed. In this study, the augmented reality model of a brain tumor patient was registered to a life-size three-dimensional (3D) printed model of the same patient. Using real-time inside-out tracking, we confirmed that the augmented reality model remained aligned with the 3D printed patient model without flutter, regardless of the movement of the visualization device. The coordination accuracy during real-time inside-out tracking was also validated: the average movement errors along the X and Y axes were 0.34 ± 0.21 mm and 0.04 ± 0.08 mm, respectively. Furthermore, applying order-independent transparency with multilayer alpha blending and filtered alpha compositing improved the perception of overlapping internal brain structures. The clipping, annotation, and measurement functions also aided depth perception and functioned reliably during real-time coordination. We named this system METAMEDIP navigation. The results validate the efficacy of the real-time inside-out tracking and depth realization technologies. With these novel technologies developed for continuous tracking and depth perception in augmented reality environments, we can overcome critical obstacles to the development of clinically applicable augmented reality neuronavigation.
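To illustrate the order-independent transparency approach mentioned above, the following is a minimal Python sketch of multilayer alpha blending: each pixel retains a small, fixed number of depth-sorted fragment layers, overflow layers are merged into the back-most layer, and the retained layers are composited front to back. This is not the authors' implementation (which in practice would run in GPU shaders); the layer count K, the Layer structure, and the helper names are illustrative assumptions.

```python
# Minimal sketch of multilayer alpha blending for order-independent
# transparency. Assumptions (not from the source): K retained layers per
# pixel, straight (non-premultiplied) alpha, depth sorted near -> far.
from dataclasses import dataclass

K = 4  # retained layers per pixel (assumed value)

@dataclass
class Layer:
    depth: float
    rgb: tuple    # (r, g, b) with straight alpha
    alpha: float

def insert_fragment(layers: list, frag: Layer) -> None:
    """Insert a fragment in depth order; merge the two farthest layers on overflow."""
    layers.append(frag)
    layers.sort(key=lambda l: l.depth)        # near -> far
    if len(layers) > K:
        far, extra = layers[K - 1], layers[K]  # back-most pair: far is in front of extra
        a = far.alpha + extra.alpha * (1.0 - far.alpha)
        rgb = tuple(
            (far.rgb[i] * far.alpha
             + extra.rgb[i] * extra.alpha * (1.0 - far.alpha)) / max(a, 1e-6)
            for i in range(3)
        )
        layers[K - 1] = Layer(far.depth, rgb, a)
        del layers[K]

def composite(layers: list, background=(0.0, 0.0, 0.0)) -> tuple:
    """Front-to-back 'over' compositing of the retained layers."""
    rgb, transmittance = [0.0, 0.0, 0.0], 1.0
    for l in layers:                           # already sorted near -> far
        for i in range(3):
            rgb[i] += transmittance * l.alpha * l.rgb[i]
        transmittance *= (1.0 - l.alpha)
    return tuple(rgb[i] + transmittance * background[i] for i in range(3))

# Usage: two overlapping translucent fragments blend correctly
# regardless of the order in which they arrive.
pixel: list = []
for frag in [Layer(0.6, (1, 0, 0), 0.5), Layer(0.2, (0, 0, 1), 0.4)]:
    insert_fragment(pixel, frag)
print(composite(pixel))  # nearer blue over red over a black background
```

Because fragments are sorted and merged as they arrive, the blended result is independent of submission order, which is what makes overlapping translucent brain structures render consistently.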