Second-order cone optimization (SOCO) problems arise in fields such as engineering, finance, and machine learning because they can model a wide range of convex optimization tasks efficiently. In this article, we propose an interior point algorithm for SOCO based on a new kernel function designed to improve optimization performance. Traditional interior point methods for SOCO often face challenges in scalability and efficiency, particularly on large-scale problems: the barrier functions they rely on can lead to slow convergence, especially for highly ill-conditioned or degenerate cones. To address these limitations, our approach introduces a new kernel function that exploits the geometric properties of second-order cones, thereby improving convergence behavior and computational efficiency.

The key innovation of our algorithm lies in the construction of this kernel function, which uses the structure of second-order cones to capture the curvature information of the optimization problem. By exploiting this curvature information, the algorithm navigates the optimization landscape more effectively, leading to faster convergence and greater robustness than traditional methods. The algorithm also includes techniques for handling common difficulties in SOCO, such as degenerate cones, and exploits problem structure to accelerate convergence. In addition, we incorporate warm-starting and adaptive step-size selection strategies to further improve efficiency in iterative optimization.

Finally, we show that the proposed interior point algorithm for second-order cone optimization, based on the new kernel function, achieves an iteration bound of O(N log(N/∊)) for large-update methods, where N denotes the number of second-order cones in the problem formulation and ∊ the desired accuracy. This iteration bound is currently the best known bound for such methods.
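
To make the role of the kernel function concrete, the following minimal sketch evaluates a kernel-based barrier over a collection of second-order cone blocks, using the spectral decomposition of each block with respect to its cone. Since this section does not give the explicit form of the proposed kernel, the classical logarithmic kernel ψ(t) = (t² − 1)/2 − log t is used only as a placeholder, and the names soc_eigenvalues, log_kernel, and barrier are illustrative rather than part of the paper's algorithm.

```python
import numpy as np

def soc_eigenvalues(x):
    """Spectral values of x with respect to the second-order cone
    K = {x in R^n : x[0] >= ||x[1:]||}: lambda_{1,2} = x[0] -/+ ||x[1:]||."""
    norm_tail = np.linalg.norm(x[1:])
    return x[0] - norm_tail, x[0] + norm_tail

def log_kernel(t):
    """Classical logarithmic kernel psi(t) = (t^2 - 1)/2 - log t,
    a stand-in for the paper's (unspecified here) new kernel function."""
    return 0.5 * (t**2 - 1.0) - np.log(t)

def barrier(v_blocks, kernel=log_kernel):
    """Kernel-based proximity value Psi(v): sum of kernel(lambda_1) + kernel(lambda_2)
    over all cone blocks of the scaled vector v. Returns +inf outside the cone interior."""
    total = 0.0
    for v in v_blocks:
        lam1, lam2 = soc_eigenvalues(v)
        if lam1 <= 0.0:  # v must lie in the interior of its cone
            return np.inf
        total += kernel(lam1) + kernel(lam2)
    return total

# Example with N = 2 cone blocks; Psi vanishes exactly at the central path (v = e).
e3 = np.array([1.0, 0.0, 0.0])            # identity element of the 3-dimensional cone
v = [e3, np.array([1.2, 0.3, -0.1])]
print(barrier(v))
```

In a large-update interior point method of this kind, such a proximity value is typically used both to decide when the current iterate is close enough to the central path after a barrier-parameter update and to define the search direction; the choice of kernel is what drives the resulting iteration bound.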