Abstract. In high-dimensional regression, clustering features according to their effects on outcomes is often as important as feature selection. For example, insurance premiums are set for each rate class pertaining to risk factors related to claim risk. To calculate reliable insurance premiums, it is often necessary to group the numerous rate classes into fewer classes. However, the number of possible ways to consolidate rate classes is vast, and it is computationally challenging to consider each combination individually. Under such circumstances, sparse regularization techniques for feature clustering are extremely useful as methods that automatically consolidate rate classes with no significant differences in risk level during the estimation process. For this purpose, clustered Lasso and the octagonal shrinkage and clustering algorithm for regression (OSCAR) can be used to generate feature groups automatically, using a pairwise $$L_1$$ norm and a pairwise $$L_\infty$$ norm, respectively. This paper proposes efficient path algorithms for clustered Lasso and OSCAR that construct solution paths with respect to their regularization parameters. Although exhaustive pairwise regularization involves an excessive number of terms, the computational cost is reduced by exploiting the symmetry of those terms. Simple equivalent conditions for checking the subgradient equations in each feature group are derived using graph theory. The proposed algorithms are shown to be more efficient than existing algorithms via numerical experiments.
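As a concrete illustration (not the paper's path algorithm itself), the two pairwise penalties mentioned in the abstract can be sketched in Python, using their standard definitions: clustered Lasso penalizes $$\sum_{i<j}|\beta_i-\beta_j|$$ and OSCAR penalizes $$\lambda_1\sum_j|\beta_j| + \lambda_2\sum_{i<j}\max(|\beta_i|,|\beta_j|)$$. The sorted-sum versions show the kind of symmetry the abstract alludes to: the $$O(p^2)$$ pairwise sums collapse to weighted sums of order statistics computable in $$O(p\log p)$$.

```python
from itertools import combinations

def clustered_lasso_penalty(beta, lam):
    """Naive pairwise L1 penalty: lam * sum_{i<j} |beta_i - beta_j| (O(p^2) terms)."""
    return lam * sum(abs(bi - bj) for bi, bj in combinations(beta, 2))

def clustered_lasso_penalty_sorted(beta, lam):
    """Same value via sorting: with beta sorted ascending, the pairwise sum
    equals sum_k (2k - p - 1) * beta_(k) for ranks k = 1..p."""
    s = sorted(beta)
    p = len(s)
    return lam * sum((2 * (k + 1) - p - 1) * v for k, v in enumerate(s))

def oscar_penalty(beta, lam1, lam2):
    """OSCAR penalty: lam1 * sum_j |beta_j| + lam2 * sum_{i<j} max(|beta_i|, |beta_j|).
    Sorting |beta| ascending reduces the pairwise-max part to
    sum_k (k - 1) * |beta|_(k), since the k-th smallest value is the max
    in exactly its pairs with the k-1 smaller values."""
    a = sorted(abs(b) for b in beta)
    return lam1 * sum(a) + lam2 * sum(k * v for k, v in enumerate(a))
```

For example, for beta = [1, 3, 2] the pairwise differences are 2, 1, and 1, so the clustered Lasso penalty with lam = 1 is 4, and both implementations agree.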