In high-dimensional linear regression, it is commonly assumed that the regression coefficients β ∈ R^p are sparse. This assumption may not hold, however, when most, if not all, of the regression coefficients are nonzero, and statistical methods designed for sparse models can then incur substantial estimation bias. This article therefore proposes a novel Bayesian Grouping-Gibbs Sampling (BGGS) method that departs from the usual sparsity assumption in high-dimensional problems. BGGS relies on a grouping strategy that partitions β into distinct groups, enabling rapid sampling in high-dimensional space. The number of groups k can be determined from an ‘Elbow plot’, a procedure that is computationally efficient and robust to the initial value. Under mild regularity conditions, theoretical analysis guarantees consistency of model selection and parameter estimation and provides a bound on the prediction error. Three finite-sample simulation studies are conducted to assess the competitive advantages of the proposed method in terms of parameter estimation and prediction accuracy. Finally, BGGS is applied to a financial dataset to demonstrate its practical utility.
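To make the grouping idea concrete, below is a minimal sketch of a generic blocked (grouped) Gibbs sampler for a dense Bayesian linear model: the coefficient vector is split into k blocks and each block is drawn from its Gaussian full conditional given the other blocks. The Gaussian prior with fixed variances, the equal-size blocks, and the name `blocked_gibbs` are illustrative assumptions, not the authors' exact BGGS algorithm or prior structure.

```python
import numpy as np

def blocked_gibbs(X, y, groups, sigma2=1.0, tau2=1.0, n_iter=1000, seed=0):
    """Blocked (grouped) Gibbs sampler for Bayesian linear regression.

    Illustrative sketch only: coefficients are partitioned into `groups`
    (a list of index arrays) and each block is drawn from its Gaussian
    full conditional given the remaining blocks. The noise variance
    `sigma2` and prior variance `tau2` are fixed here for simplicity;
    the BGGS method in the paper may treat them differently.
    """
    rng = np.random.default_rng(seed)
    n, p = X.shape
    beta = np.zeros(p)
    draws = np.empty((n_iter, p))
    for it in range(n_iter):
        for idx in groups:
            Xg = X[:, idx]
            # Residual after removing the contribution of all other groups.
            r = y - X @ beta + Xg @ beta[idx]
            # Gaussian full conditional for this block of coefficients.
            prec = Xg.T @ Xg / sigma2 + np.eye(len(idx)) / tau2
            cov = np.linalg.inv(prec)
            mean = cov @ (Xg.T @ r) / sigma2
            beta[idx] = rng.multivariate_normal(mean, cov)
        draws[it] = beta
    return draws

# Toy usage: p = 200 dense (non-sparse) coefficients split into k = 5 blocks.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    n, p, k = 100, 200, 5
    X = rng.standard_normal((n, p))
    beta_true = rng.normal(1.0, 0.2, size=p)   # all coefficients nonzero
    y = X @ beta_true + rng.standard_normal(n)
    groups = np.array_split(np.arange(p), k)
    draws = blocked_gibbs(X, y, groups, n_iter=500)
    print("posterior-mean error:", np.linalg.norm(draws[250:].mean(0) - beta_true))
```

Updating k small blocks rather than all p coordinates one at a time is what makes grouped sampling attractive in high dimensions; an elbow plot of a fit criterion against candidate values of k is one plausible way to pick the number of blocks in this sketch.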