ABSTRACT

Penalised likelihood methods have proved successful in analysing high-dimensional data. Tang and Leng [(2010), 'Penalized High-Dimensional Empirical Likelihood', Biometrika, 97(4), 905–920] extended the penalisation approach to the empirical likelihood setting and showed that the penalised empirical likelihood estimator can identify the true predictors consistently in linear regression models. However, this desired selection consistency of penalised empirical likelihood relies heavily on the choice of the tuning parameter. In this work, we propose a tuning parameter selection procedure for penalised empirical likelihood that guarantees selection consistency. Specifically, we propose a generalised information criterion (GIC) for penalised empirical likelihood in the linear regression case. We show that the tuning parameter selected by the GIC identifies the true model consistently even when the number of predictors diverges to infinity with the sample size. We demonstrate the performance of our procedure through numerical simulations and a real data analysis.
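To illustrate the general idea of GIC-based tuning parameter selection, the sketch below selects the penalty level for an ordinary lasso fit (penalised least squares, used here as a simple stand-in for the penalised empirical likelihood machinery of the paper): fit the model over a grid of tuning parameters, score each fit by a GIC of the form n·log(RSS/n) + a_n·df, and keep the minimiser. The helper names (`lasso_cd`, `gic`) and the particular diverging rate a_n = log(log n)·log(p) are illustrative assumptions, not the paper's exact criterion.

```python
import numpy as np

def soft_threshold(z, t):
    # Soft-thresholding operator used in lasso coordinate descent.
    return np.sign(z) * max(abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    # Coordinate descent for (1/(2n))||y - Xb||^2 + lam * ||b||_1.
    n, p = X.shape
    beta = np.zeros(p)
    col_norm2 = (X ** 2).sum(axis=0) / n
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual excluding coordinate j.
            r = y - X @ beta + X[:, j] * beta[j]
            rho = X[:, j] @ r / n
            beta[j] = soft_threshold(rho, lam) / col_norm2[j]
    return beta

def gic(X, y, beta, a_n):
    # Generic GIC: goodness of fit plus a_n times model size.
    n = X.shape[0]
    rss = np.sum((y - X @ beta) ** 2)
    df = np.count_nonzero(beta)
    return n * np.log(rss / n) + a_n * df

rng = np.random.default_rng(0)
n, p = 200, 10
X = rng.standard_normal((n, p))
true_beta = np.zeros(p)
true_beta[:3] = [2.0, -1.5, 1.0]          # true model: first 3 predictors
y = X @ true_beta + rng.standard_normal(n)

# A diverging penalty rate; one common choice in the GIC literature.
a_n = np.log(np.log(n)) * np.log(p)
grid = np.logspace(-3, 0, 25)
fits = {lam: lasso_cd(X, y, lam) for lam in grid}
best = min(grid, key=lambda lam: gic(X, y, fits[lam], a_n))
support = np.flatnonzero(fits[best])
print("selected lambda:", best)
print("selected support:", support)
```

With a strong signal and moderate n, the GIC-selected support typically recovers the true predictors; the consistency result summarised in the abstract says that, for the paper's empirical likelihood criterion, this recovery holds even as p grows with n.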