In machine learning it is often necessary to assume or know the distribution of the data, yet this is difficult to do in practical applications. To address this problem, we propose a novel distribution-free Bayesian regularized learning framework for semi-supervised learning, called the Hessian regularized twin minimax probability extreme learning machine (HRTMPELM). In this framework, we construct two non-parallel hyperplanes by introducing a high-separation-probability assumption, such that each hyperplane separates the samples of one class with maximum probability while staying far from the samples of the other class. Furthermore, the framework exploits the intrinsic geometric distribution of the samples through a Hessian regularization term, which allows it to construct reliable semi-supervised classifiers. In addition, the proposed framework controls the misclassification error by minimizing an upper bound on the worst-case misclassification probability, and it improves generalization by introducing regularization to avoid ill-posedness and overfitting. More importantly, the framework has no hyperparameters, which greatly simplifies the learning process and makes it efficient. Finally, a simple and reliable algorithm that attains globally optimal solutions via multivariate Chebyshev inequalities is designed to solve the proposed framework. Experiments on multiple datasets demonstrate the reliability and effectiveness of the proposed framework compared with other methods. In particular, we apply the framework to Ningxia wolfberry quality detection, which greatly enriches and facilitates the application of machine learning algorithms in agriculture.
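As a brief illustration of the worst-case guarantee referenced above (a sketch in generic notation, not the paper's own symbols: $w$, $b$, $\mu$, $\Sigma$, and $\alpha$ are placeholders), the multivariate Chebyshev inequality implies that for a random sample $x$ with known mean $\mu$ and covariance $\Sigma$ but otherwise arbitrary distribution,
\[
\inf_{x \sim (\mu,\Sigma)} \Pr\{\, w^{\top}x \ge b \,\} \;\ge\; \alpha
\quad\Longleftrightarrow\quad
w^{\top}\mu - b \;\ge\; \kappa(\alpha)\,\sqrt{w^{\top}\Sigma\, w},
\qquad
\kappa(\alpha) = \sqrt{\tfrac{\alpha}{1-\alpha}},
\]
so each hyperplane $w^{\top}x = b$ can be required to separate one class with probability at least $\alpha$ under the worst-case distribution consistent with the first two moments.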