Feature selection, a common and important problem in current scientific research, is both a crucial data preprocessing technique and a combinatorial optimization task. It aims to select a subset of informative and relevant features from the original feature set. Consequently, improving performance on a classification task requires processing the original data with a feature selection strategy before the learning process. Particle swarm optimization, a metaheuristic algorithm that limits the growth of computational complexity, can solve the feature selection problem quickly and with satisfactory classification accuracy, since it includes strategies for escaping local optima. The literature describes arbitrary trial-and-error approaches, each treated separately, for determining the critical parameters of binary particle swarm optimization, namely the inertia weight, the transfer function, the threshold value, and the swarm size, all of which directly affect the performance of the algorithm in feature selection. Unlike these approaches, this paper obtains statistically grounded findings by evaluating all binary particle swarm optimization parameters together using a factorial design approach. The results show that the threshold value and the transfer function have a statistically significant effect on the performance of the binary particle swarm optimization algorithm.
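To make the role of the four parameters concrete, the following is a minimal sketch (not the paper's implementation) of binary particle swarm optimization for feature selection, assuming a sigmoid transfer function and a fixed-threshold binarization rule; the fitness function, dataset, and all parameter values are illustrative placeholders.

```python
# Minimal BPSO sketch for feature selection: illustrates where the inertia
# weight, transfer function, threshold, and swarm size enter the algorithm.
# This is an assumed, simplified variant, not the paper's exact method.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(v):
    """S-shaped transfer function mapping a velocity to a probability."""
    return 1.0 / (1.0 + np.exp(-v))

def bpso_feature_selection(fitness, n_features, swarm_size=30, iters=50,
                           w=0.9, c1=2.0, c2=2.0, threshold=0.5):
    # Binary positions: 1 = feature selected, 0 = feature discarded.
    pos = rng.integers(0, 2, size=(swarm_size, n_features))
    vel = rng.uniform(-1, 1, size=(swarm_size, n_features))

    pbest = pos.copy()
    pbest_fit = np.array([fitness(p) for p in pos])
    gbest = pbest[np.argmax(pbest_fit)].copy()

    for _ in range(iters):
        r1 = rng.random((swarm_size, n_features))
        r2 = rng.random((swarm_size, n_features))
        # Velocity update: the inertia weight w scales the previous velocity.
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        # Transfer function plus threshold turn continuous velocities into bits
        # (other BPSO variants compare against a random number instead).
        pos = (sigmoid(vel) > threshold).astype(int)

        fit = np.array([fitness(p) for p in pos])
        improved = fit > pbest_fit
        pbest[improved] = pos[improved]
        pbest_fit[improved] = fit[improved]
        gbest = pbest[np.argmax(pbest_fit)].copy()

    return gbest

if __name__ == "__main__":
    # Toy fitness: reward subsets close to a known target selection pattern.
    target = np.array([1] * 5 + [0] * 5)
    best = bpso_feature_selection(lambda p: -np.sum(np.abs(p - target)), 10)
    print(best)
```

In a real feature selection setting, the placeholder fitness would be replaced by the classification accuracy of a learner trained on the selected feature subset, which is the quantity the parameter study in this paper evaluates.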