Collapsing response categories in attitudinal measurements of learning and instruction is an option available to physics education research (PER) scholars when interpreting participants' categorical responses. Psychometric evaluations of different numbers of categories, however, have yielded inconclusive results to date. Item response theory (IRT) offers an item-level psychometric framework, which this study applies to explore the psychometric properties of five different numbers of response categories. For each of the five category numbers, one thousand artificial responses were generated based on the underlying marginal distributions and the inter-item correlation matrix. Statistical parameters [item estimates and test information functions (TIFs)] from the Graded Response Model (GRM) were employed to describe the psychometric behavior of the studied category numbers. Our findings demonstrate that the discrimination indices remained constant across the five category numbers, and global fit indices provided no information that distinguished them. Based on the item locations and TIFs, our results confirm previous work showing that a greater number of categories yields more information. These findings offer a key recommendation for attitudinal measurement in the PER community and beyond.
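The data-generation step described above can be sketched in Python. This is a minimal illustration, not the authors' actual procedure: it assumes a single item with a fixed discrimination `a` and ordered thresholds `b`, draws latent traits from a standard normal, and samples graded responses under the GRM. All parameter values and function names are hypothetical.

```python
import numpy as np

def grm_category_probs(theta, a, b):
    """Category response probabilities under the Graded Response Model.

    theta : (N,) latent trait values
    a     : item discrimination (scalar, assumed value)
    b     : (K-1,) ordered category thresholds (assumed values)
    Returns an (N, K) matrix of category probabilities.
    """
    # Cumulative probabilities P(X >= k) for k = 1..K-1 (2PL boundary curves)
    cum = 1.0 / (1.0 + np.exp(-a * (theta[:, None] - b[None, :])))  # (N, K-1)
    # Pad with P(X >= 0) = 1 and P(X >= K) = 0, then take differences
    upper = np.hstack([np.ones((len(theta), 1)), cum])
    lower = np.hstack([cum, np.zeros((len(theta), 1))])
    return upper - lower  # rows sum to 1

def simulate_grm(n=1000, a=1.5, b=(-1.0, 0.0, 1.0, 2.0), seed=0):
    """Draw n graded responses (categories 0..K-1) for one hypothetical item."""
    rng = np.random.default_rng(seed)
    theta = rng.standard_normal(n)                     # latent traits
    probs = grm_category_probs(theta, a, np.asarray(b))
    # Inverse-CDF sampling: category = count of cumulative probs below u
    cum = probs.cumsum(axis=1)
    u = rng.random((n, 1))
    return (u > cum).sum(axis=1)
```

Varying the length of `b` (and hence the number of categories K) produces the kind of parallel data sets the study compares; a GRM would then be fit to each set to obtain item estimates and TIFs.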