We propose a definition of granular count for uncertain data modeled through possibility distributions. We show that the resulting counts are fuzzy intervals over the domain of natural numbers. Based on this result, we devise two granular counting algorithms: an exact algorithm with quadratic-time complexity and an approximate algorithm with linear-time complexity. We compare the two algorithms on synthetic data and demonstrate their application to a bioinformatics scenario concerning the assessment of gene expression in cells.
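As a minimal illustration of how counting under possibilistic uncertainty yields a fuzzy interval over natural numbers, the sketch below uses the classical Dubois–Prade fuzzy cardinality as a stand-in construction; it is not the paper's own definition or either of its algorithms, and the function name `fuzzy_count` and the example possibility degrees are purely illustrative:

```python
def fuzzy_count(poss):
    """Membership degree of each candidate count k in {0, ..., n},
    given per-item possibility degrees that the item belongs to the
    counted class (classical fuzzy cardinality construction)."""
    n = len(poss)
    # sigma[k] = k-th largest possibility, with sigma[0] = 1, sigma[n+1] = 0
    sigma = [1.0] + sorted(poss, reverse=True) + [0.0]
    return [min(sigma[k], 1.0 - sigma[k + 1]) for k in range(n + 1)]

# Three observations with possibility degrees of belonging to the class:
mu = fuzzy_count([1.0, 0.7, 0.3])
# mu[k] is the membership of count k; the result is convex
# (interval-shaped), peaking here at k = 2.
```

The resulting membership function is unimodal over {0, ..., n}, which is the fuzzy-interval shape the abstract refers to; computing it this way is dominated by the sort, while per-k evaluation is linear.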