To achieve a low error rate in NAND flash memory, read reference voltages should be updated based on accurate knowledge of program/erase (P/E) cycles and retention time, because both severely distort the threshold voltage distribution of memory cells. Owing to its sensitivity to temperature, however, a flash memory controller cannot acquire exact knowledge of the retention time, which makes it challenging to estimate accurate read reference voltages in practice. In this article, we propose a novel machine-learning-based read reference voltage estimation framework for NAND flash memory systems that requires no knowledge of the retention time. To establish the unknown input-output relation of the estimation model, we derive input features by sensing and decoding memory cells in the minimum read unit. To define the relation between the unlabeled input features and a pre-assigned class label, namely the label read reference voltages, we propose three mapping functions: 1) $k$-nearest-neighbors-based, 2) nearest-centroid-based, and 3) polynomial-regression-based read reference voltage estimators. For the proposed estimation schemes, we show that the storage overhead and computational complexity are increasing functions of the exploited feature dimension. Accordingly, we propose a feature selection (or dimension reduction) algorithm that selects the minimum dimension and the corresponding features, reducing the overhead and complexity while maintaining high estimation accuracy. Based on extensive numerical analysis, we validate that the derived features successfully substitute for the unknown retention time and that the proposed feature selection algorithm precisely controls the trade-off between overhead/complexity and estimation accuracy. Furthermore, the simulation and analysis results show that the proposed framework not only outperforms conventional estimation schemes but also achieves a near-optimal frame error rate while sustaining low read latency.
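To make the mapping concrete, the sketch below shows a minimal $k$-nearest-neighbors-based read reference voltage estimator of the kind described above, under the assumption that the input features are statistics (e.g., error counts) obtained by sensing and decoding the minimum read unit and that the labels are read reference voltages. The function names, feature layout, and synthetic data are illustrative, not the paper's implementation.

```python
# Illustrative sketch (not the paper's implementation): a k-nearest-neighbors
# read reference voltage estimator. Offline, feature vectors (e.g., error-count
# statistics from sensing/decoding the minimum read unit) are stored together
# with their label read reference voltages. Online, the estimator averages the
# labels of the k nearest stored feature vectors.
import numpy as np

def knn_estimate(features, labels, query, k=5):
    """Estimate a read reference voltage for a query feature vector.

    features : (N, d) array of stored feature vectors
    labels   : (N,)   array of label read reference voltages
    query    : (d,)   feature vector derived from the current read
    """
    # Euclidean distance from the query to every stored feature vector.
    dists = np.linalg.norm(features - query, axis=1)
    # Indices of the k closest stored samples.
    nearest = np.argsort(dists)[:k]
    # Average their label voltages as the estimate.
    return labels[nearest].mean()

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Hypothetical training set: 2-D features mapped to label read
    # reference voltages (arbitrary units), purely synthetic.
    X = rng.uniform(0.0, 1.0, size=(200, 2))
    y = 2.5 - 0.8 * X[:, 0] + 0.3 * X[:, 1]
    x_query = np.array([0.4, 0.7])
    print(f"Estimated read reference voltage: {knn_estimate(X, y, x_query):.3f}")
```

In this sketch, the stored feature/label pairs play the role of the training set, so the per-read cost grows with both the number of stored samples and the feature dimension $d$; this is the overhead/complexity trade-off that the feature selection (dimension reduction) step is meant to control.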