Product quantization (PQ) is a powerful technique for approximate nearest neighbor (ANN) search. In this paper, to improve the accuracy of ANN search, we propose a new PQ-based method named product quantization with dual codebooks (DCPQ). Unlike traditional PQ-based methods, we analyze the quantization errors after learning the first PQ codebook, and the training vectors with larger quantization errors are selected to learn a second PQ codebook. When encoding the database offline, every database vector is first quantized with both codebooks in each subspace, and its encoding mode is determined by comparing the two quantization errors obtained from the dual codebooks. Moreover, database vectors sharing the same encoding mode are grouped into a sub-database, which can be searched more efficiently. Experimental results demonstrate that the proposed dual-codebook solution achieves higher accuracy than standard PQ and its variants.
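
The sketch below illustrates, under assumptions, the training and encoding pipeline described above: k-means codebook learning per subspace, selection of the fraction of training vectors with the largest quantization errors to relearn a second codebook, and a per-subspace choice between the two codebooks based on which gives the smaller error. The function names (`train_dcpq`, `encode_dcpq`), the selection ratio, and the per-subspace (rather than per-vector) mode decision are hypothetical details not specified in the abstract; this is not the authors' reference implementation.

```python
# Minimal NumPy / scikit-learn sketch of the dual-codebook idea (assumed details).
import numpy as np
from sklearn.cluster import KMeans

def train_dcpq(X, M=8, K=256, hard_ratio=0.3, seed=0):
    """Learn dual PQ codebooks. X is (n, d); each subspace has d // M dimensions."""
    n, d = X.shape
    ds = d // M
    cb1, cb2 = [], []
    for m in range(M):
        sub = X[:, m * ds:(m + 1) * ds]
        km1 = KMeans(n_clusters=K, n_init=4, random_state=seed).fit(sub)
        cb1.append(km1.cluster_centers_)
        # Quantization error of each training vector under the first codebook.
        err = np.min(((sub[:, None, :] - cb1[m][None, :, :]) ** 2).sum(-1), axis=1)
        # Relearn a second codebook from the vectors with the largest errors
        # (hard_ratio is an assumed hyperparameter).
        hard = sub[np.argsort(err)[-int(hard_ratio * n):]]
        km2 = KMeans(n_clusters=K, n_init=4, random_state=seed).fit(hard)
        cb2.append(km2.cluster_centers_)
    return cb1, cb2

def encode_dcpq(X, cb1, cb2):
    """Encode with both codebooks; keep the lower-error code and record the mode."""
    n, d = X.shape
    M = len(cb1)
    ds = d // M
    codes = np.empty((n, M), dtype=np.int32)
    modes = np.empty((n, M), dtype=np.int8)  # 0 -> first codebook, 1 -> second
    for m in range(M):
        sub = X[:, m * ds:(m + 1) * ds]
        d1 = ((sub[:, None, :] - cb1[m][None, :, :]) ** 2).sum(-1)
        d2 = ((sub[:, None, :] - cb2[m][None, :, :]) ** 2).sum(-1)
        i1, i2 = d1.argmin(1), d2.argmin(1)
        use2 = d2[np.arange(n), i2] < d1[np.arange(n), i1]
        codes[:, m] = np.where(use2, i2, i1)
        modes[:, m] = use2.astype(np.int8)
    return codes, modes
```

Grouping database vectors by identical `modes` patterns would then yield the sub-databases mentioned above, so that at query time each group can be scanned with the appropriate distance lookup tables.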