This paper describes an efficient method of designing and implementing complex tapped delay lines (CTDLs) with picosecond and sub-picosecond resolution in FPGA devices. Higher resolution and better linearity are achieved through the appropriate selection of the single time-coding tapped delay lines (TDLs) that form the CTDL. The proposed TDL selection algorithm significantly reduces the amount of the device's logic resources required to implement a CTDL with assumed parameters and provides a suitable selection scenario. Ultimately, the presented solution allows CTDLs with different user-defined configurations to be created from a fixed set of available logic resources, which makes it particularly well suited to prototyping in smaller FPGA devices. In this work, we investigate how the order of line selection influences the resolution gain of the resulting multiple time-coding lines. Furthermore, we determine the relation between the equivalent resolution and the number of TDLs involved. The obtained results allow us to estimate the upper limit of the resolution achievable in a given technology. In addition, the ranges of resolutions attainable with a fixed number of lines are examined. The presented experiments were performed on a Kintex UltraScale FPGA chip, manufactured by Xilinx in a 20-nm CMOS process.
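The notion of equivalent resolution used above can be illustrated with a minimal sketch: merging the code-transition edges of several mutually offset TDLs yields an equivalent line whose mean bin width is finer than that of any single line. The helper `equivalent_resolution` and all edge positions below are hypothetical illustrations, not data or an algorithm from this paper.

```python
# Hedged sketch: estimating the equivalent resolution of a CTDL formed by
# merging the transition-edge positions of several TDLs. All values are
# hypothetical and chosen only to illustrate the concept.

def equivalent_resolution(tdl_edges):
    """Merge per-TDL transition-edge positions (in ps) and return the
    mean bin width (ps) of the combined, equivalent delay line."""
    merged = sorted(set(t for edges in tdl_edges for t in edges))
    widths = [b - a for a, b in zip(merged, merged[1:])]
    return sum(widths) / len(widths)

# Three hypothetical TDLs with ~10 ps raw resolution, mutually offset:
tdl_a = [i * 10.0 for i in range(11)]        # edges at 0, 10, ..., 100 ps
tdl_b = [3.0 + i * 10.0 for i in range(10)]  # same pitch, offset by 3 ps
tdl_c = [7.0 + i * 10.0 for i in range(10)]  # same pitch, offset by 7 ps

q_single = equivalent_resolution([tdl_a])                # 10 ps per bin
q_ctdl = equivalent_resolution([tdl_a, tdl_b, tdl_c])    # finer than q_single
```

In this toy model the equivalent resolution improves roughly in proportion to the number of merged lines, provided their edge offsets are well spread; how close a real CTDL comes to that ideal is exactly what the line-selection strategy studied in the paper governs.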