Because their decoders can be implemented in relatively simple ways, turbo codes are being applied in many areas of engineering. Wireless communication is one of the most important applications: it involves many types of data services, and the information rate and data capacity must be varied by adopting different parity-bit puncturing rates to achieve highly efficient transmission. Such applications require real-time adaptation to various turbo-coding specifications as well as good bit-error-rate (BER) performance. We present a new concept for simplifying metric computation and programmable circuit configurations that focuses on the convolutional decoder, which occupies a significant portion of the allocated hardware, and we adopt a simplified log-domain version of the maximum a posteriori (MAP) algorithm, usually known as the Max-Log-MAP (MLM), as the base algorithm. The sliding window method provides an attractive way of computing metric values for the Max-Log-MAP; however, it degrades performance, especially when a high-rate code generated by bit puncturing is used. We propose a new means of avoiding this problem and show that the sliding window method, a modified version of it, and other methods should be selected flexibly, depending on the turbo specifications, to make the best use of hardware resources. We demonstrate that our proposed methods provide almost the same BER performance as the MLM even when a high puncturing rate of 5/6 is applied. Finally, we describe a new hardware architecture that copes with these programmability issues. We show that a turbo-decoding circuit can be implemented with a processor core and an associated unit configured as an LSI macro circuit; the proposed unit has about 60-K gates. We demonstrate that this architecture supports an information-bit throughput of about 1.5 Mbps for turbo decoding with a 200-MHz clock, which is achievable with today's advanced CMOS technologies.
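As background (not part of the abstract itself), the simplification that distinguishes the Max-Log-MAP from the full log-domain MAP is the replacement of the exact log-domain addition (the Jacobian logarithm) by a plain maximum. The minimal C sketch below illustrates the two operations; the function names are hypothetical and are not taken from the paper.

#include <math.h>

/* Exact log-domain addition (Jacobian logarithm) used by the Log-MAP:
 *   ln(e^a + e^b) = max(a, b) + ln(1 + e^(-|a - b|))
 */
static double log_map_add(double a, double b)      /* hypothetical helper */
{
    return fmax(a, b) + log1p(exp(-fabs(a - b)));
}

/* Max-Log-MAP (MLM) approximation: drop the correction term and keep only
 * the maximum, removing the exponential (or table lookup) from every
 * branch- and state-metric update.
 */
static double max_log_map_add(double a, double b)  /* hypothetical helper */
{
    return fmax(a, b);
}

Dropping the correction term is what makes the metric computation simple enough for compact, programmable hardware, at the cost of a small BER loss relative to the full log-domain MAP.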