Abstract

The cyclic convolution algorithms of Chapter 6 are efficient for special small block lengths, but as the block length increases, other methods are required. First, as discussed in Chapter 6, these algorithms keep the number of required multiplications small but can require many additions. Also, each size requires a different algorithm; there is no uniform structure that can be called upon repeatedly. In this chapter, a technique similar to the Good-Thomas PFA is developed to decompose a large cyclic convolution into several small cyclic convolutions, which in turn can be evaluated using the Winograd cyclic convolution algorithm. These ideas were introduced by Agarwal and Cooley [1] in 1977. As in the Good-Thomas PFA, the CRT is used to define an indexing of the data. This indexing changes a one-dimensional cyclic convolution into a two-dimensional cyclic convolution. We will see how to compute a two-dimensional cyclic convolution by 'nesting' a fast algorithm for one-dimensional cyclic convolution inside another fast algorithm for one-dimensional cyclic convolution. There are several two-dimensional cyclic convolution algorithms which, although important, will not be discussed here; they can be found in [2].

Keywords: Discrete Fourier Transform, Fast Algorithm, Permutation Matrix, Circulant Matrix, Convolution Theorem
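
As a rough illustration of the CRT indexing mentioned above, the following Python sketch (my own example, not the chapter's notation or code) shows that for N = N1*N2 with gcd(N1, N2) = 1, the map n -> (n mod N1, n mod N2) carries a length-N cyclic convolution onto an N1 x N2 two-dimensional cyclic convolution. The function names and the specific sizes are illustrative assumptions.

import numpy as np

def cyclic_conv_1d(x, y):
    # Direct length-N cyclic convolution: z[n] = sum_k x[k] * y[(n-k) mod N].
    N = len(x)
    z = np.zeros(N)
    for n in range(N):
        for k in range(N):
            z[n] += x[k] * y[(n - k) % N]
    return z

def cyclic_conv_2d(X, Y):
    # Direct N1 x N2 two-dimensional cyclic convolution (cyclic in each index).
    N1, N2 = X.shape
    Z = np.zeros((N1, N2))
    for n1 in range(N1):
        for n2 in range(N2):
            for k1 in range(N1):
                for k2 in range(N2):
                    Z[n1, n2] += X[k1, k2] * Y[(n1 - k1) % N1, (n2 - k2) % N2]
    return Z

def crt_map(x, N1, N2):
    # Reindex a length N1*N2 vector onto an N1 x N2 array via n -> (n mod N1, n mod N2).
    # This map is a bijection (indeed a ring isomorphism Z_N = Z_N1 x Z_N2) when gcd(N1, N2) = 1.
    X = np.zeros((N1, N2))
    for n in range(N1 * N2):
        X[n % N1, n % N2] = x[n]
    return X

# Check the equivalence for N = 3 * 5 = 15 (an assumed example size).
N1, N2 = 3, 5
rng = np.random.default_rng(0)
x = rng.standard_normal(N1 * N2)
y = rng.standard_normal(N1 * N2)

z = cyclic_conv_1d(x, y)                                     # one-dimensional result
Z = cyclic_conv_2d(crt_map(x, N1, N2), crt_map(y, N1, N2))   # two-dimensional result

# The 1-D result, reindexed the same way, matches the 2-D cyclic convolution.
assert np.allclose(crt_map(z, N1, N2), Z)
print("Length-15 cyclic convolution agrees with the 3 x 5 two-dimensional one.")

In the chapter itself, the resulting small cyclic convolutions along each dimension are then evaluated with fast one-dimensional algorithms nested inside one another, rather than by the direct sums used in this check.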
