Abstract

Two new fast gradient algorithms are presented that perform two-dimensional block adaptive filtering using convergence factors optimized in the least-squares sense. These two algorithms are termed the two-dimensional optimum block algorithm with individual adaptation of parameters (TDOBAI) and the two-dimensional optimum block adaptive algorithm (TDOBA). Using computer simulation, the convergence properties of the TDOBAI and TDOBA algorithms are investigated and compared with those of the two-dimensional block least-mean-square (TDBLMS) algorithm, which uses a convergence factor that is constant for every 2D coefficient at each block iteration. It is also shown that, for the TDOBAI and TDOBA algorithms, the convergence speed and accuracy of adaptation are greatly improved at the expense of a modest increase in computational complexity, as compared to the TDBLMS algorithm. The effectiveness of the algorithms is demonstrated in 2D system modeling, restoration (2D additive noise cancellation) and enhancement of artificially degraded images.
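To make the contrast concrete, the following is a minimal sketch of 2D block adaptive filtering for system identification. It is an illustration of the general idea only, not the authors' exact derivation: with `mu=None` the per-block step size is chosen by a one-dimensional least-squares line search along the block gradient (the spirit of the TDOBA-style optimized convergence factor), while passing a fixed `mu` reproduces the constant-step TDBLMS-style update. All function and variable names here are hypothetical.

```python
import numpy as np

def block_adapt_2d(x, d, kernel=(3, 3), block=(8, 8), mu=None, n_passes=20):
    """2D block adaptive FIR identification (illustrative sketch).

    x : 2D input image, d : desired 2D output of the unknown system.
    If mu is None, the step is optimized per block in the least-squares
    sense (TDOBA-like); otherwise a constant mu is used (TDBLMS-like).
    """
    kh, kw = kernel
    bh, bw = block
    H, Wc = d.shape
    W = np.zeros(kernel)                       # adaptive 2D coefficients
    # pad top/left so each output sample sees current and past inputs
    xp = np.pad(x, ((kh - 1, 0), (kw - 1, 0)))
    for _ in range(n_passes):
        for r in range(0, H - bh + 1, bh):
            for c in range(0, Wc - bw + 1, bw):
                G = np.zeros_like(W)           # accumulated block gradient
                err = np.empty((bh, bw))
                patches = np.empty((bh, bw, kh, kw))
                for i in range(bh):
                    for j in range(bw):
                        p = xp[r + i:r + i + kh, c + j:c + j + kw]
                        patches[i, j] = p
                        err[i, j] = d[r + i, c + j] - np.sum(W * p)
                        G += err[i, j] * p     # LMS-style gradient term
                if mu is None:
                    # optimal step along G: minimize ||err - step*s||^2,
                    # where s[i,j] = <G, patch(i,j)> is the error change rate
                    s = np.einsum('ijmn,mn->ij', patches, G)
                    denom = np.sum(s * s)
                    step = np.sum(err * s) / denom if denom > 1e-12 else 0.0
                else:
                    step = mu                  # constant convergence factor
                W = W + step * G
    return W
```

Because the line search solves a scalar least-squares problem per block, no step-size tuning is needed, whereas the constant-`mu` variant trades that robustness for slightly less work per block.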
