Abstract

This paper investigates the motion estimation problem in video image sequences. The traditional motion estimation method adopted by current video standards is based on the well-known block-matching algorithm. Because this algorithm models the motion of every object in the scene as a purely translational movement, it introduces significant visual artefacts in the predicted image at very low bit rates. To overcome these drawbacks, this paper proposes to estimate the motion vectors using deformable, non-uniform rectangular meshes. Any image selected in the video sequence as a reference image is entirely tiled with non-overlapping, contiguous rectangular meshes adapted to the image content. The motion vectors are modelled by the displacements of the mesh nodes while preserving the grid structure of the mesh associated with the current image. A spatial transformation based on a warping function then predicts the current image from its mesh nodes. In this particular context, we show that the matching error (between the reference and current images) and the interpolation error of the current image are equal. Since the performance of video sequence coding is directly affected by the motion estimation results, this paper concentrates on improving the motion estimation accuracy by minimizing the prediction error under a constraint of reasonable computational complexity. Examples on video image sequences show that the prediction error of the proposed method is smaller than that of several existing deformable-mesh-based motion estimation methods.
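To make the warping-based prediction described above concrete, the sketch below shows how one rectangular mesh element of the current image can be predicted: the displacement of each interior pixel is interpolated from the displacements of the element's four corner nodes, and the reference image is sampled at the displaced position. The abstract does not specify the exact warping function; the bilinear interpolation used here, and the helper names `bilinear_sample`, `predict_patch`, and `node_disp`, are illustrative assumptions rather than the authors' implementation.

```python
import numpy as np

def bilinear_sample(image, y, x):
    """Sample a 2-D image at real-valued coordinates (y, x) with bilinear interpolation."""
    h, w = image.shape
    y, x = np.clip(y, 0, h - 1), np.clip(x, 0, w - 1)
    y0, x0 = int(np.floor(y)), int(np.floor(x))
    y1, x1 = min(y0 + 1, h - 1), min(x0 + 1, w - 1)
    dy, dx = y - y0, x - x0
    return ((1 - dy) * (1 - dx) * image[y0, x0] + (1 - dy) * dx * image[y0, x1]
            + dy * (1 - dx) * image[y1, x0] + dy * dx * image[y1, x1])

def predict_patch(reference, top, left, height, width, node_disp):
    """Predict one rectangular mesh element of the current image.

    node_disp holds the (dy, dx) displacements of the four corner nodes
    (top-left, top-right, bottom-left, bottom-right) pointing into the
    reference image. Each interior pixel's displacement is bilinearly
    interpolated from the corner displacements, then the reference image
    is sampled at the displaced position.
    """
    d_tl, d_tr, d_bl, d_br = node_disp
    patch = np.zeros((height, width), dtype=np.float64)
    for i in range(height):
        for j in range(width):
            a = i / max(height - 1, 1)   # vertical interpolation weight
            b = j / max(width - 1, 1)    # horizontal interpolation weight
            dy = ((1 - a) * (1 - b) * d_tl[0] + (1 - a) * b * d_tr[0]
                  + a * (1 - b) * d_bl[0] + a * b * d_br[0])
            dx = ((1 - a) * (1 - b) * d_tl[1] + (1 - a) * b * d_tr[1]
                  + a * (1 - b) * d_bl[1] + a * b * d_br[1])
            patch[i, j] = bilinear_sample(reference, top + i + dy, left + j + dx)
    return patch
```

Under this kind of continuous warping, the prediction of the current image is itself an interpolation of reference-image samples, which is consistent with the abstract's observation that the matching error and the interpolation error of the current image coincide.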
