Abstract

We study the problem of sorting N elements in the presence of persistent errors in comparisons: in this classical model, each comparison between two elements is wrong independently with some probability at most p, but repeating the same comparison always gives the same result. In this model, it is impossible to reliably compute a perfectly sorted permutation of the input elements. Rather, the quality of a sorting algorithm is often evaluated w.r.t. the maximum dislocation of the sequences it computes, namely, the maximum absolute difference between the position of an element in the returned sequence and the position of the same element in the perfectly sorted sequence. The best known algorithms for this problem run in $O(N^2)$ time and achieve, w.h.p., an optimal maximum dislocation of $O(\log N)$ for constant error probability p. Note that no algorithm can achieve maximum dislocation $o(\log N)$ w.h.p., regardless of its running time. In this work we present the first subquadratic-time algorithm with optimal maximum dislocation. Our algorithm runs in $\widetilde{O}(N^{3/2})$ time and guarantees $O(\log N)$ maximum dislocation with high probability for any p ≤ 1/16.
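The maximum-dislocation measure used above is straightforward to state in code. A minimal sketch (the function name is ours, not from the paper), assuming the elements are distinct and comparable:

```python
def max_dislocation(returned):
    """Maximum dislocation of a sequence of distinct elements: the largest
    absolute difference between an element's position in `returned` and its
    position in the perfectly sorted sequence."""
    true_pos = {x: i for i, x in enumerate(sorted(returned))}
    return max(abs(i - true_pos[x]) for i, x in enumerate(returned))
```

For example, `[4, 1, 2, 3]` has maximum dislocation 3, since the element 4 sits three positions away from its sorted position.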

Highlights

  • We study the problem of sorting N distinct elements under recurrent random comparison errors

  • We present the first subquadratic time algorithm with optimal maximum dislocation, namely, an algorithm that runs in $\widetilde{O}(N^{3/2})$ time and returns a sequence of maximum dislocation $O(\log N)$ with high probability

  • Our second major contribution is a deterministic algorithm (Derandomized Recursive Window Sort), which still runs in $\widetilde{O}(N^{3/2})$ time and returns a sequence of maximum dislocation $O(\log N)$ with high probability
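The highlights refer to window-sort-style algorithms. As a rough illustration of the underlying idea, and not the authors' actual Recursive Window Sort, one pass can sort each block of w consecutive positions using a (possibly faulty) comparator; the function below is a hypothetical sketch:

```python
def windowed_pass(seq, w, less):
    """One window-sort-style pass: insertion-sort each block of w consecutive
    positions using the comparator `less`, which may answer incorrectly.
    Returns a new list; the input is not modified."""
    seq = list(seq)
    for start in range(0, len(seq), w):
        block = seq[start:start + w]
        # insertion sort within the window, driven only by `less`
        for i in range(1, len(block)):
            j = i
            while j > 0 and less(block[j], block[j - 1]):
                block[j], block[j - 1] = block[j - 1], block[j]
                j -= 1
        seq[start:start + w] = block
    return seq
```

With an error-free comparator and elements that are already within w positions of their sorted location, a single pass over suitably overlapping windows would finish the job; with faulty comparisons, the analysis of how dislocation shrinks across passes is exactly what the paper's algorithms are designed around.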

Introduction

We study the problem of sorting N distinct elements under recurrent random comparison errors. In this classical model, each comparison is wrong with some fixed (small) probability p, and correct with probability 1 − p. Regarding maximum dislocation and running time, the state of the art in this model is as follows: several algorithms [3, 12, 9] guarantee maximum dislocation O(log N) with high probability, though their running time is quadratic or even larger (see Table 1), and no algorithm (even a randomized one) can achieve maximum dislocation o(log N) with high probability, regardless of its running time [9]. This raises the natural question of whether optimal maximum dislocation O(log N) can be guaranteed in subquadratic time. In this paper we give an affirmative answer to this question.
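The recurrent (persistent) error model described above can be simulated directly: each unordered pair of elements is assigned a possibly wrong outcome once, and every repetition of that comparison returns the same answer. A minimal sketch (class and method names are ours, for illustration only):

```python
import random

class PersistentNoisyComparator:
    """Simulates recurrent random comparison errors: each pair is compared
    incorrectly with probability p, but repeating the same comparison
    always yields the same (possibly wrong) result."""

    def __init__(self, p, seed=None):
        self.p = p
        self.rng = random.Random(seed)
        self.memo = {}  # caches the fixed outcome for each unordered pair

    def less(self, x, y):
        if x == y:
            return False
        key = (min(x, y), max(x, y))
        if key not in self.memo:
            # answer for "min < max": correct with probability 1 - p
            self.memo[key] = self.rng.random() >= self.p
        small_less = self.memo[key]
        return small_less if x < y else not small_less
```

Note that the comparator is antisymmetric by construction (`less(x, y) == not less(y, x)` for distinct x, y), so the only source of error is the one-time coin flip per pair.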

Our contribution
Preliminaries
Warm up
The algorithm
Running time
Derandomization
Derandomized Iterated Windowsort
