An algorithm is said to operate in-place if it uses only a constant amount of extra memory for storing local variables besides the memory reserved for the input elements. In other words, the size of the extra memory does not grow with the number of input elements, <i>n</i>, but is bounded by a constant. An algorithm reorders the input elements stably if the original relative order of equal elements is retained. In this thesis, we devise in-place algorithms for sorting and related problems. In the following, we measure the efficiency of the algorithms by the number of element comparisons and element moves performed in the worst case. The number of index-manipulation operations is closely related to these quantities, so it is omitted from our calculations. When no precise figures are needed, we denote the sum of all operations by the general expression "time". The thesis consists of five separate articles, the main contributions of which are described below.

We construct algorithms for stable partitioning and stable selection, which are the first linear-time algorithms that are both stable and in-place. Moreover, we define the problems of stable unpartitioning and restoring selection and devise linear-time algorithms for them. The algorithm for stable unpartitioning is in-place, while that for restoring selection uses O(<i>n</i>) extra bits. Using these algorithms as subroutines, we construct an adaptation of Quicksort that sorts a multiset stably in <i>O</i>(&Sigma;<sup><i>k</i></sup><sub><i>i</i> = 1</sub> <i>m<sub>i</sub></i> log(<i>n</i>/<i>m<sub>i</sub></i>)) time, where <i>m<sub>i</sub></i> is the multiplicity of the <i>i</i>th distinct element for <i>i</i> = 1, ..., <i>k</i>. This is the first in-place algorithm that sorts a multiset stably in asymptotically optimal time.

We present in-place algorithms for unstable and stable merging. The algorithms are asymptotically more efficient than earlier ones: the number of moves is 3(<i>n</i> + <i>m</i>) + <i>o</i>(<i>m</i>) for the unstable algorithm and 5<i>n</i> + 12<i>m</i> + <i>o</i>(<i>m</i>) for the stable algorithm, and the number of comparisons is at most <i>m</i>(<i>t</i> + 1) + <i>n</i>/2<sup><i>t</i></sup> + <i>o</i>(<i>m</i>), where <i>m</i> &le; <i>n</i> and <i>t</i> = &lfloor;log(<i>n</i>/<i>m</i>)&rfloor;. The previous best results were 1.125(<i>n</i> + <i>m</i>) + <i>o</i>(<i>n</i>) comparisons and 5(<i>n</i> + <i>m</i>) + <i>o</i>(<i>n</i>) moves for unstable merging, and 16.5(<i>n</i> + <i>m</i>) + <i>o</i>(<i>n</i>) moves for stable merging.

Finally, we devise two in-place algorithms for sorting. Both are adaptations of Mergesort. The first performs <i>n</i> log<sub>2</sub> <i>n</i> + O(<i>n</i>) comparisons and &#949;<i>n</i> log<sub>2</sub> <i>n</i> + O(<i>n</i> log log <i>n</i>) moves for any fixed 0 &lt; &#949; &le; 2. Our experiments show that this algorithm performs well in practice. The second requires <i>n</i> log<sub>2</sub> <i>n</i> + O(<i>n</i>) comparisons and O(<i>n</i> log <i>n</i>/log log <i>n</i>) moves. This is the first in-place sorting algorithm that performs <i>o</i>(<i>n</i> log <i>n</i>) moves in the worst case while guaranteeing O(<i>n</i> log <i>n</i>) comparisons.
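As a conceptual aside only, and not the algorithms of the thesis, the Python sketch below illustrates two of the notions used above: what a stable partition returns (equal or tied elements keep their original relative order) and how the bound &Sigma;<sub><i>i</i></sub> <i>m<sub>i</sub></i> log(<i>n</i>/<i>m<sub>i</sub></i>) behaves on a concrete multiset. The helper names are hypothetical, and the partition shown uses linear extra memory, unlike the constant-extra-memory algorithms developed in the thesis.

```python
from math import log2
from collections import Counter

def stable_partition(seq, pred):
    """Conceptual (NOT in-place) stable partition: elements satisfying
    pred come first, and the original relative order is preserved
    within both groups.  The thesis achieves this with only a
    constant amount of extra memory."""
    return [x for x in seq if pred(x)] + [x for x in seq if not pred(x)]

def multiset_sort_bound(seq):
    """Evaluate sum_i m_i * log2(n / m_i), the asymptotically optimal
    comparison bound for sorting a multiset, where m_i is the
    multiplicity of the i-th distinct element."""
    n = len(seq)
    return sum(m * log2(n / m) for m in Counter(seq).values())

data = [(3, 'a'), (1, 'b'), (3, 'c'), (2, 'd'), (1, 'e')]
print(stable_partition(data, lambda p: p[0] < 3))
# [(1, 'b'), (2, 'd'), (1, 'e'), (3, 'a'), (3, 'c')] -- ties keep their order

print(multiset_sort_bound([1, 1, 1, 1, 2, 2, 3, 3]))  # 12.0
print(8 * log2(8))                                    # 24.0 -- plain n*log2(n) for comparison
```

With many duplicates the bound is well below <i>n</i> log<sub>2</sub> <i>n</i>, which is why a multiset-aware stable sort can beat a generic comparison sort.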