Abstract

Several new best case and worst case results are obtained for the complexity of sorting lists by the insertion sort algorithm. These results are in terms of various parameters that reflect the degree of disorder in the list being sorted, in addition to the traditional parameter of list length. Beyond the intrinsic interest of these complexities, they provide a convenient context for the study of how the precision of an analysis varies with the choice of analysis parameters. We characterize the imprecision of an analysis in terms of the difference ΔC = C_w − C_b between the worst case and best case complexity for the problem instances in the generic equivalence class induced by the problem parameters used in the analysis. We call this the inhomogeneity of the generic induced subclass. We consider various combinations of parameters, inducing corresponding subclass types called big, medium, small, tiny, and singleton classes. The precision of the different analyses relative to each other is quantified. For example, asymptotically as list length grows, we find that big, medium, tiny, small, and singleton subclasses have average inhomogeneity in the ratio 1 : 2/3 : 1/2 : 0 : 0. Thus, in particular, small classes and singleton classes both have zero inhomogeneity, i.e., they are perfectly homogeneous. This must be so for singleton classes because they contain only one member. It is interesting, however, that small classes, which in general contain many problem instances, are perfectly homogeneous, and that they are more homogeneous than tiny classes, which are generally smaller. For our small classes, and any perfectly homogeneous classes, the best case, worst case, and expected case complexities are all equal to each other and to the exact complexity of every instance in the equivalence class. Analytic complexities for such classes are perfectly precise, and allow the exact prediction of complexity on an instance-by-instance basis. Less precise analyses (such as those over our big, medium, and tiny classes) may, however, be preferable, depending on the relative cost of extracting the values of the parameters used, the tractability of the analysis, and the difficulty of computing a value from the derived complexity expression. These tradeoffs, and the implications of precision and homogeneity for complexity analysis in general, are discussed based on the present case study of the sorting problem and the author's related work on the constraint satisfaction problem.
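The inhomogeneity measure ΔC = C_w − C_b can be made concrete with a small experiment. The following is a minimal sketch, not taken from the paper: it counts comparisons made by insertion sort over all permutations of a short list, groups instances by a hypothetical disorder parameter (here, the inversion count, chosen only for illustration; the paper's actual parameters and cost measure may differ), and reports the worst-minus-best spread within each induced equivalence class.

```python
from itertools import permutations
from collections import defaultdict


def insertion_sort_cost(seq):
    """Sort a copy of seq by straight insertion and return the number of
    key comparisons performed (one common cost measure for insertion sort)."""
    a = list(seq)
    comparisons = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0:
            comparisons += 1
            if a[j] > key:
                a[j + 1] = a[j]   # shift larger element one slot right
                j -= 1
            else:
                break
        a[j + 1] = key
    return comparisons


def inversions(seq):
    """Illustrative disorder parameter: the number of inversions in seq."""
    n = len(seq)
    return sum(1 for i in range(n) for j in range(i + 1, n) if seq[i] > seq[j])


def inhomogeneity_by_class(n):
    """Group all permutations of length n by (n, inversion count) and return
    the inhomogeneity Delta_C = C_w - C_b of each induced class."""
    best = defaultdict(lambda: float("inf"))
    worst = defaultdict(lambda: float("-inf"))
    for p in permutations(range(n)):
        k = inversions(p)
        c = insertion_sort_cost(p)
        best[k] = min(best[k], c)
        worst[k] = max(worst[k], c)
    return {k: worst[k] - best[k] for k in sorted(best)}


if __name__ == "__main__":
    # Spread of comparison counts within each disorder-induced class, n = 5.
    print(inhomogeneity_by_class(5))
```

A class whose reported spread is 0 is perfectly homogeneous in the abstract's sense: every instance in it has exactly the same cost, so the analytic complexity for that class predicts each instance exactly.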
