Abstract

An extremely large data file is one whose size exceeds that of main memory by several orders of magnitude. Sorting such a file requires an external sorting algorithm, which uses both the hard disk and main memory to accomplish the task. Since the hard disk is much slower than main memory, the number of disk input/output (I/O) operations is considered the main performance metric. The proposed method decreases the total number of I/O operations and hence reduces the total sorting time; it requires fewer disk read/write operations than existing approaches. The I/O complexity of the proposed algorithm is analyzed and compared with that of other algorithms. The algorithm uses a constant merging order in the merge phase of the external sort, with multiple passes over each set of data. It is shown that the proposed algorithm has lower sorting time requirements than previous approaches.
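To make the two phases the abstract refers to concrete, the sketch below shows a generic external merge sort in Python: a run-generation phase that sorts memory-sized chunks, and a merge phase that repeatedly combines runs with a constant fan-in. This is not the paper's algorithm, only a minimal illustration of the standard technique it builds on; the names `RUN_SIZE` and `MERGE_ORDER` are assumptions introduced here for the example.

```python
import heapq
import os
import tempfile

# Illustrative parameters -- the abstract does not specify these values,
# so RUN_SIZE and MERGE_ORDER are assumptions made for this sketch.
RUN_SIZE = 100_000   # number of records that fit in main memory at once
MERGE_ORDER = 4      # constant fan-in k used at every merge pass

def create_runs(path):
    """Run-generation phase: read the input in memory-sized chunks,
    sort each chunk in memory, and write it to disk as a sorted run."""
    runs = []
    with open(path) as src:
        while True:
            chunk = [line for _, line in zip(range(RUN_SIZE), src)]
            if not chunk:
                break
            chunk.sort()
            with tempfile.NamedTemporaryFile("w+", delete=False) as run:
                run.writelines(chunk)
                runs.append(run.name)
    return runs

def merge_runs(runs):
    """Merge phase: repeatedly merge MERGE_ORDER runs at a time (a
    constant merging order) until one sorted run remains. Every pass
    reads and writes the whole data set once, so disk I/O cost is
    proportional to the number of passes, about ceil(log_k(#runs))."""
    while len(runs) > 1:
        next_runs = []
        for i in range(0, len(runs), MERGE_ORDER):
            group = runs[i:i + MERGE_ORDER]
            files = [open(name) for name in group]
            with tempfile.NamedTemporaryFile("w+", delete=False) as out:
                out.writelines(heapq.merge(*files))  # streaming k-way merge
                next_runs.append(out.name)
            for f in files:
                f.close()
            for name in group:
                os.remove(name)
        runs = next_runs
    return runs[0]  # path of the fully sorted file
```

Under this standard analysis, a file of N records sorted with M records of memory yields about N/M initial runs, and a constant fan-in of k then needs roughly ceil(log_k(N/M)) merge passes. Since each pass reads and writes the entire file once, lowering the pass count directly lowers the total I/O count, which is exactly the performance metric the abstract targets.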
