Abstract

Algorithms determine which calculations computers use to solve problems and are one of the central pillars of computer science. As algorithms improve, they enable scientists to tackle larger problems and explore new domains and new scientific techniques [1], [2]. Bold claims have been made about the pace of algorithmic progress. For example, the President’s Council of Advisors on Science and Technology (PCAST), a body of senior scientists who advise the U.S. President, wrote in 2010 that “in many areas, performance gains due to improvements in algorithms have vastly exceeded even the dramatic performance gains due to increased processor speed” [3]. However, this conclusion was supported by data on progress in linear solvers [4], which is just a single example. With no guarantee that linear solvers are representative of algorithms in general, it is unclear how broadly conclusions such as PCAST’s should be interpreted. Is progress faster in most algorithms? Just some? How much on average?

Highlights

  • A variety of research has quantified progress for particular algorithms, including maximum flow [5], Boolean satisfiability and factoring [6], and linear solvers [4], [6], [7]

  • We consider an algorithm as an improvement if it reduces the worst case asymptotic time complexity of its algorithm family

  • We find enormous heterogeneity in algorithmic progress, with nearly half of algorithm families experiencing virtually no progress, while 14% experienced improvements orders of magnitude larger than hardware improvement


Summary

RESULTS

We focus on exact algorithms with exact solutions, excluding algorithms that improve performance only for special cases of a problem, since such special cases do not represent an asymptotic improvement for the full problem. To focus on consequential algorithms, we limit our consideration to those families that the authors of at least one of the 57 textbooks we examined considered important. Based on these inclusion criteria, there are 113 algorithm families. We consider an algorithm an improvement if it reduces the worst-case asymptotic time complexity of its algorithm family. Based on this criterion, there are 276 initial algorithms and subsequent improvements, an average of 1.44 improvements after the initial algorithm in each algorithm family.
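The improvement criterion above can be illustrated with a small sketch. The helper below compares worst-case operation counts for two members of a hypothetical algorithm family, ignoring constant factors, so it shows only the asymptotic gap (not a benchmark, and not the paper's actual methodology). The function names and the sorting example are illustrative assumptions.

```python
import math

def asymptotic_ops(n, exponent, log_factor=0):
    """Rough worst-case operation count: n**exponent * (log2 n)**log_factor.

    Constants are ignored, so this is only an asymptotic illustration.
    """
    ops = n ** exponent
    if log_factor:
        ops *= math.log2(n) ** log_factor
    return ops

def improvement_factor(n, old, new):
    """Ratio of old to new worst-case operation counts at input size n.

    `old` and `new` are (exponent, log_factor) pairs describing the
    dominant term of each algorithm's worst-case complexity.
    """
    return asymptotic_ops(n, *old) / asymptotic_ops(n, *new)

# Example: in the comparison-sort family, moving from an O(n^2)
# algorithm (insertion sort) to O(n log n) (merge sort) yields a
# speedup that grows with n -- the kind of asymptotic improvement
# counted in the criterion above.
for n in (10**3, 10**6, 10**9):
    print(f"n = {n:>10}: speedup factor {improvement_factor(n, (2, 0), (1, 1)):.1f}")
```

Because the gap widens with input size, a single reduction in asymptotic complexity can dwarf any fixed hardware speedup once problems get large enough, which is why the analysis counts complexity-class reductions rather than constant-factor tuning.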


