Abstract
We present a new GCD algorithm for two integers that combines the Euclidean and the binary GCD approaches. We give its worst-case time analysis and prove that its bit-time complexity is still O(n^2) for two n-bit integers in the worst case. Our preliminary experiments show a potential speedup for small integers. A parallel version matches the best presently known time complexity, namely O(n / log n) time with O(n^{1+ε}) processors, for any constant ε > 0.
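To illustrate the general idea of mixing the two approaches (this is a generic hybrid sketch, not the specific algorithm of the paper), one can strip powers of two with binary shift steps and reduce magnitudes with Euclidean remainder steps:

```python
def hybrid_gcd(u: int, v: int) -> int:
    """Illustrative hybrid Euclidean/binary GCD (not the paper's algorithm):
    binary steps remove factors of two cheaply via shifts, while a Euclidean
    remainder step replaces the subtraction chain of the pure binary method."""
    u, v = abs(u), abs(v)
    if u == 0:
        return v
    if v == 0:
        return u
    # Binary part: factor out the common power of two shared by u and v.
    k = ((u | v) & -(u | v)).bit_length() - 1  # number of common trailing zeros
    u >>= k
    v >>= k
    # Make u odd (its remaining trailing zeros cannot be in the GCD).
    u >>= (u & -u).bit_length() - 1
    while v:
        # Binary step: make v odd by shifting out its trailing zeros.
        v >>= (v & -v).bit_length() - 1
        if u > v:
            u, v = v, u
        # Euclidean step: a remainder instead of repeated subtractions.
        v %= u
    return u << k  # restore the common power of two
```

For example, `hybrid_gcd(48, 18)` returns `6`. The remainder step bounds the number of iterations as in Euclid's algorithm, while the shifts keep the operands odd, as in the binary method.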