O(log n) algorithm in theory runs much slower in practice
The algorithm calculates 2^n recursively. I have used the Master theorem to determine that the time complexity is indeed O(log n), which seems to be correct according to other sources online.
def pow(n):
    """Return 2**n, where n is a nonnegative integer."""
    if n == 0:
        return 1
    x = pow(n // 2)        # recurse on half the exponent
    if n % 2 == 0:
        return x * x       # even n: 2**n = (2**(n/2))**2
    return 2 * x * x       # odd n:  2**n = 2 * (2**((n-1)/2))**2
My problem is that it does not run in logarithmic time when I measure it in Python for many large values of n. Instead, it seems to run in something like O(n) or even O(n log n) time. Is this because of the large multiplications in the return statements, or something else? If so, can the algorithm even be considered O(log n) if that is not what determines the final speed?
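A minimal sketch of the kind of measurement described, assuming a helper named `pow2` that is the same function as above (renamed only to avoid shadowing the built-in `pow`); the exponents chosen are illustrative:

```python
import time

def pow2(n):
    """Return 2**n by recursive squaring (same algorithm as above)."""
    if n == 0:
        return 1
    x = pow2(n // 2)
    return x * x if n % 2 == 0 else 2 * x * x

# Time a few exponents; wall-clock time grows much faster than log n.
for n in (10_000, 100_000, 1_000_000):
    start = time.perf_counter()
    pow2(n)
    print(f"n = {n:>9}: {time.perf_counter() - start:.4f} s")
```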
Solution 1:[1]
The complexity is O(log n) assuming each number fits in one word of memory, so that every multiplication takes constant time. That assumption does not hold here: the result 2**n has n+1 bits, so the operands quickly exceed a machine word, and the cost of each multiplication grows with the size of the numbers involved.
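One way to see the operand growth (a sketch, with the question's function repeated under the hypothetical name `pow2`): the final squaring alone multiplies two numbers of roughly n/2 bits each, so even that single step cannot be constant-time.

```python
def pow2(n):
    """Return 2**n by recursive squaring."""
    if n == 0:
        return 1
    x = pow2(n // 2)
    return x * x if n % 2 == 0 else 2 * x * x

# The result of pow2(n) occupies exactly n + 1 bits, so operand sizes
# grow linearly with n and big-integer multiplication dominates the cost.
for n in (1_000, 10_000, 100_000):
    print(n, pow2(n).bit_length())  # prints n and n + 1
```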
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
| Solution | Source |
|---|---|
| Solution 1 | Ashwin Ganesan |