Time complexity of dividing n until it is too small to be recognized as a positive number

I have to check the time complexity of the following code:

while n > 0:
    n //= 2

Of course, n will keep being divided until it becomes so small that integer division rounds it down to 0. Something tells me that the time complexity is O(log n), but I'm not sure how to prove it. Any tips on that?
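Before proving it, one way to build intuition is to instrument the loop and count its iterations empirically (a quick sketch; the helper name count_halvings is my own, not from the question):

```python
import math

def count_halvings(n):
    # Same loop as in the question, with an iteration counter added.
    steps = 0
    while n > 0:
        n //= 2
        steps += 1
    return steps

# The count tracks floor(log2(n)) + 1, i.e. the bit length of n.
for n in (1, 10, 1000, 10**6):
    print(n, count_halvings(n), math.floor(math.log2(n)) + 1)
```

For each n the two printed counts agree, which is exactly the logarithmic growth the question suspects.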



Solution 1:[1]

I think reasoning in the opposite way could be more intuitive:

You start from a certain n₀, the lowest positive value recognized (here, 1), and at each step (i.e., for each unit of time t) you multiply it by 2 until, at a certain time t*, you reach a certain n*, which in your question corresponds to the original value of n.

The value of n* will be:

n₀*2*2*2*2*2*...*2*2*2*2*2
   |___number of steps___|

i.e., n* = n₀*2^(t*). In general, we have that n = n₀*2^t and, reasoning asymptotically, since n₀ is a constant factor, we have that n ∈ O(2^t).

Thus, inverting the relation (t = log₂(n/n₀)), we have that the time complexity is logarithmic w.r.t. n (indeed, it is O(log₂(n))).
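The reverse construction above can be checked directly in Python (a quick sketch; the helper name halving_steps is mine): starting from n₀ = 1 and doubling t times gives n* = 2^t, and halving n* back down to 0 then takes exactly t + 1 iterations.

```python
def halving_steps(n):
    # Count how many halvings the original loop performs.
    steps = 0
    while n > 0:
        n //= 2
        steps += 1
    return steps

# Build n* by doubling 1 a total of t times, then halve it back down:
# the loop runs t + 1 times, i.e. a logarithmic number of steps in n*.
for t in range(1, 20):
    n_star = 2 ** t          # n* = n0 * 2^t with n0 = 1
    assert halving_steps(n_star) == t + 1
```

The extra "+ 1" is the final iteration that takes the value from 1 to 0; it is a constant and disappears in the O(log n) bound.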

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow
