O(n) + O(n) = O(n)?
According to Alex Martelli in O'Reilly's Python in a Nutshell, the complexity class O(n) + O(n) = O(n). So I believe it, but I am confused. He explains it by saying that "the sum of two linear functions of N is also a linear function of N."
According to Wikipedia, in functional analysis a linear function is a linear map, an example of which would be f(x+y) = f(x) + f(y). I found what seems to be a simpler definition here, simply stating that "a linear function is a function whose graph is a straight line." It also includes a few more examples that are less esoteric-looking than the ones in the Wikipedia article.
y = f(x) = a + bx
and:
y = 25 + 5x
let x = 1
then
y = 25 + 5(1) = 30
let x = 3
then
y = 25 + 5(3) = 40
Maybe it would be fair to expect that the sum of two linear functions could be represented on a graph as another straight line, somewhat like an average of the straight lines that would represent each one.
So am I understanding correctly that even though, in the following two functions, the "complex" function takes three times as long to process as the "simple" one, each is O(n) in big-O notation, because the processing time of each would be represented by a straight diagonal line on a graph, and the time difference would show up as the more complex function's line having a steeper slope?
import time  # the original used a custom Timer context manager; time.perf_counter serves the same purpose

def complex(it):  # note: shadows the built-in complex(); kept to match the text
    result = []
    for i in it:
        result.append(i)
    result.reverse()
    return result

def simple(it):
    result = list(it)
    return result

longlist = list(range(0, 1000000))

start = time.perf_counter()
complex(longlist)
print("=> elapsed time complex: %s." % (time.perf_counter() - start))

start = time.perf_counter()
simple(longlist)
print("=> elapsed time simple: %s" % (time.perf_counter() - start))
Solution 1:[1]
Correct, O(n) + O(n) = O(n).
More specifically, O(n) + O(n) = O(2n), but since big-O discards constant factors and only cares about how functions grow as n tends toward infinity, anything linear is denoted O(n).
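A quick way to see this is to count the steps of two back-to-back linear passes. The following sketch (a toy step counter, not from the original answer) shows the total is always exactly 2n, so its ratio to n never grows:

```python
def count_ops(n):
    """Count the 'steps' taken by two consecutive linear passes over n items."""
    steps = 0
    for _ in range(n):   # first O(n) pass
        steps += 1
    for _ in range(n):   # second O(n) pass
        steps += 1
    return steps

# 2n steps total: the ratio to n is the constant 2, which big-O discards.
for n in (10, 100, 1000):
    print(n, count_ops(n) / n)   # ratio is always 2.0
```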
Solution 2:[2]
So even though, in the following two functions, the "complex" function takes three times as long to process as the "simple" one, each is O(n) in big-O notation, because the processing time of each would be represented by a straight diagonal line on a graph, even though the more complex function's line would have a steeper slope?
Yes. The letter O is used because it refers to the order of a function. Linear functions are of the same order, O(n), O(3n), O(Cn) are all linear.
On the other hand, O(n^2), O(n^3), and O(n^C) are all polynomial functions (of degree 2, 3, and C). Here (when dealing with algorithms), we often do make a distinction between things like O(n^2) and O(n^5), even though they are all polynomial.
And O(2^n), O(3^n), and O(C^n) are exponential. You generally don't want to write an algorithm with exponential complexity (or worse).
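To make the gap between these orders concrete, here is a small table of values (the numbers are purely illustrative): even a linear term with a sizable constant is quickly dwarfed by a quadratic or exponential one.

```python
# Growth of a linear, a quadratic, and an exponential term side by side.
for n in (1, 5, 10, 20):
    print(f"n={n:>2}  3n={3 * n:>3}  n^2={n ** 2:>4}  2^n={2 ** n:>8}")
```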
Solution 3:[3]
A good (the best?) way to go about it is to refer back to the mathematical definition of big-O:

f is O(g)  if and only if  lim (n→∞) f(n)/g(n) = ℓ  for some finite ℓ ≥ 0

In plain English:
These two statements are equivalent:
f is O(g)
The ratio of f(n) to g(n), as n increases, tends to a finite nonnegative value.
In our case we have g(n) = n. Now if we let f(n) = f1(n) + f2(n) and assume both f1 and f2 are O(n), the limit above will equal ℓ = ℓ1 + ℓ2, which itself must be greater than or equal to zero (since by definition ℓ1 ≥ 0 and ℓ2 ≥ 0). Therefore f1(n) + f2(n) is also O(n), by our definition.
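As a concrete instance of this limit argument, take f1(n) = 3n and f2(n) = n (using the question's 3-to-1 cost ratio as an assumed example), with g(n) = n:

```latex
\lim_{n\to\infty} \frac{f_1(n) + f_2(n)}{g(n)}
  = \lim_{n\to\infty} \frac{3n + n}{n}
  = 4 \ge 0
```

The limit exists and is a finite nonnegative value, so f1(n) + f2(n) is O(n).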
Solution 4:[4]
Yes. This is because big-O notation is not meant for calculating absolute running time, but for describing the behaviour of algorithms as the input grows.
In other words, if you have one algorithm that runs in O(3n) and another in O(n), then however much you increase n, both running times grow by the same proportional amount.
Big-O just gives a notion of whether one algorithm will out-scale another at some point as the input increases.
Of course, mathematically everything can be proven using the definitions.
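A minimal sketch of that out-scaling, with made-up cost functions: a linear algorithm with a large constant factor still beats a quadratic one once the input is big enough.

```python
def linear_cost(n):
    return 1000 * n   # hypothetical O(n) algorithm with a big constant

def quadratic_cost(n):
    return n * n      # hypothetical O(n^2) algorithm with no constant

# Find the first input size where the quadratic algorithm becomes slower.
n = 1
while quadratic_cost(n) <= linear_cost(n):
    n += 1
print(n)  # -> 1001: past this point the O(n^2) algorithm always loses
```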
Solution 5:[5]
Let's say the first O(n) is represented by an equation:
y1 = f1(x) = a1 + b1x
and the second O(n) is represented by an equation:
y2 = f2(x) = a2 + b2x
By adding the two sides,
y1 + y2 = f1(x) + f2(x) = (a1 + a2) + (b1 + b2)x
which shows that y1 + y2 is also O(n).
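This can be checked numerically (the coefficients below are arbitrary example values): the sum of the two lines has a constant slope, b1 + b2, which means it is itself a straight line.

```python
def f1(x):
    return 2 + 3 * x   # a1 = 2, b1 = 3 (example values)

def f2(x):
    return 5 + 1 * x   # a2 = 5, b2 = 1 (example values)

def total(x):
    return f1(x) + f2(x)

# A function is linear iff its slope is constant; here it is always b1 + b2 = 4.
print(all(total(x + 1) - total(x) == 4 for x in range(100)))  # -> True
```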
Solution 6:[6]
That's correct. So long as the line on the graph is straight, the function is O(n), regardless of the angle. A function takes linear time when you can say "this function takes x seconds for every input item." x in this case could be three, or nine, or a million, and it's still a linear function.
Solution 7:[7]
Already there are a lot of good answers, but I've not seen an answer from this angle. The sum O(x) + O(y) is the worse of O(x) and O(y). In this case both are linear, so say x = C1·n and y = C2·n with C1 > C2. Then x dominates the sum, and the big-O is O(C1·n) => O(n).
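The dominance argument can be sanity-checked with arbitrary constants (C1 = 5 and C2 = 2 below are just example values): the sum of two linear terms never exceeds twice the dominant one, so it stays in the same class.

```python
C1, C2 = 5, 2   # example constants with C1 > C2

for n in (10, 100, 1000):
    x, y = C1 * n, C2 * n
    # The sum is sandwiched between the dominant term and twice it,
    # so O(x + y) = O(max(x, y)) = O(C1 * n) = O(n).
    assert max(x, y) <= x + y <= 2 * max(x, y)
print("x + y stays within a constant factor of the dominant term")
```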
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
| Solution | Source |
|---|---|
| Solution 1 | bryce |
| Solution 2 | |
| Solution 3 | Community |
| Solution 4 | luk32 |
| Solution 5 | R Sahu |
| Solution 6 | TheSoundDefense |
| Solution 7 | Fallen |
