What is the complexity of T(n)=1+2+3+...+n? I know that the answer is O(n^2). What is an example of an algorithm that has runtime T(n)?
EDIT: I was not talking about computing the sum 1+2+3+...+n; that is not the objective.
What is an example of an algorithm that has runtime T(n)?
If you have an outer loop that iterates n times and an inner loop that iterates i times where i is the index of the outer loop, the body of the inner loop will be executed T(n) times.
An example of such a nested loop would be the following algorithm:
for i from 1 to n
    for j from 1 to i
        print "$j "
    print "\n"
Which is the solution to a common homework assignment and prints a number pyramid of the following shape:
1
1 2
1 2 3
Strictly speaking, print "$j " isn't O(1), since it outputs ~log_10(j) characters. If you count outputting one character as the basic operation, the example is O(n^2 log n); if you consider the print statement O(1) no matter what's being printed, then it's O(n^2). You can avoid the issue by printing "* " instead of "$j ". That might actually be the more common variant of the homework assignment anyway.

While @sepp2k gives an awesome answer, here are some real-life algorithms which may have T(n) = 1+2+3+...+n (and which we therefore call O(n^2)):
Insertion Sort at its worst case:
It has an outer loop which runs n times, and the inner loop scans the already-sorted portion of the array to find the position at which to insert the next element. The sorted portion grows by one with each outer iteration, so in the worst case the inner loop runs 1+2+3+...+(n-1) times in total.
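A minimal sketch in Python (the function name and in-place style are my choices, not something from the thread). On reverse-sorted input the inner loop does the full 1+2+...+(n-1) shifts:

```python
def insertion_sort(a):
    """Sort the list a in place.

    Worst case (reverse-sorted input): for each i the inner loop
    scans the whole sorted prefix, so the total number of inner
    steps is 1 + 2 + ... + (n-1) = T(n-1), i.e. O(n^2).
    """
    for i in range(1, len(a)):      # outer loop: n-1 iterations
        key = a[i]
        j = i - 1
        # inner loop: walk back through the sorted prefix a[0..i-1]
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]         # shift larger elements right
            j -= 1
        a[j + 1] = key              # insert key at its position
    return a
```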
Longest Increasing Subsequence (LIS) using the naive implementation: a direct implementation of the recurrence relation,
which for each index i has to look through all j < i.
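The recurrence can be sketched directly in Python (function name is mine); the inner loop over j < i runs 0+1+...+(n-1) times in total:

```python
def lis_length(seq):
    """Length of the longest increasing subsequence, naive DP:
    dp[i] = 1 + max(dp[j] for j < i with seq[j] < seq[i]).
    The inner loop over all j < i gives T(n) = O(n^2) work.
    """
    n = len(seq)
    dp = [1] * n                    # dp[i]: best LIS ending at i
    for i in range(n):
        for j in range(i):          # look through all j < i
            if seq[j] < seq[i]:
                dp[i] = max(dp[i], dp[j] + 1)
    return max(dp, default=0)
```

(The known improvement is the O(n log n) patience-sorting variant, which replaces the inner scan with a binary search.)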
Interval Graph Coloring (Interval Partitioning) using the naive implementation: after sorting the intervals, for each interval x we have to find how many earlier intervals conflict with x, and the answer to the problem is the maximum conflict number. So while the outer loop visits each interval i, the inner loop runs over all j < i.
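A sketch of that naive approach in Python (names and the half-open-interval convention are my assumptions). With intervals sorted by start time, counting earlier intervals that are still running at interval i's start gives the depth at that point, and the maximum depth is the number of rooms/colors needed:

```python
def min_rooms(intervals):
    """Minimum number of rooms for intervals given as (start, end)
    pairs, end exclusive. For each interval i, count how many
    earlier intervals j < i overlap it; the inner loop over j < i
    costs 1 + 2 + ... + (n-1) comparisons, i.e. O(n^2).
    """
    intervals = sorted(intervals)           # sort by start time
    best = 0
    for i, (s_i, e_i) in enumerate(intervals):
        conflicts = 0
        for j in range(i):                  # all j < i
            s_j, e_j = intervals[j]
            if e_j > s_i:                   # j still running at s_i
                conflicts += 1
        best = max(best, conflicts + 1)     # +1 for interval i itself
    return best
```

The standard improvement keeps a min-heap of end times instead of rescanning all earlier intervals, bringing this down to O(n log n).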
Prime Generating using a naive primality test: of course, we can generate all primes up to n by running a naive primality test on every i up to n. That is, for each i, we loop over all j < i to see whether j divides i.
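In Python, that naive generator looks like this (function name is mine). Without the early `break` the inner loop would do exactly 1+2+...+(n-1) divisions; with it, the worst case is still O(n^2):

```python
def primes_up_to(n):
    """All primes <= n via naive trial division: for each i,
    test every j with 2 <= j < i. Total work ~ 1+2+...+n = O(n^2).
    """
    primes = []
    for i in range(2, n + 1):
        is_prime = True
        for j in range(2, i):       # loop over all j < i
            if i % j == 0:          # j divides i, so i is composite
                is_prime = False
                break
        if is_prime:
            primes.append(i)
    return primes
```

The usual improvements are stopping the inner loop at sqrt(i), or switching to the Sieve of Eratosthenes at O(n log log n).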
Indeed there are many algorithms with this structure, and most of the ones I have seen can be improved by a smarter algorithm or a more advanced data structure.
The order of growth of a function has to do with the number of operations needed to complete it. An example of an O(n) function would be the slow version of the sum you mentioned in the question. In Python:
total = 0
for i in range(n + 1):
    total += i
This would be O(n) because the amount of time it takes to complete will scale linearly based on n.
As for the function you gave as an example in the question, it's actually O(1) because you only need to do one operation: n(n+1)/2. The amount of time it takes to compute n(n+1)/2 will never change bar back-end quirks, but for something like this, that really shouldn't ever matter.
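The closed form can be written as a one-line Python function (name is mine); note the floor division, which is exact here because n(n+1) is always even:

```python
def triangular(n):
    """Sum 1 + 2 + ... + n via the closed form n(n+1)/2: one
    multiplication, one addition, one division, regardless of n,
    so O(1) arithmetic operations.
    """
    return n * (n + 1) // 2
```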
T was always intended as a mathematical function that describes some algorithm's time complexity. That's why it's called T and why the OP asks (and always asked) for "an example of an algorithm that has runtime T(n)". The phrase "has runtime T(n)" makes no sense under any other interpretation of the question. The function T(n) = 1 + ... + n is definitely in O(n^2). That is, if you have an algorithm that takes 1 + ... + n steps for an input of size n, that algorithm has quadratic time complexity.
It's O(n); why do you think it's O(n^2)? n(n+1)/2 is simply a constant, which is O(1). To be clear, here n != O(n); that is referring to the actual number n.