How do you calculate the time complexity of an algorithm (in "Big-O" notation)? I read something on the Internet just now that said to count the number of operations; so, I wrote a very quick bubblesort implementation and counted. I got n² + 15 (this is counting things like comparisons, declarations, assignments, etc.).
I tried to find a good article about it but couldn't. Could someone explain to me, or else point me to a good explanation?
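For reference, a minimal sketch of the kind of thing I counted (not my exact code; which comparisons, swaps, and assignments you tally is an arbitrary choice):

#include <cstdio>

/* bubblesort with a crude operation counter bolted on; here one
   "operation" is counted per comparison and per swap */
int main()
{
    int a[] = { 5, 1, 4, 2, 8 };
    int n = sizeof a / sizeof a[0];
    long ops = 0;

    for( int i = 0; i < n - 1; ++i )
        for( int j = 0; j < n - 1 - i; ++j )
        {
            ++ops;                          // the comparison
            if( a[j] > a[j + 1] )
            {
                int t = a[j]; a[j] = a[j + 1]; a[j + 1] = t;
                ++ops;                      // the swap
            }
        }

    printf( "n = %d, operations = %ld\n", n, ops );
    return 0;
}

Running something like this for a few different array sizes shows the count growing roughly with n², which I guess is where my n² + 15 came from.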
Vary your input sizes, determine a best-case and a worst-case scenario, time the algorithm on each, plot the results against input size, and see what those graphs resemble. At least, that's what I do.
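A minimal sketch of that approach, using std::sort as a stand-in for whatever algorithm you're actually measuring (swap in your own function); it prints (size, seconds) pairs you can feed to any plotting tool:

#include <algorithm>
#include <chrono>
#include <cstdio>
#include <random>
#include <vector>

/* time the algorithm at several input sizes and print one
   (n, seconds) pair per line for plotting */
int main()
{
    std::mt19937 rng( 42 );
    for( int n = 1000; n <= 64000; n *= 2 )
    {
        std::vector<int> v( n );
        for( int& x : v ) x = (int)rng();   // random input = "average" case

        auto start = std::chrono::steady_clock::now();
        std::sort( v.begin(), v.end() );    // stand-in for the algorithm under test
        auto stop = std::chrono::steady_clock::now();

        std::chrono::duration<double> elapsed = stop - start;
        printf( "%d\t%f\n", n, elapsed.count() );
    }
    return 0;
}

If the plotted points hug a straight line it's likely O(n); if doubling n roughly quadruples the time, it's likely O(n²), and so on.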
I'm by no means an expert on algorithms, but I just look at the number of iterations. The easiest ones to spot are for loops. Here are some examples:
{
    /* something that executes once is done in O(1) or constant time */
}
for( int i = 0; i < n; ++i ) // note that as the input size n increases,
                             // the number of iterations increases at a fixed rate
{
    /* something that executes n times is done in O(n) or linear time */
}
for( int i = 0; i < n; ++i )
{
    for( int j = 0; j < n; ++j )
    {
        /* something that executes n*n times is done in O(n²) or quadratic time */
    }
}
The following is also quadratic time, just like a quadratic expression in algebra: the largest term is the one that counts. Big-O notation drops the lesser terms, so O(n² + n) would just be called O(n²).
for( int i = 0; i < n; ++i )
{
    for( int j = 0; j < n; ++j )
    {
        /* something that executes n*n times is done in O(n²) or quadratic time */
    }
}
for( int i = 0; i < n; ++i )
{
    /* something that executes n times is done in O(n) or linear time,
       and this is the lesser term that gets dropped */
}
Binary searches are O(log n) or logarithmic time. Etc.
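For example, a binary search halves the remaining range on every iteration, so a sorted array of n elements takes only about log2(n) steps. A minimal sketch:

#include <stdio.h>

/* each iteration discards half the remaining range, so a sorted
   array of n elements needs about log2(n) iterations: O(log n) */
int binary_search( const int a[], int n, int target )
{
    int lo = 0, hi = n - 1;
    while( lo <= hi )
    {
        int mid = lo + ( hi - lo ) / 2;
        if( a[mid] == target ) return mid;
        if( a[mid] < target )  lo = mid + 1;  /* discard the lower half */
        else                   hi = mid - 1;  /* discard the upper half */
    }
    return -1; /* not found */
}

int main( void )
{
    int a[] = { 1, 3, 5, 7, 9, 11 };
    printf( "%d\n", binary_search( a, 6, 7 ) ); /* prints 3 */
    return 0;
}

Doubling n adds only one more iteration; that halving pattern is the signature of O(log n).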