Big-O Complexity#
\(O\) is known as Big-O notation, which is used to analyze and compare the asymptotic time complexity of algorithms.
Constant Time#
\(O(1)\): a function with constant time complexity, which means its execution time is independent of the size of its input.
An example in Python is the function len(list).
This is a very simple algorithm because the list class maintains, for each list, an instance variable that records the current length of the list. This allows it to immediately report that length, rather than take time to iteratively count each of the elements in the list. Using asymptotic notation, we say that this function runs in \(O(1)\) time.
Another example is the indexing of a list in Python, such as data[j]. The indexing operation also has constant-time complexity.
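To illustrate the idea, here is a minimal sketch of a hypothetical TrackedList class (the name and design are illustrative, not part of Python) that keeps an instance variable recording its current length, so reporting the length never requires counting:

def tracked_list_sketch():
    pass  # placeholder so the sketch below reads as one unit

class TrackedList:
    """Hypothetical list wrapper that records its own length (illustrative)."""

    def __init__(self):
        self._data = []
        self._n = 0           # instance variable recording the current length

    def append(self, value):
        self._data.append(value)
        self._n += 1          # keep the counter up to date on every change

    def __len__(self):
        return self._n        # O(1): report the stored count, no iteration

    def __getitem__(self, j):
        return self._data[j]  # O(1): direct index into the underlying storage

On such an object, both len(t) and t[j] run in \(O(1)\) time no matter how many elements it holds; Python's built-in list behaves in essentially this way.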
Logarithmic Time#
\(O(\log n)\): logarithmic time, where the running time grows in proportion to the logarithm of the input size. Binary search on a sorted list is a classic example: each step halves the range that remains to be searched.
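A minimal sketch of binary search, assuming the input list is already sorted:

def binary_search(data, target):
    """Return True if target is in the sorted list data, in O(log n) time."""
    low, high = 0, len(data) - 1
    while low <= high:
        mid = (low + high) // 2
        if data[mid] == target:
            return True
        elif data[mid] < target:
            low = mid + 1    # discard the lower half
        else:
            high = mid - 1   # discard the upper half
    return False

Each iteration halves the remaining search range, so at most about \(\log_2 n\) iterations are needed.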
Linear Time#
\(O(n)\): the running time of a linear-time algorithm grows linearly with the size of the problem.
The following function find_max2 is used to find the maximum value in a given list.
L2 is an indexing operation and an assignment, and both operations are constant-time, \(O(1)\). L3 is a for loop, which executes \(n\) times (\(n\) is the length of the given list) and therefore needs \(O(n)\) time. Inside the loop, there is a comparison and a possible assignment operation, both of which require \(O(1)\) time, so the whole loop requires \(O(n)\) time. L6 is a return operation, which needs \(O(1)\) time. Therefore, the total complexity of this algorithm is \(O(1) + O(n) \cdot O(1) + O(1)\), which is \(O(n)\).
1 def find_max2(l):
2     max_l = l[0]
3     for element in l:
4         if element > max_l:
5             max_l = element
6     return max_l
n-log-n Time#
\(O(n \log n)\): efficient comparison-based sorting algorithms, such as merge sort, run in n-log-n time.
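As a sketch of why sorting can be done in \(O(n \log n)\) time, here is a simple merge sort; it is an illustrative implementation, not Python's built-in sort (which is also \(O(n \log n)\)):

def merge_sort(data):
    """Sort a list in O(n log n) time by recursive splitting and merging."""
    if len(data) <= 1:
        return data
    mid = len(data) // 2
    left = merge_sort(data[:mid])    # sort each half recursively
    right = merge_sort(data[mid:])
    merged = []                      # merge the two sorted halves in O(n)
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])          # append whatever remains of either half
    merged.extend(right[j:])
    return merged

The recursion has \(O(\log n)\) levels, and the merging at each level takes \(O(n)\) time in total, giving \(O(n \log n)\) overall.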
Quadratic Time#
\(O(n^2)\): the running time of a quadratic-time algorithm grows quadratically with the size of the problem.
An example is to find the prefix averages of a sequence of numbers.
Given a sequence \(S\) consisting of \(n\) numbers, we want to calculate a sequence \(A\) such that \(A[j]\) is the average of elements \(S[0], S[1], \ldots, S[j]\), for \(j = 0, 1, \ldots, n-1\), that is,
\(A[j] = \frac{\sum_{i=0}^{j}S[i]}{j+1}\)
The following algorithm, prefix_average1, computes every element of \(A\) separately, using an inner loop to compute each partial sum; it has a complexity of \(O(n^2)\).
L2 executes in constant time. L3 initializes a Python list of length \(n\) with entries of 0; this uses a constant number of primitive operations per element and thus runs in \(O(n)\) time. L4 is an outer loop that executes \(n\) times. The body of the outer loop in L5 and L8 executes in constant time. L6 and L7 form an inner loop that executes \(j+1\) times, depending on the current counter \(j\). Thus L7 executes \(1 + 2 + 3 + \cdots + n = \frac{n(n+1)}{2}\) times in total within the outer loop, which is \(O(n^2)\) for L4-L7. Thus, the code in total is \(O(1) + O(n) + O(n^2)\), which is considered \(O(n^2)\).
1 def prefix_average1(S):
2     n = len(S)
3     A = [0.0] * n
4     for j in range(n):
5         total = 0
6         for i in range(j+1):
7             total += S[i]
8         A[j] = total / (j+1)
9     return A
The above algorithm can be improved to \(O(n)\), as shown in prefix_average2.
L2 and L4 are constant time. L3 runs in \(O(n)\) time. L5 executes \(n\) times, with constant-time operations in L6 and L7. Thus, the code in total runs in \(O(n)\) time.
1 def prefix_average2(S):
2     n = len(S)
3     A = [0.0] * n
4     total = 0
5     for j in range(n):
6         total += S[j]
7         A[j] = total / (j+1)
8     return A
Cubic Time#
\(O(n^3)\): an example of a cubic-time algorithm is a naive algorithm to solve the three-way set disjointness problem.
Suppose we are given three sequences of numbers, \(A, B, C\).
We need to determine if the intersection of the three sequences is empty, namely, that there is no \(x\) such that \(x \in A, x \in B\) and \(x \in C\).
A naive way to solve this problem is shown in disjoint1 as follows; its complexity is \(O(n^3)\) if \(A, B, C\) all have the same size \(n\).
1 def disjoint1(A, B, C):
2     for a in A:
3         for b in B:
4             for c in C:
5                 if a == b == c:
6                     return False
7     return True
We can improve the complexity to \(O(n^2)\) using disjoint2.
The idea is that we do not need to loop over \(C\) if we already know \(a \neq b\), which saves computation time.
L2 manages an outer loop over \(A\), which requires \(O(n)\) time. L3 loops over \(B\), which accounts for a total of \(O(n^2)\) time, since that loop is executed \(n\) different times. L4 is executed \(O(n^2)\) times. The time spent in L5-L7 depends on the number of matching pairs \((a, b)\); assuming each sequence contains distinct elements, there are at most \(n\) such pairs, and for each pair the loop over \(C\) runs at most \(n\) times. Therefore, this block is executed at most \(O(n^2)\) times, and the whole algorithm runs in \(O(n^2)\) time.
1 def disjoint2(A, B, C):
2     for a in A:
3         for b in B:
4             if a == b:
5                 for c in C:
6                     if a == c:
7                         return False
8     return True
Exponential Time#
\(O(2^n)\): exponential time, where the running time at least doubles with each additional element of input. A classic example is the Towers of Hanoi puzzle, which requires exactly \(2^n - 1\) moves to transfer \(n\) disks.
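A minimal sketch of the standard recursive solution: it makes two recursive calls on \(n-1\) disks plus one move, so its move count satisfies \(T(n) = 2T(n-1) + 1 = 2^n - 1\), which is \(O(2^n)\).

def hanoi(n, source, target, spare):
    """Print the 2**n - 1 moves that transfer n disks from source to target."""
    if n == 0:
        return
    hanoi(n - 1, source, spare, target)  # move the top n-1 disks out of the way
    print(f"move disk {n} from {source} to {target}")
    hanoi(n - 1, spare, target, source)  # move them back on top of disk n

For example, hanoi(3, 'A', 'C', 'B') prints the \(2^3 - 1 = 7\) required moves.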