ASYMPTOTIC ANALYSIS
- Asymptotic analysis of an algorithm refers to defining the mathematical framing of its run-time performance. Using asymptotic analysis, we can conclude the best-case, average-case, and worst-case scenarios of an algorithm.
- (Recall that a set of instructions for solving a problem is known as an algorithm.)
- Asymptotic analysis is input bound, i.e., if there is no input to the algorithm, it is concluded to work in constant time. Apart from the input, all other factors are considered constant.
- Asymptotic analysis refers to computing the running time of any operation in mathematical units of computation.
- For example, the running time of one operation may be computed as f(n), while for another operation it may be computed as g(n²).
- This means the running time of the first operation will increase linearly as n increases, while the running time of the second operation will increase quadratically. Similarly, the running times of both operations will be nearly the same when n is small.
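The contrast can be made concrete with a small Python sketch. Here f and g are hypothetical cost models (not measurements of real operations), chosen to match the linear and quadratic growth just described:

```python
# Hypothetical cost models: f(n) grows linearly, g(n) quadratically.
def f(n):
    return n          # linear: doubles when n doubles

def g(n):
    return n * n      # quadratic: quadruples when n doubles

for n in (2, 10, 100, 1000):
    print(f"n={n:>5}  f(n)={f(n):>8}  g(n)={g(n):>10}")
```

For small n (say n = 2) the two values are close, but the gap widens rapidly as n grows.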
- The time required by an algorithm falls under three types:
- Best Case: the minimum time required for program execution.
- Average Case: the average time required for program execution.
- Worst Case: the maximum time required for program execution.
Types of Algorithm Complexity:
Best Case:
- In the best case analysis, we calculate the lower bound of the execution time of an algorithm. It is necessary to know the case which causes the execution of the minimum number of operations. In the linear search problem, the best case occurs when x is present at the first location.
- The number of operations in the best case is constant (it does not depend on n), so the best-case time complexity is Θ(1). Most of the time, however, we perform worst-case analysis. In worst-case analysis, we guarantee an upper bound on the execution time of an algorithm, which is useful information.
Average Case:
- In the average case analysis, we take all possible inputs and calculate the computation time for each. We then add up all the calculated values and divide the sum by the total number of inputs.
- We need to predict the distribution of cases. For the linear search problem, assume that all cases are uniformly distributed. So we add all the cases and divide the sum by (n + 1).
Worst Case:
- In the worst-case analysis, we calculate the upper limit of the execution time of an algorithm. It is necessary to know the case which causes the execution of the maximum number of operations.
- For linear search, the worst case occurs when the element to search for is not present in the array. When x is not present, the search() function compares it with all the elements of arr[] one by one. Therefore, the worst-case time complexity of linear search is Θ(n).
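A sketch of such a linear search in Python, instrumented with a comparison counter (the counter and the returned pair are added for illustration; they are not part of the usual search() signature):

```python
def search(arr, x):
    """Linear search: returns (index, comparisons); index is -1 if x is absent."""
    comparisons = 0
    for i, value in enumerate(arr):
        comparisons += 1
        if value == x:
            return i, comparisons   # best case: x at index 0 -> 1 comparison
    return -1, comparisons          # worst case: x absent -> n comparisons
```

Searching for the first element costs one comparison (the Θ(1) best case); searching for an absent element costs n comparisons (the Θ(n) worst case).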
For some algorithms, all cases are asymptotically the same; that is, there are no distinct
worst and best cases. For example, merge sort performs Θ(n log n) operations in all cases.
Most other sorting algorithms do have distinct worst and best cases.
For example, in the typical quicksort implementation, the worst case occurs when the input array
is already sorted, and the best case occurs when the pivot elements always divide the array into two halves.
For insertion sort, the worst case occurs when the array is sorted in reverse order, and the best
case occurs when the array is already sorted in the same order as the output.
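The insertion-sort cases can be checked empirically. A minimal sketch with an added comparison counter (the counter is for illustration only):

```python
def insertion_sort(arr):
    """Insertion sort; returns (sorted copy, number of element comparisons)."""
    a = list(arr)
    comparisons = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0:
            comparisons += 1
            if a[j] > key:
                a[j + 1] = a[j]   # shift larger element right
                j -= 1
            else:
                break
        a[j + 1] = key
    return a, comparisons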
ASYMPTOTIC NOTATIONS
- Asymptotic notations are the mathematical notations used to describe the running time of an algorithm when the input tends towards a particular value or a limiting value.
- For example: In bubble sort, when the input array is already sorted, the time taken by the algorithm is linear i.e. the best case.
- But when the input array is in reverse order, the algorithm takes the maximum (quadratic) time to sort the elements, i.e., the worst case. When the input array is neither sorted nor in reverse order, it takes average time.
- These durations are denoted using asymptotic notations.
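A sketch of bubble sort with the early-exit flag that produces the linear best case described above (the pass counter is added for illustration):

```python
def bubble_sort(arr):
    """Bubble sort with an early-exit flag; returns (sorted copy, passes made)."""
    a = list(arr)
    passes = 0
    for i in range(len(a) - 1):
        passes += 1
        swapped = False
        for j in range(len(a) - 1 - i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
                swapped = True
        if not swapped:
            break  # no swaps: array already sorted, stop early (best case)
    return a, passes
```

A sorted input finishes in a single pass (linear time); a reverse-sorted input needs n − 1 passes (quadratic time overall).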
Following are the commonly used asymptotic notations to calculate the running time
complexity of an algorithm.
- Big Oh Notation (O)
- Omega Notation (Ω)
- Big Theta Notation (θ)
Big Oh Notation (O)
- The notation O(n) is the formal way to express the upper bound of an algorithm's running time. It measures the worst-case time complexity, i.e., the longest amount of time an algorithm can possibly take to complete.
- Let's see an example:
- For example, for a function f(n):
- O(f(n)) = { g(n) : there exist c > 0 and n0 such that g(n) ≤ c·f(n) for all n > n0 }
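The definition can be probed numerically. The sketch below checks the inequality over a finite range only, so it is evidence rather than a proof; the functions and constants in the example are hypothetical:

```python
def is_bounded_above(g, f, c, n0, n_max=10_000):
    """Check g(n) <= c * f(n) for all n0 < n <= n_max (finite-range evidence)."""
    return all(g(n) <= c * f(n) for n in range(n0 + 1, n_max + 1))

# Hypothetical witness: g(n) = 3n + 5 is O(n), using c = 4 and n0 = 5,
# since 3n + 5 <= 4n whenever n >= 5.
```

By contrast, no constant c makes n² ≤ c·n hold for all large n, which is why n² is not O(n).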
Omega Notation (Ω):
- The notation Ω(n) is the formal way to express the lower bound of an algorithm's running time. It measures the best-case time complexity, i.e., the minimum amount of time an algorithm can possibly take to complete.
- Let's see an example:
- For a function f(n): Ω(f(n)) = { g(n) : there exist c > 0 and n0 such that c·f(n) ≤ g(n) for all n > n0 }
Theta Notation (θ):
- The notation θ(n) is the formal way to express both the lower bound and the upper bound of an algorithm's running time.
- It is represented as follows:
- θ(f(n)) = { g(n) : g(n) = O(f(n)) and g(n) = Ω(f(n)) }
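Combining the two bounds gives a finite-range Θ check in the same spirit as the Big-O sketch above (again evidence only, with hypothetical functions and constants):

```python
def is_theta(g, f, c1, c2, n0, n_max=10_000):
    """Check c1 * f(n) <= g(n) <= c2 * f(n) for all n0 < n <= n_max."""
    return all(c1 * f(n) <= g(n) <= c2 * f(n)
               for n in range(n0 + 1, n_max + 1))

# Hypothetical witness: g(n) = 2n + 3 is Theta(n) with c1 = 2, c2 = 3, n0 = 3,
# since 2n <= 2n + 3 always, and 2n + 3 <= 3n whenever n >= 3.
```

The same helper shows n² is not Θ(n): the lower bound holds, but no constant c2 keeps n² below c2·n for all large n.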