Asymptotic Notation 

Exact complexity analysis of an algorithm is very hard. We know that the complexity (worst, best, or average case) of an algorithm is a mathematical function of the input size, so it is easier to analyze the algorithm in terms of bounds (upper and lower). For this purpose we need the concept of asymptotic notations. The figure below illustrates the upper and lower bound concept. 


Big Oh (O) notation 

When we have only an asymptotic upper bound, we use O notation. A function f(x) = O(g(x)) (read as "f(x) is big oh of g(x)") iff there exist two positive constants c and x0 such that for all 
x >= x0, 0 <= f(x) <= c*g(x) 


Asymptotic Notation of Big O
The above relation says that g(x) is an upper bound of f(x) 


Some properties: 
Transitivity: f(x) = O(g(x)) and g(x) = O(h(x)) imply f(x) = O(h(x)) 
Reflexivity: f(x) = O(f(x)) 
O(1) is used to denote functions bounded by a constant.
For all values of n >= n0, the plot clearly shows that f(n) lies on or below the curve of c*g(n) 
Examples 
Eg 1:-   Let f(n) = 3n^2 + 4n + 7 and 
g(n) = n^2; prove that f(n) = O(g(n)). 
Proof: Let us choose c and n0 as 14 and 1, respectively. Then we have 
f(n) <= c*g(n) for all n >= n0, since 
3n^2 + 4n + 7 <= 14*n^2 for all n >= 1 
(each term on the left is at most its counterpart in 14n^2 = 3n^2 + 4n^2 + 7n^2 when n >= 1). 
Hence f(n) = O(g(n)). 
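The chosen witnesses c = 14 and n0 = 1 can be sanity-checked numerically (this is an illustration over a finite range, not a proof) with a short sketch like this:

```python
# Check that 3n^2 + 4n + 7 <= 14*n^2 for all sampled n >= 1,
# using the witnesses c = 14, n0 = 1 from the proof above.
def f(n):
    return 3 * n**2 + 4 * n + 7

def g(n):
    return n**2

c, n0 = 14, 1
assert all(f(n) <= c * g(n) for n in range(n0, 10_000))
```

Note that equality holds exactly at n = 1 (both sides are 14), which is why c = 14 is the smallest integer constant that works with n0 = 1.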
Eg 2:-   Prove that n log (n^3) is O(√(n^3)). 
Proof: We have n log (n^3) = 3n log n. 
Also, √(n^3) = n √n. 
So if we can prove log n = O(√n), the problem is solved,
because then 3n log n = 3n * O(√n) = O(n √n). 
Recall the fact that log^a n is O(n^b) for all a, b > 0. 
In our problem a = 1 and b = 1/2, 
hence log n = O(√n). 
So, knowing log n = O(√n), we have proved that 
n log (n^3) = O(√(n^3)).
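The claim can also be illustrated numerically: the ratio n log(n^3) / √(n^3) = 3 log(n) / √n shrinks as n grows, which is what log n = O(√n) predicts. A minimal sketch (a finite-sample illustration, not a proof):

```python
import math

# Ratio of n*log(n^3) to sqrt(n^3); if n*log(n^3) = O(sqrt(n^3)),
# this ratio must stay bounded, and here it in fact decreases.
def ratio(n):
    return (n * math.log(n**3)) / math.sqrt(n**3)

samples = [10, 100, 1_000, 10_000, 100_000]
values = [ratio(n) for n in samples]

# The ratio is strictly decreasing over these samples.
assert all(a > b for a, b in zip(values, values[1:]))
```

Calculus confirms the trend: 3 log(n)/√n is decreasing for all n > e^2, so the ratio tends to 0.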

Big Omega (Ω) notation 

Big omega notation gives an asymptotic lower bound. A function f(x) = Ω(g(x)) (read as "f(x) is big omega of g(x)") iff there exist two positive constants c and x0 such that for all x >= x0, 0 <= c*g(x) <= f(x). 
Asymptotic Notation of Big Omega
The above relation says that g(x) is a lower bound of f(x). 

Some properties: 

Transitivity: f(x) = Ω(g(x)) and g(x) = Ω(h(x)) imply f(x) = Ω(h(x)) 
Reflexivity: f(x) = Ω(f(x))

Examples 
Eg 1:-   Let f(n) = 3n^2 + 4n + 7 and 
g(n) = n^2; prove that f(n) = Ω(g(n)). 
Proof: Let us choose c and n0 as 1 and 1, respectively. Then we have 
f(n) >= c*g(n) for all n >= n0, since 
3n^2 + 4n + 7 >= 1*n^2 for all n >= 1. 
The above inequality clearly holds. 
Hence f(n) = Ω(g(n)).
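The lower-bound witnesses c = 1 and n0 = 1 can be sanity-checked the same way as in the big-O example (a finite-range illustration, not a proof):

```python
# Check that 3n^2 + 4n + 7 >= 1*n^2 for all sampled n >= 1,
# using the witnesses c = 1, n0 = 1 from the proof above.
c, n0 = 1, 1
assert all(3 * n**2 + 4 * n + 7 >= c * n**2 for n in range(n0, 10_000))
```

Here the inequality has plenty of slack: the left side is already 3n^2 plus positive terms, so any c <= 3 would also serve as a witness.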


Big Theta (Θ) notation 

When we need an asymptotically tight bound, we use Θ notation. A function f(x) = Θ(g(x)) (read as "f(x) is big theta of g(x)") iff there exist three positive constants c1, c2 and x0 such that for all x >= x0, c1*g(x) <= f(x) <= c2*g(x) 
Asymptotic Notation of Big Theta
The above relation says that f(x) is order of g(x)

Some properties: 

Transitivity: f(x) = Θ(g(x)) and g(x) = Θ(h(x)) imply f(x) = Θ(h(x)) 
Reflexivity: f(x) = Θ(f(x))
Symmetry: f(x) = Θ(g(x)) iff g(x) = Θ(f(x))
Example 1.  
Let f(n) = 3n^2 + 4n + 7 and 
g(n) = n^2; prove that f(n) = Θ(g(n)). 
Proof: Let us choose c1, c2 and n0 as 1, 14 and 1, respectively. Then we have
c1*g(n) <= f(n) for n >= n0, as 
1*n^2 <= 3n^2 + 4n + 7, and 
f(n) <= c2*g(n) for n >= n0, as 3n^2 + 4n + 7 <= 14*n^2, 
for all n >= 1 (in both cases). 
So c1*g(n) <= f(n) <= c2*g(n) holds for all n >= n0. 
Hence f(n) = Θ(g(n)).
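Since Θ sandwiches f(n) between two multiples of g(n), both directions can be checked at once with the constants 1 (lower) and 14 (upper) used above (again a finite-range illustration, not a proof):

```python
# Check the two-sided bound 1*n^2 <= 3n^2 + 4n + 7 <= 14*n^2
# for all sampled n >= 1, i.e. both Theta inequalities together.
def f(n):
    return 3 * n**2 + 4 * n + 7

lower, upper, n0 = 1, 14, 1
assert all(lower * n**2 <= f(n) <= upper * n**2 for n in range(n0, 10_000))
```

This matches the fact that proving Θ amounts to proving O and Ω simultaneously with the same n0.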


Arogya Thapa Magar
