Define $O$, $\Omega$, and $\Theta$ notations. Find the complexity of the given recurrence relations.

(i) $T(n) = 4T(n/2) +n^2$

(ii) $T(n) = 2T (n/2) +n^3$

1 Answer

Big Oh Notation:

  1. Definition: A theoretical measure of the execution of an algorithm, usually the time or memory needed, given the problem size $n$, which is usually the number of items. Informally, saying $f(n) = O(g(n))$ means that $f(n)$ is at most some constant multiple of $g(n)$ for large $n$.
  2. Consider an algorithm whose efficiency is

    $f(n) = 8.023n^2 + 7n$

    At very large values of $n$, we may write

    $f(n) \propto n^2$

  3. If we designate $n^2$ as $g(n)$, we may say that as $f(n)$ grows,

    $f(n) \propto g(n)$

  4. The same fact can also be stated as: $f(n)$ is $O(g(n))$, read as "$f(n)$ is Big-Oh of $g(n)$".
  5. Mathematically, it means that there exists a positive real constant $c$ such that

    $f(n) \le c \cdot g(n)$, at very large values of $n$.

  6. Consider the following example, where

    $f(n) = n^2 + 100n \\ n^2 + 100n \le 2n^2 \text{ for all } n \ge 100$

    so $f(n)$ is $O(n^2)$, with $g(n) = n^2$ and $c = 2$ (checked numerically in the sketch after this list).

  7. Hence, as $n$ increases, $f(n)$ falls further and further below this constant multiple of $g(n)$.
  8. We may even say that $f(n)$ is bounded by $g(n)$ from above,

    i.e., $f(n)$ is asymptotically bounded above by $g(n)$.

  9. Consider an algorithm whose efficiency is

    $f(n) = 8.023n^2 + 7n + 2.863n^{2/5} + 4.981n^{5/9} + 2.87n^{8/9}$

  10. Now, instead of expressing the efficiency using this complex mathematical expression, we may simply write:
  11. Efficiency (complexity): $f(n)$ is $O(n^2)$.
  12. The simplification of the analysis from $f(n)$ to $g(n)$ is called Big-Oh analysis.
  13. Hence, simplification is an advantage of Big-Oh notation.
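
As mentioned in item 6, the witness pair $c = 2$, $n_0 = 100$ can be checked numerically. A minimal Python sketch (the names `f`, `g`, `c`, and `n0` are ours, chosen for illustration):

```python
# Sketch: numerically check the Big-Oh witness c = 2, n0 = 100
# for f(n) = n^2 + 100n and g(n) = n^2.

def f(n):
    return n * n + 100 * n

def g(n):
    return n * n

c, n0 = 2, 100

# f(n) <= c*g(n) must hold for every n >= n0.
assert all(f(n) <= c * g(n) for n in range(n0, 100000))

# The ratio f(n)/g(n) approaches 1 from above as n grows,
# so it stays below the constant c = 2.
for n in (100, 1000, 10000):
    print(n, f(n) / g(n))   # 2.0, 1.1, 1.01
```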

Omega Notation:

  1. Definition: $f(n)$ is said to be $\Omega(g(n))$ if and only if there exists a positive real constant $c$ such that $f(n) \ge c \cdot g(n)$ for infinitely many values of $n$. The $\Omega$ notation describes an asymptotic lower bound.
  2. The Big-Omega notation always specifies a lower bound on the amount of resources required. Hence, if $f(n)$ is $\Omega(g(n))$, then mathematically it means that there exists a positive real constant $c$ such that $f(n) \ge c \cdot g(n)$ at very large values of $n$.
  3. Consider an algorithm whose efficiency is

    $f(n) = 3n^2 - 100n$

    Because $3n^2 - 100n \ge 2n^2 \iff n^2 \ge 100n \iff n \ge 100$, we have that $f(n)$ is $\Omega(n^2)$, with $c = 2$.

  4. Hence the amount of resources required will never be less than $2n^2$ for sufficiently large $n$ (checked numerically in the sketch after this list).

Theta Notation:

  1. Definition: $f(n)$ is said to be $\Theta(g(n))$ if and only if $f(n)$ is both $O(g(n))$ and $\Omega(g(n))$, i.e., there exist positive real constants $c_1$ and $c_2$ such that $c_1 \cdot g(n) \le f(n) \le c_2 \cdot g(n)$ at very large values of $n$. The $\Theta$ notation describes an asymptotically tight bound.
  2. For example, $f(n) = 3n^2 - 100n$ is both $O(n^2)$ and $\Omega(n^2)$, and is therefore $\Theta(n^2)$.
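
The following Python sketch (illustrative only; `f`, `g`, and the constants are ours) verifies numerically that $f(n) = 3n^2 - 100n$ is squeezed between $2n^2$ and $3n^2$ once $n \ge 100$, confirming both the $\Omega(n^2)$ bound referenced in item 4 above and the $\Theta(n^2)$ claim:

```python
# Sketch: check c1*g(n) <= f(n) <= c2*g(n) for f(n) = 3n^2 - 100n,
# g(n) = n^2, with c1 = 2, c2 = 3, n0 = 100 -- i.e. f(n) is Theta(n^2).

def f(n):
    return 3 * n * n - 100 * n

def g(n):
    return n * n

c1, c2, n0 = 2, 3, 100
assert all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, 100000))
print("f(n) is Omega(n^2) and O(n^2), hence Theta(n^2)")
```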

Problems:

  1. $T(n) = 4T(n/2) + n^2$

    Substituting $n \to n/2$ and multiplying by 4:

    $4T(n/2) = 4^2 T(n/2^2) + 4(n/2)^2 = 4^2 T(n/2^2) + n^2$

    Substituting $n \to n/2^2$ and multiplying by $4^2$:

    $4^2 T(n/2^2) = 4^3 T(n/2^3) + 4^2 (n/2^2)^2 = 4^3 T(n/2^3) + n^2$

    Continuing in this way, at the $k$-th step:

    $4^{k-1} T(n/2^{k-1}) = 4^k T(n/2^k) + n^2$

    Adding the equations up to $k = \log n$ (i.e. $n = 2^k$), the intermediate terms cancel and each of the $\log n$ steps contributes one $n^2$. Since $4^k = (2^k)^2 = n^2$:

    $T(n) = 4^k T(n/2^k) + n^2 \log n = n^2 \, T(1) + n^2 \log n$

    With $T(1) = c$ a constant,

    $T(n) = c n^2 + n^2 \log n = \Theta(n^2 \log n)$

    A numerical check of this closed form appears in the sketch below.
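
A minimal sketch of that check, assuming the base case $T(1) = c = 1$ (the function name `T` is ours, chosen for illustration):

```python
from functools import lru_cache

# Sketch: verify that T(n) = 4T(n/2) + n^2 gives T(n) = c*n^2 + n^2*log2(n)
# for n a power of two, assuming the base case T(1) = c = 1.

@lru_cache(maxsize=None)
def T(n):
    if n == 1:
        return 1          # T(1) = c = 1 (assumed base case)
    return 4 * T(n // 2) + n * n

for k in range(1, 15):
    n = 2 ** k
    closed_form = n * n + n * n * k      # c*n^2 + n^2*log2(n), with c = 1
    assert T(n) == closed_form
print("T(n) = n^2 + n^2*log2(n) = Theta(n^2 log n) verified")
```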

2) $T(n) = 2T(n/2) + n^3$

Let the base-two representation of $n$ be

$n = \sum_{k=0}^{\log n} d_k 2^k, \quad d_k \in \{0, 1\}$

Then we can unroll the recurrence to obtain the following exact formula for $n \ge 1$:

$T(n) = \sum_{j=0}^{\log n} 2^j \left( \sum_{k=j}^{\log n} d_k 2^{k-j} \right)^3$
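
Since the inner sum $\sum_{k=j}^{\log n} d_k 2^{k-j}$ is exactly $\lfloor n/2^j \rfloor$, the formula is easy to verify numerically. A minimal Python sketch, assuming the base case $T(1) = 1$ and integer halving $T(n) = 2T(\lfloor n/2 \rfloor) + n^3$:

```python
# Sketch: verify the unrolled exact formula
#   T(n) = sum_{j=0}^{floor(log2 n)} 2^j * floor(n/2^j)^3
# against direct evaluation of T(n) = 2*T(n//2) + n^3, T(1) = 1.

def T(n):
    return 1 if n == 1 else 2 * T(n // 2) + n ** 3

def unrolled(n):
    total, j = 0, 0
    while n >> j:                         # j runs from 0 to floor(log2 n)
        total += 2 ** j * (n >> j) ** 3   # 2^j * floor(n / 2^j)^3
        j += 1
    return total

assert all(T(n) == unrolled(n) for n in range(1, 5000))
print("exact formula verified for 1 <= n < 5000")
```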

Now, to get an upper bound, consider a string of one digits (every $d_k = 1$), which yields

$T(n) \le \sum_{j=0}^{\log n} 2^j \left( \sum_{k=j}^{\log n} 2^{k-j} \right)^3$

Note that this bound is attained (when every binary digit of $n$ is a one) and cannot be improved. It simplifies to

$\frac{32}{3} \cdot 8^{\log n} - 24 \cdot 4^{\log n} + \left(6 \log n + \frac{40}{3}\right) \cdot 2^{\log n} + 1$
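
For completeness, one way to carry out this simplification uses $\sum_{k=j}^{\log n} 2^{k-j} = 2^{\log n - j + 1} - 1$, the binomial expansion $(a-1)^3 = a^3 - 3a^2 + 3a - 1$, and four geometric sums (writing $L = \log n$):

$T(n) \le \sum_{j=0}^{L} 2^j \left(2^{L-j+1} - 1\right)^3 \\ = \sum_{j=0}^{L} 2^j \left(8 \cdot 8^{L-j} - 12 \cdot 4^{L-j} + 6 \cdot 2^{L-j} - 1\right) \\ = 8\left(\tfrac{4}{3} 8^L - \tfrac{1}{3} 2^L\right) - 12\left(2 \cdot 4^L - 2^L\right) + 6(L+1) 2^L - \left(2 \cdot 2^L - 1\right) \\ = \tfrac{32}{3} 8^L - 24 \cdot 4^L + \left(6L + \tfrac{40}{3}\right) 2^L + 1$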

The lower bound corresponds to the case of a one followed by a string of zeros (only $d_{\log n} = 1$) and yields

$T(n) \ge \sum_{j=0}^{\log n} 2^j \left( 2^{\log n - j} \right)^3$

It simplifies to

$\frac{4}{3} \cdot 8^{\log n} - \frac{1}{3} \cdot 2^{\log n}$
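
This follows from a single geometric sum (again writing $L = \log n$):

$T(n) \ge \sum_{j=0}^{L} 2^j \left(2^{L-j}\right)^3 = 8^L \sum_{j=0}^{L} 4^{-j} = 8^L \cdot \frac{1 - 4^{-(L+1)}}{1 - \frac{1}{4}} = \frac{4}{3} 8^L - \frac{1}{3} 2^L$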

This matches the result given by other standard methods, such as the Master Theorem.

Joining the dominant terms of the upper and the lower bound, we obtain the asymptotic behaviour:

$T(n) \in \Theta(8^{\log n}) = \Theta(2^{3 \log n}) = \Theta(n^3)$
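
Finally, the $\Theta(n^3)$ conclusion can be sanity-checked numerically. A minimal sketch (again assuming $T(1) = 1$; for $n$ a power of two, the lower-bound formula above is exact):

```python
# Sketch: for n a power of two and T(1) = 1, the lower-bound formula
# is exact: T(n) = (4/3)n^3 - (1/3)n.  The ratio T(n)/n^3 therefore
# tends to the constant 4/3, i.e. T(n) = Theta(n^3).

def T(n):
    return 1 if n == 1 else 2 * T(n // 2) + n ** 3

for k in range(1, 12):
    n = 2 ** k
    assert 3 * T(n) == 4 * n ** 3 - n     # exact closed form
    print(n, T(n) / n ** 3)               # approaches 4/3
```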
