Give the proper definition for entropy and information rate.
1 Answer
Entropy: The average information per message of a source $m$ is called its entropy, denoted by $H(m)$. Hence,
$H(m) = \sum^n_{i=1} p_i I_i$ bits

$= \sum^n_{i=1} p_i \log_2 \left(\frac{1}{p_i}\right)$ bits

$= -\sum^n_{i=1} p_i \log_2 p_i$ bits

where $p_i$ is the probability of the $i^{th}$ message and $I_i = \log_2(1/p_i)$ is the information carried by that message.

Information rate: If the source emits messages at a rate of $r$ messages per second, the information rate $R$ of the source is the average number of bits of information per second:

$R = r \, H(m)$ bits/second
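As a quick illustration, here is a minimal Python sketch of both formulas. The probability distribution and the message rate `r` below are made-up example values, not taken from the original answer:

```python
import math

def entropy(probs):
    """Entropy H(m) in bits: -sum(p_i * log2(p_i)) over the message probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical source: four messages with these (illustrative) probabilities.
probs = [0.5, 0.25, 0.125, 0.125]

H = entropy(probs)   # average information per message, in bits
r = 100              # assumed message rate, in messages per second
R = r * H            # information rate, in bits per second

print(f"H(m) = {H} bits/message")   # -> H(m) = 1.75 bits/message
print(f"R    = {R} bits/second")    # -> R    = 175.0 bits/second
```

Note that the entropy is maximized when all messages are equally likely; skewed distributions like the one above carry less average information per message than the $\log_2 4 = 2$ bits an equiprobable four-message source would give.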