Give the proper definition for entropy and information rate.
1 Answer

Entropy: The average information per message of a source $m$ is called its entropy, denoted by $H(m)$. Hence,

$H(m) = \sum_{i=1}^{n} p_i I_i$ bits

$= \sum_{i=1}^{n} p_i \log_2 \left(\frac{1}{p_i}\right)$ bits

$= -\sum_{i=1}^{n} p_i \log_2 p_i$ bits

where $p_i$ is the probability of the $i$-th message and $I_i = \log_2(1/p_i)$ is the information content of that message.
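For example, a source emitting $n = 4$ equally likely messages ($p_i = \tfrac{1}{4}$) has entropy

$H(m) = -\sum_{i=1}^{4} \frac{1}{4} \log_2 \frac{1}{4} = 4 \times \frac{1}{4} \times 2 = 2$ bits per message,

which matches the intuition that two binary digits suffice to identify one of four messages.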

Information rate: If the source generates messages at a rate of $r$ messages per second, the information rate $R$ is the average number of information bits produced per second:

$R = r \, H(m)$ bits/second.
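Continuing the example above: if the four-message source emits $r = 100$ messages per second, then $R = 100 \times 2 = 200$ bits per second.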

