Entropy and information rate.
1 Answer
1] Entropy: Entropy is defined as the average information per message. It is denoted by H and its unit is bits/message.
The entropy must be as high as possible in order to ensure maximum transfer of information.
$$\text{Entropy, } H = \sum_{k=1}^{m} p_k \log_2\left(\frac{1}{p_k}\right) \ \text{bits/message}$$

where $p_k$ is the probability of the $k^{th}$ message and $m$ is the total number of messages.
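As a quick illustration of this formula, here is a minimal Python sketch; the function name `entropy` and the example probabilities are assumptions for demonstration, not part of the original answer:

```python
import math

def entropy(probabilities):
    """Average information per message, H = sum(p_k * log2(1/p_k)), in bits/message."""
    # Messages with p_k = 0 contribute nothing to the sum (handled by the filter below).
    return sum(p * math.log2(1.0 / p) for p in probabilities if p > 0)

# Example: a source emitting four messages with probabilities 1/2, 1/4, 1/8, 1/8
print(entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75 bits/message

# Entropy is maximum when all messages are equally likely: log2(4) = 2 bits/message
print(entropy([0.25] * 4))                 # 2.0 bits/message
```

The second call shows the point made above: for a fixed number of messages, entropy (and hence information transfer) is highest when the messages are equally probable.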