Entropy and information rate.
1 Answer

1] Entropy : Entropy is defined as the average information per message. It is denoted by H and its unit is bits/message.

The entropy must be as high as possible in order to ensure maximum transfer of information.

$$\text{Entropy, } H = \sum_{k=1}^{m} p_k \log_2 \frac{1}{p_k} \ \ \text{bits/message}$$

where $p_k$ is the probability of the $k^{th}$ message and $m$ is the number of possible messages.

2] Information rate : The information rate is the average information conveyed per second. If the source emits messages at a rate of $r$ messages/second, then

$$\text{Information rate, } R = r \times H \ \ \text{bits/second}$$
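To make the two formulas concrete, here is a minimal Python sketch. The source probabilities and the message rate r = 1000 messages/second used below are hypothetical values chosen for illustration, not taken from the question.

```python
import math

def entropy(probabilities):
    # H = sum over k of p_k * log2(1 / p_k), in bits/message
    return sum(p * math.log2(1.0 / p) for p in probabilities if p > 0)

def information_rate(probabilities, r):
    # R = r * H, where r is the message rate in messages/second
    return r * entropy(probabilities)

# Hypothetical source emitting four messages with these probabilities
p = [0.5, 0.25, 0.125, 0.125]

H = entropy(p)                      # 1.75 bits/message
R = information_rate(p, r=1000)     # 1750.0 bits/second at r = 1000 messages/second

print(f"H = {H} bits/message")
print(f"R = {R} bits/second")
```

Note that messages with zero probability contribute nothing to H, which is why the sketch skips them before taking the logarithm.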
