**1 Answer**

**Entropy and information rate**


**1] Entropy :** Entropy is defined as the average information per message. It is denoted by H and its unit is bits/message.

The entropy must be as high as possible in order to ensure maximum transfer of information.

$$Entropy, \; H = \sum^m_{k=1} p_k \log_2 \left(\frac{1}{p_k}\right) \; \text{bits/message}$$
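As a short sketch, the formula above can be computed directly; the function name `entropy` and the example probabilities are illustrative choices, not from the source:

```python
import math

def entropy(probabilities):
    """Average information per message: H = sum of p_k * log2(1/p_k), in bits/message.

    Terms with p_k = 0 contribute nothing (p * log2(1/p) -> 0 as p -> 0),
    so they are skipped to avoid a division-by-zero error.
    """
    return sum(p * math.log2(1.0 / p) for p in probabilities if p > 0)

# Four equally likely messages (p_k = 1/4 each): H = log2(4) = 2 bits/message,
# the maximum possible for four messages, consistent with the note that
# entropy is highest when messages are equally probable.
print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0
```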

**2] Information Rate (R) :**

If the source generates $r$ messages per second, then the information rate is given as

**R = r x H**

where r $\rightarrow$ Number of messages/sec

H $\rightarrow$ Average information/message

$$R = \left[ r \; \frac{\text{messages}}{\text{second}}\right] \times \left[ H \; \frac{\text{bits}}{\text{message}}\right]$$

R $\rightarrow$ Average information per second, expressed in bits/sec
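The unit cancellation above (messages/sec × bits/message = bits/sec) can be sketched as a small helper; the function names and the example source (a binary source emitting 100 messages/sec) are assumptions for illustration:

```python
import math

def entropy(probabilities):
    """H in bits/message (see the entropy definition above)."""
    return sum(p * math.log2(1.0 / p) for p in probabilities if p > 0)

def information_rate(r, probabilities):
    """R = r * H: (messages/sec) * (bits/message) = bits/sec."""
    return r * entropy(probabilities)

# Hypothetical source: two equally likely messages (H = 1 bit/message)
# emitted at r = 100 messages/sec, so R = 100 * 1 = 100 bits/sec.
print(information_rate(100, [0.5, 0.5]))  # 100.0
```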
