What do you mean by entropy? Derive the equation for entropy.
1 Answer

Entropy:

Entropy is defined as the average information per message. It is denoted by $H$ and its units are bits/message. For efficient communication the entropy of the source should be as high as possible, since higher entropy means each message carries more information on average.

Consider a source that emits messages $m_1, m_2, \dots, m_M$ with probabilities $p_1, p_2, \dots, p_M$ respectively.

Suppose a long sequence of $L$ messages is generated. Then, on average:

$p_1 L$ messages of $m_1$ are transmitted,

$p_2 L$ messages of $m_2$ are transmitted,

...

$p_M L$ messages of $m_M$ are transmitted.

The information conveyed by a single message $m_1$ is

$I_1 = \log_2 \big[\frac{1}{p_1}\big]$ bits,

so the total information from the $p_1 L$ occurrences of $m_1$ is

$I_{1(total)} = p_1 L \log_2 \big[\frac{1}{p_1}\big]$

Similarly,

$I_{2(total)} = p_2 L \log_2 \big[\frac{1}{p_2}\big]$

and so on. The total information in the sequence is

$I_{total} = I_{1(total)} + I_{2(total)} + I_{3(total)} + \dots$

Substituting the values above,

$I_{total} = p_1 L \log_2 \big[\frac{1}{p_1}\big] + p_2 L \log_2 \big[\frac{1}{p_2}\big] + p_3 L \log_2 \big[\frac{1}{p_3}\big] + \dots$

The entropy is the average information per message, i.e., the total information divided by the number of messages $L$:

$\therefore H = \frac{I_{total}}{L} = p_1 \log_2 \big(\frac{1}{p_1}\big) + p_2 \log_2 \big(\frac{1}{p_2}\big) + \dots$

Entropy $= H = \sum_{k=1}^{M} p_k \log_2 \big(\frac{1}{p_k}\big)$ bits/message
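As a quick numerical check of this formula, here is a minimal Python sketch (the function name `entropy` and the example distributions are illustrative, not from the answer above) that computes $H$ for a list of message probabilities:

```python
import math

def entropy(probabilities):
    """Average information per message, H = sum over k of p_k * log2(1/p_k),
    in bits/message. Terms with p_k == 0 contribute nothing, since
    p * log2(1/p) -> 0 as p -> 0."""
    return sum(p * math.log2(1.0 / p) for p in probabilities if p > 0)

# Two equally likely messages (p1 = p2 = 0.5): H = 1 bit/message,
# the maximum possible for a two-message source.
print(entropy([0.5, 0.5]))   # 1.0

# A skewed source (p1 = 0.9, p2 = 0.1) carries less information
# per message on average.
print(entropy([0.9, 0.1]))   # ~0.469
```

The skewed source has lower entropy, which matches the point above: the average information per message is highest when the messages are equally likely.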
