Question: What do you mean by entropy? Derive the equation for entropy.

Subject: Computer Engineering

Topic: Electronic Circuits and Communication Fundamentals

Difficulty: Medium / High

written 12 months ago by Sayali Bagwe

Entropy:

It is defined as the average information per message. It is denoted by H and its units are bits/message. Entropy should be as high as possible to ensure maximum transfer of information.

Consider a source that emits messages $m_1, m_2, \ldots, m_M$ with probabilities $P_1, P_2, \ldots, P_M$. Suppose a long sequence of $L$ messages is generated. Then:

$P_1 L$ messages of $m_1$ are transmitted

$P_2 L$ messages of $m_2$ are transmitted

$\vdots$

$P_M L$ messages of $m_M$ are transmitted

The information conveyed by one occurrence of message $m_1$ is

$I_1= \log_2 \big[\frac{1}{P_1}\big]$

Since $m_1$ occurs $P_1 L$ times in the sequence, the total information contributed by each message is
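The self-information formula above can be checked with a short Python sketch (the function name `self_information` is ours, not part of the original answer):

```python
import math

# Self-information of one message with probability p: I = log2(1/p) bits.
# Rarer messages carry more information.
def self_information(p):
    return math.log2(1.0 / p)

# A message with probability 1/2 carries 1 bit; probability 1/8 carries 3 bits.
print(self_information(0.5))    # 1.0
print(self_information(0.125))  # 3.0
```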

$I_{1(total)} =P_1 L \log_2⁡ \big[\frac{1}{P_1} \big]$

$I_{2(total)} =P_2 L \log_2⁡ \big[\frac{1}{P_2} \big]$

$I_{total}=I_{1(total)}+I_{2(total)}+I_{3(total)}+\cdots$

Substituting the values of $I_{1(total)}, I_{2(total)}, \ldots$:

$I_{total}=P_1 L \log_2 \big[\frac{1}{P_1}\big]+P_2 L \log_2 \big[\frac{1}{P_2}\big]+P_3 L \log_2 \big[\frac{1}{P_3}\big]+\cdots$

Dividing by the number of messages $L$ gives the average information per message:

$H=\frac{I_{total}}{L}=P_1 \log_2 \big(\frac{1}{P_1}\big)+P_2 \log_2 \big(\frac{1}{P_2}\big)+\cdots$

Entropy $= H = \sum_{k=1}^{M} P_k \log_2 \big(\frac{1}{P_k}\big)$ bits/message
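The final formula can be sketched in a few lines of Python (a minimal illustration; the function name `entropy` is ours):

```python
import math

# H = sum over k of P_k * log2(1/P_k), in bits/message.
# Terms with P_k = 0 contribute nothing and are skipped.
def entropy(probs):
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

# Four equally likely messages: H = log2(4) = 2 bits/message,
# the maximum for a four-message source.
print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0

# A skewed source conveys less information per message on average.
print(entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75
```

Note that the uniform distribution maximizes H, matching the remark above that entropy should be as high as possible for maximum information transfer.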
