Question: Prove that the entropy of extremely likely and extremely unlikely message is zero.

Subject: Computer Engineering

Topic: Electronic Circuits and Communication Fundamentals

Difficulty: Medium / High

modified 10 months ago • written 10 months ago by Sayali Bagwe

In the case of the “extremely likely” message, there is only a single possible message to be transmitted, so its probability is $p_k = 1$. The entropy of this most likely message is:

$H = p_k \log_2 \Big(\frac{1}{p_k}\Big) = 1 \cdot \log_2(1) = \frac{\log_{10} 1}{\log_{10} 2} = 0$

For an extremely unlikely message $k$, its probability $p_k \to 0$. Here $\log_2(1/p_k) \to \infty$, but $p_k$ approaches zero faster than the logarithm grows, so the product tends to zero:

$\therefore H = \lim_{p_k \to 0} p_k \log_2 \Big(\frac{1}{p_k}\Big) = 0$

Thus the average information, or entropy, of both the most likely and the most unlikely message is zero.
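As a quick numerical check, the entropy term $p_k \log_2(1/p_k)$ can be evaluated directly: it is exactly zero at $p_k = 1$ and shrinks toward zero as $p_k \to 0$. This is a small illustrative sketch (the function name `entropy_term` is chosen here for illustration, not taken from any textbook):

```python
import math

def entropy_term(p):
    """Entropy contribution p * log2(1/p) of a message with probability p.
    By convention the term is 0 at p = 0, matching the limit value."""
    if p == 0:
        return 0.0
    return p * math.log2(1.0 / p)

# Extremely likely message: p = 1 gives exactly zero entropy.
print(entropy_term(1.0))  # 0.0

# Extremely unlikely message: the term vanishes as p -> 0.
for p in (1e-1, 1e-3, 1e-6, 1e-9):
    print(p, entropy_term(p))
```

The printed values fall rapidly (about 0.332, 0.00997, 2.0e-5, 3.0e-8), confirming that the logarithm's growth is overwhelmed by the vanishing probability.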
