Question: Define entropy and explain types of entropy

Mumbai University > Information Technology > Sem 4 > Information Theory & Coding

Marks: 4 Marks

Year: May 2016

The most fundamental concept of information theory is entropy. Entropy is defined as the average amount of information per message. The entropy of a discrete random variable X is defined by

H(X) = -Σx∈X p(x) log p(x)

H(X) ≥ 0: entropy is always non-negative, and H(X) = 0 if X is deterministic.
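
As a quick numerical illustration of the formula, here is a minimal Python sketch (the alphabet size and the probabilities are made up for the example):

```python
import math

def entropy(probs, base=2):
    """Shannon entropy H(X) = -sum p(x) log p(x); terms with p(x) = 0 contribute 0."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# Hypothetical four-symbol source with the probabilities below
print(entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75 bits per symbol
```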

  • Since Hb(X) = logb(a) Ha(X), changing the base of the logarithm only rescales the entropy, so the base need not be fixed in the definition.
  • The entropy is non-negative. It is zero when the random variable is "certain", i.e. its value can be predicted with probability 1. (The thermodynamic counterpart of entropy is defined using the Clausius inequality.)
  • Entropy is defined in terms of the probabilistic behavior of a source of information. In information theory the source output is modelled as a discrete random variable taking values in a fixed finite alphabet with given probabilities. Entropy is the average information content per source symbol.
  • Entropy (example): a binary memoryless source emits the symbols 0 and 1 with probabilities p0 and p1 = 1 - p0. Its entropy as a function of p0 is H(X) = -p0 log2 p0 - (1 - p0) log2 (1 - p0); see the sketch after this list.
  • Entropy is measured in bits (the log is log2)
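
For the binary memoryless source mentioned above, a short Python sketch of the entropy as a function of p0 (the sample values of p0 are chosen only to show the shape of the curve):

```python
import math

def binary_entropy(p0):
    """H(p0) = -p0 log2 p0 - (1 - p0) log2 (1 - p0) for a binary memoryless source."""
    if p0 in (0.0, 1.0):        # a deterministic source carries no information
        return 0.0
    p1 = 1.0 - p0
    return -p0 * math.log2(p0) - p1 * math.log2(p1)

for p0 in (0.0, 0.1, 0.5, 0.9, 1.0):
    print(p0, round(binary_entropy(p0), 3))
# Maximum of 1 bit at p0 = 0.5; 0 bits when the output is certain.
```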

There are two types of Entropy:

  1. Joint Entropy
  2. Conditional Entropy

Joint Entropy:

Joint entropy is the entropy of a joint probability distribution, i.e. of a pair (or vector) of random variables. If X and Y are discrete random variables and f(x, y) is the value of their joint probability distribution at (x, y), then the joint entropy of X and Y is

H(X, Y) = -Σx∈X Σy∈Y f(x, y) log f(x, y)

The joint entropy represents the amount of information needed on average to specify the value of two discrete random variables.
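
A minimal sketch of this formula in Python, using a made-up joint distribution f(x, y) over two binary variables:

```python
import math

# Hypothetical joint distribution f(x, y) for X in {0, 1} and Y in {0, 1}
joint = {(0, 0): 0.5, (0, 1): 0.25, (1, 0): 0.125, (1, 1): 0.125}

# H(X, Y) = -sum over (x, y) of f(x, y) log2 f(x, y)
H_XY = -sum(p * math.log2(p) for p in joint.values() if p > 0)
print(H_XY)  # 1.75 bits
```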

Conditional Entropy:

The average conditional self-information is called the conditional entropy. If X and Y are discrete random variables and f(x, y) and f(y|x) are the values of their joint and conditional probability distributions, then

H(Y|X) = -Σx∈X Σy∈Y f(x, y) log f(y|x) is the conditional entropy of Y given X.

  • The conditional entropy indicates how much extra information you still need to supply on average to communicate Y, given that the other party already knows X.
  • Conditional entropy is also called equivocation; it is the amount of information remaining in one random variable when the other is already known. A small numerical check of these definitions is sketched below.
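
The sketch below reuses the hypothetical joint distribution from the joint-entropy example, computes H(Y|X) directly from the formula, and checks the chain rule H(X, Y) = H(X) + H(Y|X):

```python
import math

# Same hypothetical joint distribution f(x, y) as in the joint-entropy sketch
joint = {(0, 0): 0.5, (0, 1): 0.25, (1, 0): 0.125, (1, 1): 0.125}

# Marginal f(x) obtained by summing the joint distribution over y
fx = {}
for (x, y), p in joint.items():
    fx[x] = fx.get(x, 0.0) + p

# H(Y|X) = -sum f(x, y) log2 f(y|x), where f(y|x) = f(x, y) / f(x)
H_Y_given_X = -sum(p * math.log2(p / fx[x]) for (x, y), p in joint.items() if p > 0)

H_X = -sum(p * math.log2(p) for p in fx.values() if p > 0)
H_XY = -sum(p * math.log2(p) for p in joint.values() if p > 0)

print(H_Y_given_X)               # extra bits needed for Y once X is known
print(H_X + H_Y_given_X, H_XY)   # chain rule: H(X) + H(Y|X) = H(X, Y)
```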