The most fundamental concept of information theory is entropy: the average amount of information per message. **The entropy of a random variable X is defined by:**

H(X) = -Σ_{x} p(x) log p(x)

H(X) ≥ 0: entropy is always non-negative, and H(X) = 0 if X is deterministic.
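The definition above translates directly into code. The sketch below is a minimal illustration (the function name `entropy` and the example distributions are my own, not from the text); it applies the convention 0 · log 0 = 0 by skipping zero-probability terms:

```python
import math

def entropy(probs, base=2):
    """Shannon entropy H(X) = -sum p(x) * log p(x), in the given base.
    Zero-probability terms contribute nothing (0 * log 0 := 0)."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A uniform distribution over 4 outcomes: H = log2(4) = 2 bits.
print(entropy([0.25, 0.25, 0.25, 0.25]))  # → 2.0
```

A deterministic variable, e.g. `entropy([1.0])`, gives 0, matching the property stated above.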

- Since H_{b}(X) = log_{b}(a) H_{a}(X), changing the base of the logarithm only rescales the entropy, so the base need not be specified. - The entropy is non-negative. It is zero exactly when the random variable is certain, i.e. perfectly predictable.
- Entropy is defined in terms of the probabilistic behavior of a source of information. In information theory, the source outputs are discrete random variables drawn from a fixed finite alphabet with certain probabilities. Entropy is the average information content per source symbol.
- Entropy (example): a binary memoryless source emits symbols 0 and 1 with probabilities p0 and p1 = 1 - p0. Compute the entropy as a function of p0.
- Entropy is measured in bits when the logarithm is taken to base 2.
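The binary-source example in the bullets above can be worked out directly: H(p0) = -p0 log2 p0 - (1 - p0) log2(1 - p0). A minimal sketch (the function name `binary_entropy` is my own):

```python
import math

def binary_entropy(p0):
    """Entropy of a binary memoryless source with P(0)=p0, P(1)=1-p0, in bits."""
    if p0 in (0.0, 1.0):
        return 0.0  # a certain symbol carries no information
    p1 = 1.0 - p0
    return -p0 * math.log2(p0) - p1 * math.log2(p1)

# Entropy peaks at 1 bit when both symbols are equally likely...
print(binary_entropy(0.5))              # → 1.0
# ...and falls toward 0 as the source becomes predictable.
print(round(binary_entropy(0.1), 3))    # → 0.469
```

Plotting H(p0) over [0, 1] gives the familiar symmetric curve with its maximum at p0 = 0.5.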

**There are two types of Entropy:**

- Joint Entropy
- Conditional Entropy

**Joint Entropy:**

Joint entropy is the entropy of a joint probability distribution, i.e. of a multivariate random variable. If X and Y are discrete random variables and f(x, y) is the value of their joint probability distribution at (x, y), then the joint entropy of X and Y is

H(X, Y) = -Σ_{x ∈ X} Σ_{y ∈ Y} f(x, y) log f(x, y)

The joint entropy represents the amount of information needed on average to specify the value of two discrete random variables.
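The formula above is the same sum as for ordinary entropy, just taken over pairs (x, y). A minimal sketch, with the joint distribution stored as a dict from pairs to probabilities (the representation and the function name `joint_entropy` are my own choices):

```python
import math

def joint_entropy(joint):
    """H(X, Y) = -sum over (x, y) of f(x, y) * log2 f(x, y), in bits.
    `joint` maps pairs (x, y) to their joint probability f(x, y)."""
    return -sum(p * math.log2(p) for p in joint.values() if p > 0)

# Two independent fair coin flips: H(X, Y) = H(X) + H(Y) = 2 bits.
joint = {(x, y): 0.25 for x in (0, 1) for y in (0, 1)}
print(joint_entropy(joint))  # → 2.0
```

For independent variables the joint entropy is the sum of the individual entropies; dependence between X and Y can only lower it.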

**Conditional Entropy:**

The average conditional self-information is called the conditional entropy. If X and Y are discrete random variables and f(x, y) and f(y | x) are the values of their joint and conditional probability distributions, then:

H(Y|X) = -Σ_{x ∈ X} Σ_{y ∈ Y} f(x, y) log f(y | x)

is the conditional entropy of Y given X.

- The conditional entropy indicates how much extra information you still need to supply, on average, to communicate Y when the other party already knows X.
- Conditional entropy is also called equivocation. It is the amount of information remaining in one random variable given that we already know the other.
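The definition can be evaluated from the joint distribution alone, since f(y | x) = f(x, y) / f(x) with f(x) obtained by summing the joint over y. A minimal sketch (the noisy-channel example distribution and the function name `conditional_entropy` are illustrative assumptions, not from the text):

```python
import math
from collections import defaultdict

def conditional_entropy(joint):
    """H(Y|X) = -sum f(x, y) * log2 f(y|x), where f(y|x) = f(x, y) / f(x).
    `joint` maps pairs (x, y) to their joint probability f(x, y)."""
    # Marginal f(x), obtained by summing the joint distribution over y.
    fx = defaultdict(float)
    for (x, _y), p in joint.items():
        fx[x] += p
    return -sum(p * math.log2(p / fx[x])
                for (x, _y), p in joint.items() if p > 0)

# Y is a noisy copy of X: given X, Y agrees with probability 0.9.
joint = {(0, 0): 0.45, (0, 1): 0.05, (1, 0): 0.05, (1, 1): 0.45}
print(round(conditional_entropy(joint), 3))  # → 0.469
```

This also makes the chain rule H(X, Y) = H(X) + H(Y|X) easy to check numerically: knowing X, the extra information needed for Y here is just the entropy of the 0.9/0.1 noise.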
