Independent Component Analysis
1 Answer

Independent component analysis (ICA) aims to solve the problem of separating signals from their linear mixture. ICA is a special case of blind source separation, in which the separation is performed without the aid of information (or with very little information) about the source signals or the mixing process. Although the blind source separation problem is underdetermined in general, a useful solution can be obtained under certain assumptions.

The ICA model assumes that there are $n$ independent source signals $s_{i}(t),\; i = 1, 2, \dots, n$, and some mixing matrix $A$:

$A = \begin{bmatrix} a_{1,1} & a_{1,2} & \dots & a_{1,n} \\ a_{2,1} & a_{2,2} & \dots & a_{2,n} \\ \vdots & \vdots & \ddots & \vdots \\ a_{n,1} & a_{n,2} & \dots & a_{n,n} \end{bmatrix}$

We can observe only the signals $x_{i}(t)$, which are linear superpositions of the $s_{j}(t)$:

$x_{i}(t) = \sum_{j=1}^{n} a_{i,j}\, s_{j}(t)$
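
As a concrete illustration, here is a minimal numerical sketch of this mixing model. The particular sources (a sinusoid and a square wave), the $2\times 2$ mixing matrix, and all variable names are illustrative assumptions, not part of the model statement above.

```python
import numpy as np

t = np.linspace(0, 8, 2000)

# Two independent, non-Gaussian source signals s_j(t) (illustrative choices)
s1 = np.sin(2 * t)                  # sinusoid
s2 = np.sign(np.sin(3 * t))         # square wave
S = np.c_[s1, s2]                   # shape (n_samples, n_sources)

# Arbitrary (illustrative) mixing matrix A
A = np.array([[1.0, 0.5],
              [0.7, 2.0]])

# Observed mixtures: x_i(t) = sum_j a_{i,j} * s_j(t)
X = S @ A.T                         # shape (n_samples, n_mixtures)
```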

The main difficulty lies in the fact that both $a_{i,j}$ and $s_{j}(t)$ are unknown. To solve the problem, ICA relies on the assumptions that the signals in the mixture are statistically independent and have non-Gaussian probability distributions.
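
For a rough idea of how such a mixture can be unmixed in practice, the sketch below (continuing from the mixtures `X` constructed in the previous sketch) applies scikit-learn's FastICA, used here purely as one common example of an ICA implementation; the recovered signals match the original sources only up to permutation and scaling.

```python
from sklearn.decomposition import FastICA

# Estimate the independent components from the observed mixtures X
ica = FastICA(n_components=2, random_state=0)
S_est = ica.fit_transform(X)   # estimated sources (up to permutation and scale)
A_est = ica.mixing_            # estimated mixing matrix
```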

In general, two random variables $y_{i}$ and $y_{j}$ are statistically independent if information about one variable says nothing about the other. Mathematically, statistical independence means that the two-dimensional (joint) probability density function $p(y_{i}, y_{j})$ is the product of the one-dimensional (marginal) probability density functions:

$p(y_{i}, y_{j}) = p(y_{i})\, p(y_{j})$
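
A quick numerical sanity check of this factorization (stated below for event probabilities, which follow from the density factorization) is sketched here; the uniform distributions and the thresholds 0.3 and 0.7 are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
y_i = rng.uniform(size=200_000)
y_j = rng.uniform(size=200_000)   # drawn independently of y_i

# For independent variables, P(y_i < a, y_j < b) = P(y_i < a) * P(y_j < b)
joint   = np.mean((y_i < 0.3) & (y_j < 0.7))
product = np.mean(y_i < 0.3) * np.mean(y_j < 0.7)
print(joint, product)             # both approximately 0.21
```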

For statistically independent signals, the covariance matrix of odd functions $f(y_{i})$ and $g(y_{j})$ is diagonal: all mutual covariances (for $i \neq j$) are zero:

$E[f(y_{i})\, g(y_{j})] - E[f(y_{i})]\, E[g(y_{j})] = 0$

while all self covariances (for $i = j$) are non-zero:

$E[f(y_{i})\, g(y_{i})] - E[f(y_{i})]\, E[g(y_{i})] \neq 0$
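
The sketch below illustrates this criterion empirically; the odd nonlinear functions ($\tanh$ and a cube) and the Laplace-distributed sample signals are illustrative assumptions. For independent signals the cross term comes out close to zero, while the "self" term (the same signal fed to both functions) does not.

```python
import numpy as np

rng = np.random.default_rng(0)
y_i = rng.laplace(size=200_000)   # independent, non-Gaussian samples
y_j = rng.laplace(size=200_000)

def nl_cov(a, b, f=np.tanh, g=lambda u: u**3):
    """Covariance of f(a) and g(b): E[f(a)g(b)] - E[f(a)]E[g(b)]."""
    return np.mean(f(a) * g(b)) - np.mean(f(a)) * np.mean(g(b))

print(nl_cov(y_i, y_j))   # mutual covariance: approximately 0
print(nl_cov(y_i, y_i))   # self covariance: clearly non-zero
```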
