**1 Answer**

The Nyquist criterion determines the minimum sampling rate. The Nyquist rate is defined as the minimum sampling rate required to represent complete information about a continuous signal $f(t)$ in its sampled form $f^*(t)$. Therefore, according to the sampling theorem, the Nyquist rate is

$$f_{s\min}=2f_m$$

The corresponding maximum sampling interval is

$$T_{s\max}=\dfrac{1}{f_{s\min}}=\dfrac{1}{2f_m}$$

This is called the Nyquist interval.
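As a quick numeric sketch of the two formulas above, assuming a hypothetical signal whose highest frequency component is 4 kHz:

```python
# Nyquist rate and Nyquist interval for an assumed band-limited signal.
f_m = 4000.0              # Hz, highest frequency component (assumed value)

f_s_min = 2 * f_m         # Nyquist rate: minimum sampling frequency
T_s_max = 1 / f_s_min     # Nyquist interval: maximum spacing between samples

print(f_s_min)            # 8000.0  (samples per second)
print(T_s_max)            # 0.000125  (seconds, i.e. 125 microseconds)
```

Any sampling frequency at or above `f_s_min` (equivalently, any sampling interval at or below `T_s_max`) satisfies the criterion for this signal.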

When $T_s = 1/(2f_m)$, this amounts to $2f_m$ samples per second. This is called the Nyquist rate of sampling, and $1/T_s = f_s = 2f_m$ is called the Nyquist frequency. In simple words, the signal must be sampled at least twice during each cycle of its highest-frequency component.

The minimum sampling frequency $f_s = 2f_m$ cannot be achieved in practice because of the difficulty of realizing ideal filters. Practically, we must use a sampling frequency more than twice the maximum frequency in the baseband waveform. How much more depends on the low-pass filter characteristics and on how faithfully the baseband waveform must be reproduced.
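The consequence of sampling below the Nyquist rate is aliasing: a high-frequency tone reappears as a lower, false frequency. A minimal sketch (the helper `alias_frequency` is illustrative, not a standard library function) computes the apparent frequency of a sampled tone by folding it about the sampling rate:

```python
def alias_frequency(f, f_s):
    """Apparent frequency of a tone at f Hz when sampled at f_s Hz.

    Folds f into the range [0, f_s/2]: frequencies above half the
    sampling rate are mirrored back down (aliased).
    """
    f_folded = f % f_s
    return min(f_folded, f_s - f_folded)

# A 3 kHz tone sampled at 8 kHz (above its Nyquist rate of 6 kHz)
# is recovered correctly:
print(alias_frequency(3000, 8000))   # 3000

# The same tone sampled at 5 kHz (below 6 kHz) aliases to 2 kHz:
print(alias_frequency(3000, 5000))   # 2000
```

This is why practical systems sample with a margin above $2f_m$ and place an anti-aliasing low-pass filter before the sampler.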