List important specifications of ADC 0808

Specifications of an ADC

Analog Input Voltage Range: This is the maximum allowable input-voltage range over which the ADC operates properly. It is the difference between the smallest and largest analog input voltages that use the full range of digital outputs. Typical values are 0 to 10 V, 0 to 12 V, ±5 V, ±10 V, and ±12 V.

Input Impedance: The input impedance of an ADC varies from about 1 kΩ to 1 MΩ, depending on the type of ADC. The input capacitance is typically a few picofarads.

Quantization Error: The full-scale range of the analog input voltage is quantized into a finite number of steps for conversion. The error introduced by this quantization process is called quantization error, and it is generally specified as ±½ LSB.
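
As a rough illustration of the ±½ LSB figure, the sketch below quantizes one sample with an ideal rounding quantizer. It assumes an 8-bit converter like the ADC 0808; the 5 V reference and the 2.137 V input are made-up values chosen only for the example, not datasheet numbers.

```c
#include <math.h>
#include <stdio.h>

/* Quantization-error sketch for an assumed 8-bit, 5 V full-scale ADC. */
int main(void)
{
    const double vref = 5.0;                        /* assumed reference voltage   */
    const int    bits = 8;                          /* ADC 0808 is an 8-bit device */
    const double lsb  = vref / ((1 << bits) - 1);   /* one step, about 19.6 mV     */

    double vin  = 2.137;                            /* assumed analog input sample */
    int    code = (int)round(vin / lsb);            /* ideal rounding quantizer    */
    if (code > (1 << bits) - 1) code = (1 << bits) - 1;
    if (code < 0) code = 0;

    double vq    = code * lsb;                      /* voltage the code represents */
    double error = vin - vq;                        /* quantization error          */

    printf("code = %d, quantized = %.4f V, error = %+.4f V (bounded by +/-%.4f V = 1/2 LSB)\n",
           code, vq, error, lsb / 2.0);
    return 0;
}
```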

Accuracy: The accuracy of an ADC depends on quantization error, digital system noise, gain error, offset error, deviation from linearity, and so on. It is determined from the sum of all these errors. Typical values of accuracy are ±0.001%, ±0.01%, ±0.02%, and ±0.04% of the full-scale value.
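
The small sketch below simply adds hypothetical error contributions to get a worst-case accuracy figure, as described above. The individual percentages are assumptions for illustration, not ADC 0808 datasheet values.

```c
#include <stdio.h>

/* Worst-case accuracy as the sum of individual error sources (all % of full scale).
   Every figure below is a hypothetical example value. */
int main(void)
{
    double quantization = 0.20;   /* about 1/2 LSB of an 8-bit ADC, ~0.2% FS */
    double offset       = 0.05;   /* assumed offset error                    */
    double gain         = 0.05;   /* assumed gain error                      */
    double linearity    = 0.10;   /* assumed non-linearity                   */

    double total = quantization + offset + gain + linearity;
    printf("worst-case accuracy: +/- %.2f %% of full scale\n", total);
    return 0;
}
```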

Resolution: The resolution is the ratio of the reference voltage to the number of output states; in other words, it is the smallest change in analog input voltage corresponding to one LSB. Resolution = Reference voltage / (2^N − 1), where N is the number of bits of the ADC.
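
The following sketch evaluates that formula for N = 8 (the ADC 0808 word length); the 5 V reference is an assumed value for illustration.

```c
#include <stdio.h>

/* Resolution = Vref / (2^N - 1), as given above. */
int main(void)
{
    const double vref = 5.0;                         /* assumed reference voltage */
    const int    n    = 8;                           /* number of bits            */
    double resolution = vref / ((1 << n) - 1);       /* ~19.6 mV per LSB          */

    printf("resolution = %.4f V (%.1f mV) per LSB\n", resolution, resolution * 1e3);
    return 0;
}
```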

Conversion Time: The conversion time of a medium-speed ADC is about 50 µs, while a high-speed ADC converts in a few ns. Conversion time therefore ranges from about 50 µs down to a few ns as we move from slow/medium-speed to high-speed ADCs.

Format of Digital Output: An ADC generally uses a standard output code such as unipolar binary, bipolar binary, offset binary, one's complement, or two's complement.
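
The sketch below illustrates both points with assumed numbers: the 50 µs medium-speed conversion time quoted above bounds the sampling rate, and an offset-binary output code can be mapped to two's complement (one of the standard formats listed) by inverting the MSB. The example code 0x40 is arbitrary.

```c
#include <stdint.h>
#include <stdio.h>

int main(void)
{
    /* (1) Maximum throughput for an assumed 50 us conversion time. */
    double t_conv  = 50e-6;
    double max_sps = 1.0 / t_conv;                 /* = 20,000 samples/s */
    printf("max sampling rate ~= %.0f samples/s\n", max_sps);

    /* (2) Offset binary -> two's complement for an 8-bit bipolar ADC:
       inverting the MSB maps mid-scale 0x80 (0 V) to code 0. */
    uint8_t offset_binary   = 0x40;                /* assumed example code */
    int8_t  twos_complement = (int8_t)(offset_binary ^ 0x80);
    printf("offset binary 0x%02X -> two's complement %d\n",
           offset_binary, twos_complement);
    return 0;
}
```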

Temperature Stability: The accuracy of an A/D converter also depends on temperature variation. A typical temperature coefficient of error is 30 ppm/°C.
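
To put the 30 ppm/°C figure in perspective, the sketch below converts it into a voltage drift. The 5 V full scale and the 45 °C temperature rise are assumed values for illustration only.

```c
#include <stdio.h>

/* Drift implied by a 30 ppm/C temperature coefficient over an assumed
   temperature rise and full-scale voltage. */
int main(void)
{
    double tempco_ppm = 30.0;    /* ppm of full scale per degree C */
    double full_scale = 5.0;     /* assumed full-scale voltage     */
    double delta_t    = 45.0;    /* assumed temperature rise in C  */

    double drift = tempco_ppm * 1e-6 * delta_t * full_scale;   /* ~6.75 mV */
    printf("worst-case drift over %.0f C: %.2f mV\n", delta_t, drift * 1e3);
    return 0;
}
```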
