**1 Answer**

The important specifications of a converter, as generally given by manufacturers, are as follows:

## 1) Resolution :

The resolution of a converter is the smallest change in voltage that can be produced at its output. For example, an 8-bit D/A converter has $2^8 - 1 = 255$ equal intervals, so the smallest change in output voltage is $(1/255)$ of the full-scale output range.

Resolution should be as high as possible. It depends on the number of bits in the digital input applied to the DAC: the higher the number of bits, the higher the resolution.

It can also be defined as the change in analog output voltage resulting from a change of 1 LSB at the digital input. For an n-bit DAC, $$Resolution = \frac{V_{FS}}{2^n - 1}$$
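The resolution formula can be checked with a short sketch; the function name and the 5 V full-scale value are illustrative assumptions, not from the source.

```python
def dac_resolution(v_fs, n_bits):
    """Smallest output step of an n-bit DAC with full-scale voltage v_fs,
    per the formula V_FS / (2^n - 1)."""
    return v_fs / (2 ** n_bits - 1)

# An 8-bit DAC with an assumed 5 V full-scale range:
step = dac_resolution(5.0, 8)
print(round(step * 1000, 2))  # step in millivolts → 19.61
```

So each 1-LSB increment at the input of this hypothetical 8-bit, 5 V DAC moves the output by roughly 19.6 mV.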

## 2) Linearity :

The relation between the digital input and the analog output should be linear. In practice, however, it deviates from linearity because of errors in the values of the resistors used in the resistive networks.
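One way to quantify this deviation is to compare measured outputs against the ideal straight line and express the worst-case error in LSB; the function below is a minimal sketch, with the 2-bit example values assumed for illustration.

```python
def max_linearity_error_lsb(measured, v_fs, n_bits):
    """Worst-case deviation (in LSB) of measured DAC outputs from the
    ideal straight line through equally spaced code steps."""
    lsb = v_fs / (2 ** n_bits - 1)
    ideal = [code * lsb for code in range(len(measured))]
    return max(abs(m - i) for m, i in zip(measured, ideal)) / lsb

# Assumed 2-bit DAC, 3 V full scale (LSB = 1 V), with small resistor errors:
print(max_linearity_error_lsb([0.0, 1.1, 1.9, 3.0], 3.0, 2))  # → 0.1 LSB
```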

## 3) Accuracy :

- Absolute accuracy is the maximum deviation between the actual converter output and the ideal converter output.
- Relative accuracy is the maximum deviation after gain and offset errors have been removed.
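The distinction between the two accuracy figures can be sketched as follows; the endpoint-fit method used to remove gain and offset errors is one common choice, assumed here for illustration.

```python
def absolute_accuracy(measured, ideal):
    """Maximum deviation between actual and ideal converter outputs."""
    return max(abs(m - i) for m, i in zip(measured, ideal))

def relative_accuracy(measured, ideal):
    """Maximum deviation after offset and gain errors are removed,
    using an endpoint fit (first and last points matched to the ideal line)."""
    gain = (measured[-1] - measured[0]) / (ideal[-1] - ideal[0])
    corrected = [ideal[0] + (m - measured[0]) / gain for m in measured]
    return max(abs(c - i) for c, i in zip(corrected, ideal))

# Assumed transfer points: a 0.1 V offset plus a 5 % gain error, but no
# shape error, so the relative accuracy is ideal while absolute is not.
ideal = [0.0, 1.0, 2.0, 3.0]
measured = [0.1, 1.15, 2.2, 3.25]
print(absolute_accuracy(measured, ideal))  # → 0.25
```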

## 4) Settling time :

- Settling time is the time the output takes to settle within a specified band, ±(1/2) LSB of its final value, after a change in the digital input.
- It should be as small as possible.
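Given a sampled output waveform, the settling time can be read off as the first instant after which every sample stays inside the ±(1/2) LSB band; the helper and sample data below are illustrative assumptions.

```python
def settling_time(times, samples, lsb):
    """First time after which the output remains within ±0.5 LSB of its
    final value. Assumes the waveform has settled by the last sample."""
    final = samples[-1]
    band = 0.5 * lsb
    last_outside = -1  # index of the last sample outside the band
    for i, v in enumerate(samples):
        if abs(v - final) > band:
            last_outside = i
    return times[last_outside + 1]

# Assumed step response ringing around a 1.0 V final value (LSB = 0.1 V):
t = settling_time([0, 1, 2, 3, 4], [0.0, 0.9, 1.2, 1.02, 1.0], 0.1)
print(t)  # → 3
```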

## 5) Monotonicity :

A monotonic DAC is one whose analog output increases (or at least does not decrease) for every increase in digital input. A monotonic characteristic is essential in control applications; otherwise oscillations can result. For a DAC to be monotonic, the error should be less than ±(1/2) LSB at each output level.
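Monotonicity is straightforward to verify from a measured transfer curve: every output step must be non-negative. A minimal check, with assumed sample data:

```python
def is_monotonic(outputs):
    """True if the DAC output never decreases as the digital code increases."""
    return all(b >= a for a, b in zip(outputs, outputs[1:]))

print(is_monotonic([0.0, 1.0, 1.9, 3.1]))  # → True
print(is_monotonic([0.0, 1.0, 0.9, 3.0]))  # → False (code 2 dips below code 1)
```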

## 6) Stability :

The performance of a converter changes with temperature, age, and power-supply variations. All relevant parameters, such as offset, gain, linearity error, and monotonicity, must therefore be specified over the full temperature and power-supply ranges.