Thinking about Resolution and Accuracy

For sensors, resolution corresponds to sensitivity: the smallest change in the measured quantity that the instrument can distinguish (the discrimination threshold). Accuracy, by contrast, expresses how close a reading is to the true value, which in the field of measurement is distinct from resolution. This section explains the difference between resolution and accuracy, which are often confused, and how to determine each.

First clarify what you require of the accuracy: readability at the minimum display digit (the smallest digit you want to read) or closeness to the absolute value. Then determine the corresponding specification, namely the display resolution for the former or the absolute accuracy for the latter, against the required measurement accuracy. Although A/D resolution is often treated as if it were the accuracy, note that unless the accuracy is specified in digits or in LSB (least significant bits), the A/D resolution serves only as a reference.

Generally, there is a trade-off between sampling speed and resolution.

DSOs (digital storage oscilloscopes) typically have 8-bit resolution (256 divisions). On a ±10 V measurement range (20 V at full scale), the resolution is 20 V / 256 ≈ 78 mV. Waveform recording systems typically have 12- to 14-bit resolution (4,096 to 16,384 divisions); on the same ±10 V range, a 14-bit instrument resolves 20 V / 16384 ≈ 1.2 mV. The coarser the resolution, the less finely the instrument can measure, so select an instrument according to your measurement target.
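The arithmetic above can be sketched as a small helper. This is an illustrative calculation only; the function name lsb_resolution is an assumption, not part of any instrument's API:

```python
def lsb_resolution(full_scale_v: float, bits: int) -> float:
    """Smallest voltage step (1 LSB): full-scale span divided by 2**bits divisions."""
    return full_scale_v / (2 ** bits)

# 8-bit DSO on a ±10 V range (20 V full scale)
print(f"8-bit:  {lsb_resolution(20.0, 8) * 1e3:.1f} mV per step")   # ~78.1 mV

# 14-bit waveform recorder on the same range
print(f"14-bit: {lsb_resolution(20.0, 14) * 1e3:.2f} mV per step")  # ~1.22 mV
```

The same helper makes it easy to compare candidate instruments: for a given range, each additional bit of A/D resolution halves the smallest step the instrument can represent.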