Understanding the Difference Between Resolution and Accuracy
What is Resolution?
Humans use their eyes to observe objects in the surrounding environment. With 20/20 vision, the human eye can detect, or "resolve," an object approximately 0.1mm in size (about the thickness of a single sheet of paper). In the world of electronic equipment, the term "resolution" refers to the finest detail that a device's sensors can resolve. Just as humans use their eyes as sensors, instruments such as an AC power source contain their own sensing circuitry to measure relevant quantities.
Let's use an RMS ammeter as an example. An AC power source that can measure RMS current down to 0.001A has a current resolution of 1mA; this is the smallest amount of current that the unit can resolve or "see." Human eyes detect light and send a signal to the brain, which interprets an image. Similarly, the AC power source contains a sensing resistor to detect the current and sends that information to a microprocessor so the current value can be displayed to the user. However, just as a person's vision can blur and mistake one object for another, every electronic sensor carries a certain amount of error in its ability to detect a value. This error determines the accuracy of the sensor.
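To make the idea concrete, the short Python sketch below illustrates how a 1mA-resolution meter effectively quantizes a raw sensor value to its smallest step. The function and values are purely illustrative and are not taken from any instrument's firmware.

```python
# Illustrative sketch: a meter cannot display finer detail than its resolution,
# so a raw sensor value is effectively quantized to the nearest resolution step.
def quantize(value, resolution):
    """Round a raw sensor value to the meter's resolution step."""
    return round(value / resolution) * resolution

raw_current = 1.23456                      # hypothetical raw current from the sense resistor, in amps
displayed = quantize(raw_current, 0.001)   # 1mA resolution
print(f"{displayed:.3f} A")                # -> 1.235 A
```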
Determining Accuracy
Determining the accuracy of a measurement from a specification is simply a matter of performing a few calculations. An important point to remember in any type of calculation is that numbers within parentheses are calculated first. In most specifications, the number within parentheses refers to a percentage of a setting, reading, or range. Next, a multiple of the resolution is added to the percentage term to account for additional measurement uncertainty. These multiples are often referred to as "counts" of resolution; counts are generally included in accuracy specs to account for errors at the low end of a measurement range. If the resolution of a current meter is 1mA and the accuracy specifies 5 counts, then 5 counts = 5 * 1mA = 5mA. The calculated value is then added to the setting, reading, or range to get a ceiling for the displayed value, and subtracted from it to get a floor. The floor and ceiling provide the range of actual values that correspond to the displayed value.
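As a sketch of this procedure, the calculation for a spec of the form ±(percent of reading + counts) can be written in a few lines of Python. The helper name and signature are assumptions made for illustration, not part of any instrument's software:

```python
# Sketch of the accuracy calculation for a spec of the form
# ±(percent of reading + counts of resolution).
def accuracy_bounds(reading, percent, counts, resolution):
    """Return (floor, ceiling) for a displayed reading.

    percent    -- percentage term from the spec, e.g. 1 for 1%
    counts     -- number of counts of resolution from the spec
    resolution -- meter resolution, in the same units as the reading
    """
    error = reading * (percent / 100.0) + counts * resolution
    return reading - error, reading + error
```

The example in the next section works through the same formula by hand for a 277V reading.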
An Example
A typical specification for an AC power source's voltage meter is given below:
| Voltage | Specification |
| --- | --- |
| Range | 0.0-400V |
| Resolution | 0.1V |
| Accuracy | ±(1% of reading + 2 counts) |
The user has set the voltage to 277V and the source is indicating an output of 277V on the display. Following the procedure outlined above:
- Calculate the percentage of reading. The accuracy spec indicates 1% of reading: 277V * 0.01 = 2.77V. At this point the accuracy can be expressed as ±(2.77V + 2 counts)
- Calculate the multiple of the resolution and add it to the percentage term: 2 counts = 2 * 0.1V = 0.2V. Thus accuracy = ±(2.77V + 0.2V) = ±2.97V
- The value is first added to the base reading of 277V to get the high end (ceiling) of the range: 277V + 2.97V = 279.97V
- The value is then subtracted from the base reading of 277V to get the low end (floor) of the range: 277V - 2.97V = 274.03V
- This means that if the display is registering an output of 277V, the actual output could be anywhere from 274.03V to 279.97V (see the sketch below).
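The same arithmetic can be checked in a few lines of Python; the values below are taken directly from the worked example above:

```python
# Reproducing the worked example: 1% of reading + 2 counts of 0.1V at 277V.
reading = 277.0                   # displayed voltage, V
error = reading * 0.01 + 2 * 0.1  # 2.77V + 0.2V = 2.97V
print(f"floor = {reading - error:.2f} V, ceiling = {reading + error:.2f} V")
# floor = 274.03 V, ceiling = 279.97 V
```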
If a laboratory testing procedure requires the application of 150V, the user may need to account for any dip in voltage. At 150V, the accuracy with the above specification would be ±1.7V, so the user may want to set the voltage on the source to 151.7V so that even at the floor of the accuracy range the output does not fall below the required 150V.
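A minimal sketch of that compensation, using the article's figure of ±1.7V at 150V (the variable names are purely illustrative):

```python
# Compensating the setpoint so the worst-case floor still meets the 150V requirement.
required = 150.0                   # voltage the test procedure calls for, V
error = required * 0.01 + 2 * 0.1  # ±1.7V per the spec above
setpoint = required + error
print(f"set the source to {setpoint:.1f} V")  # -> 151.7 V
```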
Conclusion
The specified accuracy of a piece of electronic equipment tells users whether a device is sufficient for their particular needs. Accuracy specifications are carefully reviewed by end users such as engineers and test developers because they indicate how close a measured value is to the actual value. This information allows the user to account for any error that could occur in a system and thus obtain the best possible results.