Accuracy vs. Precision

For measuring devices, accuracy is determined and limited by the precision with which physical markings can be created and reproduced, as well as read and aligned to the corresponding object to be measured.

The maximum achievable reading precision is ± half of the smallest division of the scale.
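
That half-division rule can be sketched in a few lines of Python; the function name and the millimeter ruler are illustrative assumptions, not from the original:

```python
def reading_uncertainty(smallest_division):
    """Maximum achievable precision of a scale reading:
    plus/minus half of the smallest division."""
    return smallest_division / 2

# Example: a ruler graduated in 1 mm divisions
print(f"The reading is good to ±{reading_uncertainty(1.0)} mm")  # ±0.5 mm
```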

In the fields of science and engineering, the accuracy of a measurement system is the degree of closeness of measurements of a quantity to that quantity's true value. The precision of a measurement system, related to reproducibility and repeatability, is the degree to which repeated measurements under unchanged conditions show the same results.

Accuracy aims for a measurement as close as possible to a known value; it is the degree to which a given quantity is correct and free from error.

Precision aims for multiple measurements to be as close to each other as possible; it is reflected in the number of digits used to express a given measurement.

"Precision is measured with respect to detail and accuracy is measured with respect to reality."

But:

"Accuracy is limited by the precision with which physical markings can be drawn, reproduced, viewed, and aligned."

And, precision is independent of accuracy:

"For example, if on average, your measurements for a given substance are close to the known value, but the measurements are far from each other, then you have accuracy without precision."[1]
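
This can be illustrated numerically. In the sketch below, the readings are invented for the example: their average hits the known value (accuracy) while the individual values scatter (lack of precision); the function names are made up:

```python
from statistics import mean, stdev

def accuracy_error(measurements, true_value):
    # Accuracy: closeness of the average measurement to the known value.
    return abs(mean(measurements) - true_value)

def precision_spread(measurements):
    # Precision: closeness of repeated measurements to each other.
    return stdev(measurements)

readings = [9.8, 10.2, 9.7, 10.3]  # hypothetical repeated measurements
print(accuracy_error(readings, 10.0))  # near zero: accurate on average
print(precision_spread(readings))      # noticeable spread: not precise
```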

While the accuracy of a number is given by the number of significant digits to the right of the decimal point, the precision is the total number of significant digits.[2]
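
In that numeric-representation sense, both counts can be read off a decimal string. A minimal sketch, assuming a plain decimal without an exponent (the helper name is hypothetical):

```python
def precision_and_accuracy(number_str):
    """Precision: total count of significant digits.
    Accuracy: count of digits to the right of the decimal point.
    Assumes a plain decimal string, e.g. "12.345"."""
    digits = number_str.replace("-", "").replace(".", "")
    precision = len(digits.lstrip("0"))  # leading zeros are not significant
    accuracy = len(number_str.split(".")[1]) if "." in number_str else 0
    return precision, accuracy

print(precision_and_accuracy("12.345"))  # (5, 3)
```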

Significant Digits

The number of significant digits (or significant figures) is the number of digits needed to express the number to within the uncertainty of calculation. For example, if a quantity is known to be 1.234 ± 0.001, four figures would be significant.[3]
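
One rough way to compute that count from a value and its uncertainty is the order of magnitude of their ratio; this is a back-of-the-envelope sketch, not a full significant-figures algorithm:

```python
import math

def significant_figures(value, uncertainty):
    """Approximate number of digits needed to express `value`
    to within `uncertainty`: order of magnitude of their ratio."""
    return math.floor(math.log10(abs(value) / uncertainty)) + 1

print(significant_figures(1.234, 0.001))  # 4, as in the example above
```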



References:

  1. Accuracy and Precision 
  2. Accuracy 
  3. Significant Digits 
