Accuracy and precision
In the fields of science, engineering, industry and statistics, the words accuracy and precision have special meanings. The accuracy of a measurement system is the degree of closeness of measurements of a quantity to that quantity's actual (true) value. In contrast, the precision of a measurement system (also called "reproducibility" or "repeatability") is the degree to which repeated measurements under unchanged conditions show the same results. Although the two words can be synonymous in colloquial use, they are deliberately contrasted in the context of the scientific method. The word "precision" is also used for how finely a measurement is made, that is, the resolution of the measurement, such as to the nearest meter, centimeter, or millimeter.
A measurement system can be accurate but not precise, precise but not accurate, neither, or both. For example, if an experiment contains a systematic error, then increasing the sample size generally increases precision but does not improve accuracy. The end result would be a consistent, yet inaccurate, set of results from the flawed experiment. Eliminating the systematic error improves accuracy but does not change precision.
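This difference can be shown with a small simulation. In the sketch below, the true value, the systematic bias, and the noise level are all made-up numbers chosen for illustration: the instrument is precise (readings cluster tightly) but not accurate (the cluster is shifted away from the true value), and taking more readings does not remove the bias.

```python
import random
import statistics

random.seed(42)

TRUE_VALUE = 10.0   # the quantity's actual value (hypothetical)
BIAS = 0.5          # a systematic error shifting every reading (hypothetical)
NOISE_SD = 0.2      # random measurement noise (hypothetical)

def measure(n):
    """Simulate n readings from a biased but repeatable instrument."""
    return [TRUE_VALUE + BIAS + random.gauss(0, NOISE_SD) for _ in range(n)]

readings = measure(1000)
mean = statistics.fmean(readings)
spread = statistics.stdev(readings)        # precision: spread of repeated readings
accuracy_error = abs(mean - TRUE_VALUE)    # accuracy: closeness of mean to true value

print(f"mean = {mean:.3f}, spread = {spread:.3f}, error = {accuracy_error:.3f}")
```

With a large sample, the spread (precision) stays small and stable, while the error of the mean (accuracy) stays near the bias of 0.5 no matter how many readings are averaged.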
A measurement system is considered valid if it is both accurate and precise. Related terms include bias (non-random or directed effects caused by a factor or factors unrelated to the independent variable) and error (random variability).
The terminology is also applied to indirect measurements—that is, values obtained by a computational procedure from observed data.
In addition to accuracy and precision, measurements may also have a measurement resolution, which is the smallest change in the underlying physical quantity that produces a response in the measurement.
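As a rough sketch of this idea, an instrument with a finite resolution can only report values in steps of that size; the helper below (a hypothetical example, not a standard function) rounds a reading to the nearest reportable step.

```python
def quantize(value, resolution):
    """Round a reading to the instrument's smallest reportable step."""
    return round(value / resolution) * resolution

# A ruler marked in 0.01 m steps cannot respond to smaller changes:
print(quantize(3.14159, 0.01))
print(quantize(7.4, 0.5))
```

Any change in the underlying quantity smaller than the resolution produces no change in the reported value.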
In the case of full reproducibility, such as when rounding a number to the nearest representable floating-point value, the word "precision" has a meaning not related to reproducibility. For example, in the IEEE 754-2008 standard it means the number of bits in the significand (the number of digits in the stored amount), so it is used as a measure of the relative accuracy with which a number can be represented.
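This sense of precision can be inspected directly in Python, whose floats are IEEE 754 binary64 values with a 53-bit significand (about 15 to 17 significant decimal digits):

```python
import sys
from decimal import Decimal

# Number of significand bits in a Python float (IEEE 754 binary64).
print(sys.float_info.mant_dig)   # 53

# Rounding to the nearest representable float loses information beyond
# that precision: 0.1 cannot be stored exactly, and Decimal reveals
# the binary value that is actually stored.
print(Decimal(0.1))
```

Every decimal literal is rounded to the nearest representable float, so the stored value is fully reproducible but only precise to 53 bits.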
Related pages
- JCGM 200:2008 International vocabulary of metrology — Basic and general concepts and associated terms (VIM)
- John Robert Taylor (1999). An Introduction to Error Analysis: The Study of Uncertainties in Physical Measurements. University Science Books. pp. 128–129. ISBN 0-935702-75-X. http://books.google.com/books?id=giFQcZub80oC&pg=PA128.
Other websites
- BIPM - Guides in metrology - Guide to the Expression of Uncertainty in Measurement (GUM) and International Vocabulary of Metrology (VIM)
- "Beyond NIST Traceability: What really creates accuracy" - Controlled Environments magazine
- Precision and Accuracy with Three Psychophysical Methods
- Guidelines for Evaluating and Expressing the Uncertainty of NIST Measurement Results, Appendix D.1: Terminology
- Accuracy and Precision