
Accuracy vs. Precision

Our clients have come to expect that the machines we build for them will be both precise and accurate. But what does that mean? Can something be precise but not accurate? Or accurate but not precise? The short answer is yes.

Accuracy is defined as the degree of exactness: how close a measurement is to the true or accepted value. Precision is the degree of reproducibility: how closely repeated measurements agree with one another.

The most common analogy used to explain the difference between accuracy and precision is the series of targets shown below, with measurements depicted as arrows shot at the targets. Accuracy is the closeness of the arrows to the bullseye at the center of the target: the closer a system's measurements are to the accepted value, the more accurate the system is considered to be. Precision, on the other hand, is described by the size of the cluster when many arrows are shot. When all the arrows are tightly grouped, the cluster is considered precise, since they all hit close to the same spot, even if that spot is not the bullseye.
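The target analogy maps directly onto two familiar statistics: bias (average distance from the accepted value) quantifies accuracy, and standard deviation (scatter of the readings) quantifies precision. A minimal sketch, using hypothetical measurement values for illustration:

```python
import statistics

# Hypothetical example: five repeated measurements of a part whose
# true (accepted) length is 100.00 mm.
true_value = 100.00
measurements = [100.02, 99.98, 100.01, 99.99, 100.00]

# Accuracy: how close the average measurement is to the accepted value.
bias = statistics.mean(measurements) - true_value

# Precision: how tightly the measurements cluster (reproducibility),
# summarized here as the sample standard deviation.
spread = statistics.stdev(measurements)

print(f"bias (accuracy): {bias:+.3f} mm")
print(f"spread (precision): {spread:.3f} mm")
```

A small bias with a large spread corresponds to arrows scattered around the bullseye (accurate, not precise); a large bias with a small spread corresponds to a tight cluster off-center (precise, not accurate).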

The clustering of the arrows in the bullseye represents both accuracy and precision in measurements.

A measurement system is deemed valid if it is both accurate and precise. In general, a precise machine can be calibrated to make it accurate as well, because its consistent offset can be measured against a standard and corrected out. The reverse is not true: an accurate but imprecise machine scatters randomly, and no fixed correction can remove random scatter.
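The calibration point above can be sketched in a few lines. This is a simplified illustration with hypothetical gauge readings: a precise but inaccurate instrument reads consistently high, so measuring its offset against a reference standard and subtracting it yields readings that are accurate too.

```python
import statistics

# Hypothetical sketch: a precise but inaccurate gauge reads a 50.00 mm
# reference standard consistently high by a fixed offset.
reference = 50.00
raw_readings = [50.31, 50.29, 50.30, 50.32, 50.28]  # tight cluster, offset high

# Calibration: estimate the systematic offset and correct it out.
bias = statistics.mean(raw_readings) - reference
calibrated = [r - bias for r in raw_readings]

print(f"estimated bias: {bias:+.2f} mm")
print(f"calibrated mean: {statistics.mean(calibrated):.2f} mm")
```

Note that the correction shifts every reading by the same amount: the cluster moves onto the bullseye, but its size (the precision) is unchanged. That is why the trick works only one way.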