Precision vs. Accuracy 

Obtaining reliable measurements from 3D scanning is not as simple as it may appear. Several factors come into play, including the 3D scanner’s precision and accuracy. In this article, we’ll discuss how the two terms differ, what can affect them, and when and how to correct them when they fall short. 

What is the Difference Between Precision & Accuracy?

Precision and accuracy are two metrology terms that are often confused, because both describe the error in recorded measurements. Understanding the difference between the two is a crucial step toward obtaining the desired data from a 3D scan of your part. 

Precision describes how close a set of measurements taken by the same scanner and operator are to each other. If the measurements cluster around the same value, the results are considered precise. Essentially, this means that the measurement results are repeatable. 
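
To make this concrete, here is a minimal Python sketch (the measurement values are hypothetical, chosen only for illustration) that quantifies precision as the spread of repeated readings of the same feature:

    import statistics

    # Hypothetical repeated measurements (in inches) of the same feature,
    # taken with the same scanner and operator.
    measurements = [1.002, 1.003, 1.001, 1.002, 1.003]

    # Precision is the spread of the readings around their own mean:
    # a small sample standard deviation means the results are repeatable.
    spread = statistics.stdev(measurements)
    print(f"Sample standard deviation: {spread:.4f} in")

Note that this calculation never references the part’s true dimension; precision only concerns how tightly the readings agree with each other.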

Accuracy, on the other hand, refers to how close a measurement is to the true or accepted value of the quantity being measured. For example, if you were scanning a 1-inch block, the closer your scanner’s measurement is to 1.00”, the more accurate it would be considered. 
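
Continuing the 1-inch block example, a companion sketch (again with hypothetical values) quantifies accuracy as the mean error, or bias, relative to the true value:

    import statistics

    # Hypothetical measurements (in inches) of a 1-inch reference block.
    measurements = [1.002, 1.003, 1.001, 1.002, 1.003]
    true_value = 1.000

    # Accuracy concerns closeness to the true value: the mean error
    # (bias) captures the scanner's systematic offset from it.
    bias = statistics.mean(measurements) - true_value
    print(f"Mean error vs. true value: {bias:+.4f} in")

In this example the readings are tightly grouped (precise) but consistently about 0.002” high, so the scan is repeatable yet slightly inaccurate.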

It is important to note that while precision and accuracy are closely related terms, they are independent of each other. This is best explained through the classic example of darts on a dartboard, where the bullseye (center) represents the true value: 

  1. No Precision, No Accuracy: The darts are scattered around the board and are neither close to each other nor to the center of the target. 
  2. High Precision, No Accuracy: The darts are grouped together in one area of the board, but they are not close to the center of the target.