Accuracy is nice, but many operators, like Carl, prefer precision.

Normally, people use the terms “accurate” and “precise” interchangeably, but in the scientific community they each have a unique meaning.

Accuracy is the closeness of a measurement result to a standard value, and precision is the closeness of multiple measurement results to each other under the same conditions.
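Both ideas can be put in numbers: accuracy as the offset of the average reading from the standard value, and precision as the spread among repeated readings. Here is a minimal sketch; the readings and the 1.00 mg/L reference value below are made up for illustration.

```python
import statistics

def accuracy_and_precision(readings, reference):
    """Summarize repeated measurements of the same sample.

    Accuracy here is the offset of the mean reading from the reference
    (standard) value; precision is the standard deviation of the
    readings, i.e. how closely they agree with each other.
    """
    mean = statistics.mean(readings)
    accuracy_offset = mean - reference          # closeness to the standard value
    precision_spread = statistics.stdev(readings)  # scatter among repeats
    return accuracy_offset, precision_spread

# Hypothetical chlorine readings (mg/L) against a 1.00 mg/L standard:
titration = [0.98, 1.10, 0.92]    # right on average, but scattered (accurate, imprecise)
colorimeter = [1.08, 1.09, 1.08]  # offset from the standard, tightly grouped (precise, less accurate)

print(accuracy_and_precision(titration, 1.00))
print(accuracy_and_precision(colorimeter, 1.00))
```

The titration set averages out to the standard value but the individual readings disagree; the colorimeter set reads consistently high. The same pair of numbers describes the dart throws below.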

Take throwing darts at a bull’s eye: on your first throw, you hit the center of the dart board with a great degree of accuracy. The fact that the second dart is stuck in the wall and the third in your friend Carl’s leg is a testament to your inability to be precise (to repeat your performance). If, on a subsequent turn, you managed to get all three darts in Carl’s leg, you would be described as throwing precisely with absolutely no accuracy… I mean, assuming you weren’t aiming at Carl.

To apply this to the lab, let’s look at taking a chlorine reading in water. Because a titration method is subject to minimal chemical interference, it is said to be an accurate way to measure chlorine. However, many titrators use an analog needle movement that is difficult to read, and the rate at which the reagent is added to the sample is not metered. It is unlikely that two operators would add reagent in exactly the same way, or see the needle in exactly the same place.

A colorimeter, which can have some accuracy issues because of chemical interferences, tends to be very precise from user to user. The method is performed the same way by each user, and the digital display removes the need for interpretation.

You will often find this trade-off between accuracy and precision when selecting methods or equipment for the lab. In the chlorine example, Carl, a water plant operator, may decide against titration because he is more worried about consistent readings from shift to shift than about achieving an accuracy better than 0.05 mg/L. (And he’s really tired of getting hit by darts.)