When weighing a sample with a balance scale, the goal is to get as accurate a reading as possible. Several factors determine how accurate a reading has to be in order to be acceptable, and whether a specific balance scale will do the job.
When we’re determining whether a given balance scale will be sufficient, we’re looking at two things: readability and maximum relative measurement uncertainty tolerance.
The first is simply the smallest increment a balance can display. Readability isn't the same as accuracy, but the two are related: generally, finer readability supports higher accuracy, since you can weigh more accurately with a 0.1mg balance than with a 1mg balance.
The second defines your accepted level of uncertainty: the smaller the tolerance, the more accurate the result must be. It is derived from the standard deviation of the balance's repeatability (Rstd), typically with a safety factor of 2 applied (2 x Rstd).
You can take the required accuracy and perform a calculation to see whether the balance scale you are using has the proper readability for the minimum weight. Say you are weighing a 50mg sample, your maximum uncertainty tolerance is 0.1%, and the balance's readability is 0.1mg.
The calculation is minimum weight = (safety factor x readability) / tolerance. Here that is 2 x 0.1mg / 0.001, which equals 200mg. Since 200mg is greater than your 50mg sample, the 0.1mg readability isn't sufficient. A finer readability of 0.01mg gives 2 x 0.01mg / 0.001 = 20mg, which is below 50mg, so it meets the stated accuracy requirement for the sample.
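The check above can be sketched as a short function. This is a minimal illustration, not a metrology tool: the function name and parameters are made up for this example, and it assumes (as the article does) that readability stands in for the repeatability term, with a safety factor of 2.

```python
def minimum_weight_mg(readability_mg, max_rel_uncertainty, safety_factor=2):
    """Smallest sample mass (in mg) that satisfies the relative
    uncertainty tolerance, using readability as the uncertainty source
    (an assumption carried over from the article's example)."""
    return safety_factor * readability_mg / max_rel_uncertainty

# 0.1 mg readability, 0.1% tolerance: minimum weight is 200 mg,
# so a 50 mg sample is too light for this balance.
print(minimum_weight_mg(0.1, 0.001))   # 200.0

# 0.01 mg readability: minimum weight drops to 20 mg,
# so the 50 mg sample can be weighed within tolerance.
print(minimum_weight_mg(0.01, 0.001))  # 20.0
```

Checking whether a given sample clears the computed minimum is then a simple comparison, e.g. `sample_mg >= minimum_weight_mg(0.1, 0.001)`.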
It’s important to evaluate balance scales against both their readability and the accuracy requirements of your measurements. Using the wrong balance scale for the job will produce inaccurate readings. And once you have the right scale, regular scale calibration also contributes to achieving higher accuracy. Use the equation above to determine what readability is acceptable, and find a balance scale that can deliver it.