Calibration Definitions and Terminology

Accuracy: The conformity of a measurement to an accepted standard value. Thus, 3.14159 is a more accurate statement of pi than is 3.14160. Accuracy includes traceability to NIST or some appropriate national or international standards organization. It also includes all other uncertainties and nonlinearities.

Calibration Accuracy: The sum of the uncertainties in the calibration procedure, including the uncertainties in the references, test instruments, transfers, etc. Calibration accuracy must be better than the stated accuracy or initial accuracy.

Initial Accuracy: Accuracy at the time of shipment. All accuracy references in this catalog shall operationally be understood as the initial accuracy.

Adjustment to Nominal: The maximum allowable difference between the actual value supplied with the standard and the nominal value.

Stability or Long-Term Accuracy: The specification that predicts the worst-case error for the period indicated, typically one year. To determine the worst-case error after one year, the one-year stability is added to the initial accuracy.
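As a quick numerical sketch of that rule (the figures below are hypothetical, not taken from any IET specification):

```python
# Worst-case error after one year = initial accuracy + one-year stability.
# All values in ppm; the numbers are illustrative assumptions.
initial_accuracy_ppm = 1.0   # accuracy at time of shipment
stability_ppm = 2.0          # one-year stability specification

worst_case_ppm = initial_accuracy_ppm + stability_ppm
print(worst_case_ppm)  # 3.0 ppm worst-case error after one year
```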

Transfer Accuracy: A comparison of two nearly equal measurements over a limited time and temperature range. IET's HATS-LR and HATS-Y transfer standards may be used as described below to transfer accuracies over three decades. See below for a tutorial on the use of transfer standards (Page 26, Page 27).

Short-Term Accuracy: The limit that errors will not exceed during a 24-hour period of continuous operation. Unless otherwise specified, no zeroing or adjustments of any kind are permitted. The transfer accuracy obtained with IET's transfer standards is a short-term accuracy.

Test Conditions: The assumptions and facts describing the environment, the instrument, and the sample to be measured, including temperature, relative humidity, power, frequency, etc. If a standard is used under other conditions (e.g., at a different voltage, temperature, or power), then the appropriate coefficient (temperature, power, voltage, or other) may be used to predict the value of the quantity under the nonstandard conditions.
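A minimal sketch of such a prediction using a temperature coefficient; the function name, reference temperature, and example values are illustrative assumptions, not IET specifications:

```python
def predicted_value(nominal_ohms, tempco_ppm_per_degC, temp_c, ref_temp_c=23.0):
    """Predict a resistance at temp_c from its value at the reference temperature,
    using a linear temperature coefficient in ppm/degC (assumed model)."""
    return nominal_ohms * (1 + tempco_ppm_per_degC * 1e-6 * (temp_c - ref_temp_c))

# e.g., a 1 ohm standard with a 5 ppm/degC tempco used at 28 degC
# shifts by 5 degC x 5 ppm/degC = 25 ppm above its 23 degC value.
print(predicted_value(1.0, 5.0, 28.0))
```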

Resolution: The value represented by one bit in the display of a digital instrument. For example, if one bit represents 1 mΩ, then the resolution is 1 mΩ.

Precision: The degree of exactness with which a measurement or quantity is stated - e.g., 3.14159 is a more precise value of pi than 3.14.

Repeatability: The closeness of agreement among a number of consecutive measurements performed under the same operating conditions. Long-term and short-term repeatability are both important.


See the IET SRL Series for an example of how the above definitions apply. Note the example below.


  • A 1 Ω SRL standard is specified with an Adjustment to Nominal of ±2 ppm.
  • A particular unit may be supplied with a value of 1 Ω + 0.8 ppm. This value would be given with the unit.
  • This standard would be accurate to 1.000 000 8 Ω ±1 ppm, 1 ppm being the Calibration Accuracy. In the case of the SRL, the Initial Accuracy is specified by IET to be the same as the Calibration Accuracy.
  • For predicting the value with time, the Stability, typically ±2 ppm, would be added for one year.
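The SRL example above can be checked numerically. This is a sketch using the example's own figures; the variable names are illustrative:

```python
# Figures from the SRL example, in ppm relative to a 1 ohm nominal value.
supplied_offset_ppm = 0.8      # unit shipped at 1 ohm + 0.8 ppm
calibration_accuracy_ppm = 1.0  # for the SRL, equal to the initial accuracy
stability_1yr_ppm = 2.0         # typical one-year stability

# Certified value of this particular unit.
value_ohms = 1.0 * (1 + supplied_offset_ppm * 1e-6)   # 1.000 000 8 ohm

# Uncertainty band at shipment, and worst-case band after one year.
band_at_shipment_ppm = calibration_accuracy_ppm               # +/- 1 ppm
band_after_1yr_ppm = calibration_accuracy_ppm + stability_1yr_ppm  # +/- 3 ppm

print(value_ohms, band_at_shipment_ppm, band_after_1yr_ppm)
```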