Sensors::Accuracy

In metrology, accuracy is defined as the degree of agreement between the displayed (measured) value and the "correct" (true) value.

In the International Vocabulary of Metrology (VIM), accuracy is defined as:

"Extent of approximation of a measured value to a true value of a measurand"

A measuring device (a sensor, a display device) is considered accurate if it has both high measurement precision and high measurement trueness.

The "measurement acuracy" is not a size and is not expressed quantitatively. A measurement is "more accurate" if it has a smaller measurement error.

The "measuring correctness" is also no size. With a high degree of measurement correctness, systematic errors and absolute deviations are small.

The "measurement precision" describes the "degree of agreement of displays or measured values ​​obtained by repeated measurements on the same or similar objects under given conditions" (VIM, dictionary of metrology).

 

Measurement uncertainty

The measurement uncertainty describes the spread of the measured values. It can be stated, for example, as a standard deviation (or a multiple of the standard deviation). It generally also includes systematic components, e.g. deviations from reference standards. Evaluation method A (Type A) determines the measurement uncertainty with statistical methods applied to values obtained under defined "repeatability conditions" (for example repeated measurements on the same object, with the same operator, at the same location, etc.).
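
A minimal sketch of a Type A evaluation, assuming ten repeated readings under repeatability conditions (values invented for illustration): the experimental standard deviation s is computed, and the standard uncertainty of the mean follows as s/√n.

    import statistics

    # Assumed readings under repeatability conditions (kN)
    readings = [4.998, 5.002, 5.001, 4.999, 5.003, 5.000, 5.002, 4.997, 5.001, 5.000]

    n = len(readings)
    mean = statistics.mean(readings)
    s = statistics.stdev(readings)   # experimental standard deviation
    u_a = s / n ** 0.5               # Type A standard uncertainty of the mean

    print(f"mean = {mean:.4f} kN, s = {s:.4f} kN, u_A = {u_a:.4f} kN")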

All (statistical) components that cannot be assigned to evaluation method A are assigned to evaluation method B (Type B). These are based on other information, e.g. experience, the technical data of a calibration certificate, the accuracy class of a verified measuring instrument, drift, etc.

The "standard measurement uncertainty" is a measurement uncertainty, which is determined as the standard deviation.

The relative standard uncertainty is the standard uncertainty divided by the absolute value of the measured value; it is usually given as a percentage.
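
As a sketch of how these pieces combine (all numbers assumed): a Type A component and a Type B component, e.g. taken from a calibration certificate, are added in quadrature to form the combined standard uncertainty, which is then related to the measured value to give the relative standard uncertainty in percent.

    u_a = 0.0006   # kN, Type A standard uncertainty (assumed)
    u_b = 0.0010   # kN, Type B standard uncertainty, e.g. from a certificate (assumed)
    x   = 5.0005   # kN, measured value (assumed)

    u_c = (u_a ** 2 + u_b ** 2) ** 0.5   # combined standard uncertainty
    u_rel = u_c / abs(x) * 100           # relative standard uncertainty in %

    print(f"u_c = {u_c:.4f} kN, relative standard uncertainty = {u_rel:.3f} %")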

 

Accuracy class

According to the VIM, International Vocabulary of Metrology:

" Class of measuring instruments or measuring systems that meet specified metrological requirements designed to ensure that the measurement errors or equipment uncertainties remain within specified limits under specified operating conditions."

The accuracy class is generally identified by a (positive) number, or by a character or symbol.

The accuracy class thus serves as a summary (and strongly simplifying) selection criterion for comparing similar sensors.

For force and torque sensors, the following properties are used for classification into an accuracy class (a hypothetical class check is sketched after this list):

  • relative standard measurement uncertainty
  • relative linearity deviation and hysteresis
  • temperature-related drift of the zero signal
  • temperature-related drift of the slope of the characteristic curve
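
As a minimal sketch of how these criteria feed into a classification, the following Python snippet checks assumed property values (all in % of full scale, purely illustrative) against a class limit; the function name and the numbers are hypothetical, not a normative procedure.

    def meets_accuracy_class(properties_percent: dict[str, float], class_limit: float) -> bool:
        """True if every classification property (in % FS) stays within the class limit."""
        return all(value <= class_limit for value in properties_percent.values())

    # Assumed example values in % FS (hypothetical, for illustration only)
    sensor = {
        "relative standard measurement uncertainty": 0.10,
        "relative linearity deviation and hysteresis": 0.04,
        "zero-signal drift per 10 K": 0.14,
        "slope drift per 10 K": 0.12,
    }
    print(meets_accuracy_class(sensor, 0.2))  # True  -> class 0.2 would be plausible
    print(meets_accuracy_class(sensor, 0.1))  # False -> class 0.1 not met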

Example load cell KM40

The load cell KM40 is specified in the data sheet with accuracy class 0.5.

The relative standard measurement uncertainty is determined, for example, from the standard deviation, in particular when more than 10 measurements have been carried out.

When calibrating a sensor, three series of measurements are usually performed, with the force increased in 5 or 10 steps, in order to determine the repeatability and the linearity deviation.

The repeatability or "span" brv is determined as the maximum difference between the output signals at the same force in the unchanged mounting position, related to the average output signal reduced by the zero signal in the installed state. brv is a measure of comparability.

 

Fig. 1: Result of the calibration of a load cell KM40 5 kN

The data sheet of the load cell KM40 gives accuracy class 0.5. In the present (representative) example, the span at 25 % of the rated load is 0.16 % of 1.25 kN (i.e. of the actual value). Since a standard deviation cannot be formed from such a small number of measured values, the calibration protocol instead uses the absolute difference between the maximum and minimum of the three measured values, related to the actual value and expressed in percent.
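
The protocol's span calculation can be reproduced in a few lines; the three output values below are invented, but chosen to be consistent with the 0.16 % figure quoted above.

    actual = 1.25                        # kN, 25 % of the 5 kN rated load
    outputs = [1.2493, 1.2513, 1.2501]   # kN, zero-reduced readings of the 3 series (assumed)

    span_percent = (max(outputs) - min(outputs)) / actual * 100
    print(f"span at 25 % load: {span_percent:.2f} % of the actual value")   # -> 0.16 %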

Based on the span of 0.16 % at the 25 % load step, the KM40 force sensor could thus be classified in accuracy class 0.2.

Another classification criterion is the relative linearity deviation. At 0.04 %, it is also significantly smaller than the 0.2 % class limit. The relative linearity deviation describes the maximum deviation of the characteristic curve of a force transducer from the reference straight line, determined with increasing force and related to the full-scale value.
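
A minimal sketch of this evaluation with assumed load steps and output signals (chosen to be consistent with the 0.04 % figure): the reference straight line runs through zero and the full-scale point, and the largest deviation of the ascending characteristic from it is related to full scale.

    forces  = [0.0, 1.25, 2.50, 3.75, 5.00]           # kN, ascending load steps (assumed)
    signals = [0.0, 0.2504, 0.5003, 0.7502, 1.0000]   # mV/V, measured outputs (assumed)

    full_scale = signals[-1]
    slope = full_scale / forces[-1]                   # reference line through zero and full scale

    deviations = [abs(s - slope * f) for f, s in zip(forces, signals)]
    d_lin = max(deviations) / full_scale * 100        # relative linearity deviation in % FS
    print(f"relative linearity deviation: {d_lin:.2f} % FS")   # -> 0.04 % FS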

To determine the hysteresis, calibration with ascending and descending load would be required. A special case of hysteresis is the zero-point return error (at 0 % load). This is indicated in the present calibration record as 0.00 % (beginning and end of the measurement series). Since the force sensor is made of high-strength spring steel, hysteresis is usually attributable to systematic causes in the setup, e.g. the use of linear guides, inadequately ground support surfaces for the force sensor, storage of spring energy in the force-introduction accessories, etc.

The temperature-induced drift of the slope depends on the properties of the spring steel (decrease of the modulus of elasticity with increasing temperature) and on the properties of the strain gauge (increase or decrease of the k-factor with increasing temperature). These systematic influences are well known and are compensated to well below 0.2 %/10 °C; they therefore only need to be measured as part of a type approval, or can even be derived from the technical data of the strain gauge.

For the classification of the force sensor in accuracy class 0.5, the temperature-related drift of the characteristic value (the slope) should be less than 0.5 %/10 °C.

The temperature-related drift of the zero signal must be measured and compensated individually for each sensor.
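
A minimal sketch of such an individual compensation, assuming a linear zero-drift model with a per-sensor temperature coefficient (the coefficient below matches the 0.14 % FS/10 K order of magnitude discussed further down, but is otherwise invented):

    TC_ZERO = 0.00014   # mV/V per K, assumed individual zero-drift coefficient
    T_REF   = 20.0      # °C, assumed reference temperature of the zero adjustment

    def compensate_zero_drift(signal_mvv: float, temperature_c: float) -> float:
        """Remove the linearly modelled temperature-induced zero drift from a raw signal."""
        return signal_mvv - TC_ZERO * (temperature_c - T_REF)

    print(compensate_zero_drift(0.5084, 80.0))   # raw reading at 80 °C -> approx. 0.5000 mV/V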

Fig. 2 shows the temperature-induced drift of the zero signal for a KM40 5 kN sensor:

 

Fig. 2: Temperature-induced drift of the KM40 5 kN SN18207149 between 20 °C and 80 °C

Fig. 3: Measurement of the temperature-induced drift of the zero point of the KM40 5 kN.

For the classification of a force sensor in accuracy class 0.5, the temperature-induced drift of the zero signal over a temperature range of 10 °C should be less than 0.5 % of the characteristic value of the sensor.

With a characteristic value of 1 mV/V (FS, "Full Scale"), this means a maximum drift of 0.005 mV/V/10°C.

Fig. 2 and Fig. 3 show the drift over 60 °C. For the present force sensor the drift is thus 0.00838 mV/V/60 K = 0.0014 mV/V/10 K = 0.14 % FS/10 K, well within the 0.5 % FS/10 K limit required for accuracy class 0.5.
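
The unit conversion can be checked in a few lines of Python (the drift value is taken from the figures above):

    drift_60k  = 0.00838   # mV/V over 60 K (20 °C to 80 °C), from Fig. 2/3
    full_scale = 1.0       # mV/V, characteristic value

    drift_10k = drift_60k / 6                  # -> 0.0014 mV/V per 10 K
    drift_pct = drift_10k / full_scale * 100   # -> 0.14 % FS per 10 K
    print(f"{drift_10k:.4f} mV/V/10K = {drift_pct:.2f} % FS/10K")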