The resolution is the smallest difference by which two adjacent measured values can still be distinguished; it thus defines the smallest change that is clearly recognizable.
For visual displays, the vocabulary of metrology defines the resolution as the "smallest difference that can be meaningfully distinguished".
The term resolution is closely related to the concept of the threshold: the threshold is the largest change that causes no detectable change in the indication.
Resolution is not synonymous with accuracy.
A "meaningful distinction" between two measured values is given, for example, when a person sees a clearly perceptible difference in the graphical display of the values: the difference is visible for a sufficiently long time and with a sufficiently large amplitude.
ME measuring systems defines the resolution as the difference between the maximum and minimum value of the last 10 seconds, or alternatively of the last 100 measured values.
This is a strict definition of resolution. Alternatively, one could also use the rectified mean (or RMS) value of the last 3 seconds, or of the last 30 measured values.
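As a sketch, the two definitions above could be computed from a stream of digitized readings like this (the variable names, sample rate, and noise figures are illustrative assumptions, not part of any ME software):

```python
import numpy as np

# Hypothetical stream of digitized bridge readings in mV/V:
# a constant signal with ~10 ppm of amplifier noise (illustrative only).
rng = np.random.default_rng(0)
readings = 1.0 + rng.normal(scale=1e-5, size=1000)

# Strict definition: peak-to-peak spread of the last 100 measured values.
window = readings[-100:]
resolution_pp = window.max() - window.min()

# Softer alternative: RMS of the deviations over the last 30 values.
recent = readings[-30:]
resolution_rms = np.sqrt(np.mean((recent - recent.mean()) ** 2))

print(resolution_pp, resolution_rms)
```

The peak-to-peak value is necessarily larger than the RMS of the deviations, which is why the first definition is the stricter one.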
Transferred to a graphics screen, one could say: for the resolution, a difference of one line width must be visible in order to be clearly perceptible.
Since the absolute resolution is usually very small compared to the measuring range (e.g. 1/10,000 to 1/100,000 of the range), and since it depends on the measuring range of the sensor, ME measuring systems forms a relative numerical value for the resolution: the resolution is referred to the measuring range, and the reciprocal is taken for better "readability".
Since the resolution is essentially determined by the measuring amplifier used, it is referred to the amplifier's standard measuring range of 2.0 mV/V or 3.5 mV/V.
The resolution is thus described by a numerical value that states how many times the measuring range can be divided into steps of one resolution width. The higher the number of parts, the better ("higher") the resolution.
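Putting the last three paragraphs together, the "parts" figure is just the measuring range divided by the absolute resolution (the numbers below are illustrative, not a specification):

```python
# Relative resolution expressed in "parts" (illustrative numbers).
measuring_range = 2.0    # standard amplifier range in mV/V
resolution_abs = 0.0002  # observed peak-to-peak spread in mV/V

# Reciprocal of the relative resolution: how many resolution-wide
# steps fit into the measuring range.
parts = measuring_range / resolution_abs
print(parts)  # 10000.0 -> the range divides into 10,000 distinguishable steps
```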
The software GSVmulti can display the resolution according to different definitions: as a numerical value in "parts", as the absolute quantity "peak value max minus min", as a noise amplitude in decibels, etc.
The relative resolution (relative to the measuring range) is essentially a quality feature of the measuring amplifier: the inherent noise of the first amplification stage is decisive for the resolution. With 16- or 24-bit technology, the resolution of the analog-to-digital converter (the quantization noise) is usually better (finer) than the amplifier's noise amplitude.
This is also why exactly matching the gain to the input range of the analog-to-digital converter does not significantly improve the resolution: doubling the gain also doubles the noise amplitude.
The bandwidth of the measurement essentially determines the noise amplitude. If the bandwidth is limited by filters to e.g. 0…10 Hz, the noise amplitude is much lower than with a bandwidth of 0…100 Hz or even 0…1 kHz.
With noise distributed uniformly over all frequency components (white noise), a tenfold bandwidth results in a √10 ≈ 3.2-fold noise amplitude.
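The square-root scaling above can be captured in a small helper (the function name and reference bandwidth are assumptions for illustration):

```python
import math

def noise_ratio(bw_new_hz: float, bw_ref_hz: float) -> float:
    """For white noise, the amplitude scales with the square root
    of the bandwidth ratio."""
    return math.sqrt(bw_new_hz / bw_ref_hz)

print(noise_ratio(100.0, 10.0))   # ~3.16: tenfold bandwidth, ~3x the noise
print(noise_ratio(1000.0, 10.0))  # 10.0: hundredfold bandwidth, 10x the noise
```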
In addition to the "inherent noise" of the measuring amplifier, external influences determine the achievable resolution. In particular, shielding of the sensor lines is an essential prerequisite for a high resolution.
Further influences such as vibration, drafts, or heat input also reduce the resolution.
A higher supply voltage is often considered as a means of reducing noise, since increasing the supply, and with it the bridge output signal, allows the gain to be reduced and thus reduces the inherent noise.
However, the temporally and spatially non-uniform self-heating of the strain gauges within a sensor leads to thermal noise, which reduces the resolution. Supply voltages of 2.5 V to 5 V have proven to be optimal. A high resistance of the strain gauges leads to higher resistance noise and, because of the higher input impedance, to more interference at the amplifier input.
A reduction in the bridge supply voltage can lead to better stability of the bridge circuit.
A higher gauge factor of the strain gauge does not necessarily lead to a higher resolution. A high resolution requires good "self-compensation" and, above all, similar temporal behavior with respect to drift and creep effects.
The following numbers clarify the requirements:
In fact, the maximum strain of most sensors is only 500 µm/m, the resolution of the measuring amplifier GSV-8 is about 100,000 parts, and the measuring grid length is often only 1.6 mm!
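A back-of-the-envelope calculation with exactly these figures shows what the amplifier must resolve:

```python
# Numbers quoted above: 500 um/m maximum strain, ~100,000 parts (GSV-8),
# 1.6 mm measuring grid length.
max_strain = 500e-6   # m/m
parts = 100_000
grid_length = 1.6e-3  # m

# Smallest resolvable strain step, and the corresponding change in
# length of the measuring grid.
smallest_strain = max_strain / parts          # 5e-9 m/m
displacement = smallest_strain * grid_length  # length change over the grid
print(displacement)  # 8e-12 m, i.e. 8 picometers
```

The smallest resolvable step thus corresponds to a length change of only about 8 picometers over the grid, far below an atomic diameter, which illustrates how demanding the noise requirements on the first amplifier stage are.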