Technical data

What does the unit mV/V mean?

The output signal of a force sensor or a Wheatstone bridge depends on the level of the supply voltage.

The higher the supply voltage, the higher the output signal.

By specifying the output per volt of supply voltage, sensors can be compared independently of their supply voltage.

Sensors with strain gauges deliver approximately 0 mV/V in the unloaded condition and, depending on the sensor, approximately 0.5 mV/V to 3.5 mV/V at full load.

The measuring amplifiers are designed for this signal level: measuring amplifiers with a bridge supply voltage of 5 V, for example, handle input voltages between -17.5 mV and +17.5 mV.
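This relationship can be written down as a minimal sketch in Python (the sensitivity, supply voltage and load values are example figures, not taken from a specific datasheet):

    # Output signal of a strain-gauge bridge: sensitivity (mV/V) x supply voltage (V) x load fraction
    def bridge_output_mv(sensitivity_mv_per_v, supply_v, load_fraction):
        """Bridge output in mV; load_fraction = 0.0 means unloaded, 1.0 means full load."""
        return sensitivity_mv_per_v * supply_v * load_fraction

    # Example: 3.5 mV/V sensor with 5 V bridge supply at full load -> 17.5 mV
    print(bridge_output_mv(3.5, 5.0, 1.0))   # 17.5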

What is "input sensitivity"?

The technical datasheets of bridge amplifiers usually do not specify the amplification factor and the bridge supply voltage separately; instead, the input sensitivity is given in mV/V.

Imagine a sensor with a 2 mV/V bridge output (at 100% load) and an amplifier with a 5 V bridge supply:
2 mV/V x 5 V supply voltage = 10 mV
multiplied by a gain of 1000:
10 mV x 1000
results in 10 V output voltage at the amplifier.

Instead, it is easier to define it as follows:
2 mV/V output signal of the sensor (at full load)
2 mV/V input sensitivity of the amplifier
These match -> 100% output of the amplifier at 100% load on the sensor.
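As a sketch (the helper function is hypothetical, not part of any amplifier software), the matching of sensor output and amplifier input sensitivity can be expressed like this; note that the bridge supply voltage cancels out:

    # Amplifier output as a fraction of full scale for a given sensor load
    def amplifier_output_fraction(sensor_sensitivity_mv_v, input_sensitivity_mv_v, load_fraction):
        """Both sensitivities in mV/V; the bridge supply voltage cancels out."""
        return sensor_sensitivity_mv_v * load_fraction / input_sensitivity_mv_v

    # 2 mV/V sensor on an amplifier with 2 mV/V input sensitivity:
    print(amplifier_output_fraction(2.0, 2.0, 1.0))   # 1.0 -> 100% output at 100% load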

What does "characteristic value" mean?

The characteristic value of a sensor is the change of the output signal at 100% load.

The term is defined in the VDI guideline 2638: output signal at rated force, decreased by the zero signal ...

The output signal is expressed in mV per volt bridge voltage (mV/V).

Typical characteristic values of sensors are 2 mV/V, 1 mV/V, or 0.5 mV/V, and sometimes also 3 mV/V. A sensor with a characteristic value of 2 mV/V (at nominal force) therefore provides a 10 mV output signal with a bridge supply voltage of 5 V (as the differential voltage between the + bridge output and the - bridge output).
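A short sketch of the definition from VDI 2638, with example figures for the zero signal and the signal at rated force:

    # Characteristic value = output signal at rated force minus zero signal (both in mV/V)
    def characteristic_value(signal_at_rated_force_mv_v, zero_signal_mv_v):
        return signal_at_rated_force_mv_v - zero_signal_mv_v

    # Example: 2.003 mV/V at rated force, 0.003 mV/V zero signal -> 2.000 mV/V
    print(characteristic_value(2.003, 0.003))   # 2.0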

What is the difference between characteristic value and rated characteristic value?

The terms are defined in VDI 2638 (for force sensors) and VDI 2639 (for torque sensors).
The characteristic value is the actual output of the sensor; it is therefore documented in the test report or in the calibration certificate.
The rated characteristic value (or rated sensitivity) is the "target" (theoretical value) for the characteristic value. The rated sensitivity is therefore given in the general technical data sheet for the sensor type (not in the test report for a specific, real sensor).

What does "strain" mean?

Strain is a measure of the relative change in length, Δl/l.

The units of strain are: m/m, μm/m, ppm (parts per million), %, ‰.
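A small sketch of the definition and the unit conversions (the elongation and gauge length are example figures):

    # Strain = relative change in length: epsilon = delta_l / l
    delta_l = 0.1e-3              # elongation in m (0.1 mm, example value)
    l = 0.1                       # gauge length in m (100 mm, example value)
    strain = delta_l / l          # ~0.001 m/m

    print(strain * 1e6)    # ~1000 um/m (= 1000 ppm)
    print(strain * 100)    # ~0.1 %
    print(strain * 1000)   # ~1 per mille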

What does cut-off frequency mean?

A low-pass filter is usually used in measuring amplifiers with analog output. But the electronic components themselves, e.g. interference-suppression circuits or instrumentation amplifiers, also behave like a low-pass filter: signal components at higher frequencies are amplified less than signal components at lower frequencies (the gain-bandwidth product of the instrumentation amplifier is responsible for this).

As a rule, low-pass filters with cut-off frequencies well below the capabilities of the instrumentation amplifier are used deliberately. Cut-off frequency means: a signal with this frequency is transmitted with only approx. 70% of the expected amplitude. The "order" of the low-pass filter describes how strongly the amplitude is reduced (attenuated) with increasing frequency, i.e. the steepness of the filter characteristic curve. Third-order filters are common for amplifiers with analog output. For a third-order filter, the attenuation above the cut-off frequency is -60 dB per decade (20 dB per decade per filter order): if the frequency increases by a factor of 10, the amplitude decreases by a factor of 1000.
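These numbers can be checked with a small sketch of the amplitude response of an n-th-order Butterworth low-pass (the filters in a real amplifier may differ in detail):

    import math

    # |H(f)| of an n-th-order Butterworth low-pass with cut-off frequency fc
    def butterworth_gain(f, fc, order):
        return 1.0 / math.sqrt(1.0 + (f / fc) ** (2 * order))

    fc = 250.0   # example cut-off frequency in Hz
    print(butterworth_gain(fc, fc, 3))        # ~0.707 -> approx. 70% at the cut-off frequency
    print(butterworth_gain(10 * fc, fc, 3))   # ~0.001 -> amplitude / 1000 one decade above fc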

The reason for using a filter is that it reduces the statistical dispersion of the signals: the noise amplitude is reduced if the "bandwidth", i.e. the range of frequencies that pass through the filter undamped, is restricted. This improves the resolution of the measuring amplifier. The rule of thumb is: one tenth of the bandwidth reduces the noise amplitude by a factor of 3. This holds in particular if the noise contains all frequencies equally (white noise).
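The rule of thumb can be checked numerically with a rough sketch (white noise, a simple moving average as the band-limiting filter):

    import numpy as np

    rng = np.random.default_rng(0)
    noise = rng.normal(0.0, 1.0, 100_000)   # white noise: all frequencies equally present

    # Restricting the bandwidth to roughly one tenth ~ averaging over 10 samples
    filtered = np.convolve(noise, np.ones(10) / 10, mode="valid")

    print(noise.std() / filtered.std())     # ~3 (more precisely sqrt(10) ~ 3.16)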

Usually, there are some prominent frequency components within the noise, e.g. 50 Hz from mains interference.

When selecting the low-pass filter, the cut-off frequency should be at least 5 times higher than the signal frequencies to be measured so that the measurement result is not distorted. The cut-off frequency should be chosen as low as possible and as high as necessary.

https://www.me-systeme.de/en/technology-first/electronics/basics/filter

https://www.me-systeme.de/en/technology-first/electronics/basics/comparison

https://www.me-systeme.de/en/technology-first/electronics/gsv-6/aufloesung-gsv-6bt

What does data frequency mean?

Measuring amplifiers with a digital interface convert the analog signal into a digital signal with the help of an analog-to-digital converter. This happens at fixed (often configurable) intervals: the analog-to-digital converter samples the analog signal at the so-called sampling frequency. As with analog filters, the scatter of the signals is reduced; this is done with digital filters.

The simplest digital filter is the averaging filter. The use of a digital filter (usually) reduces the amount of data. The data made available via the interface (USB, RS232, UART, CAN bus, Ethernet, etc.) are sent at fixed intervals at the data frequency.

A very low data frequency often has a negative effect on a control loop. Therefore, the moving average is usually used. With the moving average, the number of data points at the output remains equal to the number of data points at the input of the filter. High frequencies are filtered out, which can be seen most clearly in the so-called step response of the filter: a signal jump at the input is reproduced at the output with a transient response time.
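A minimal sketch of such a moving-average filter and its step response:

    import numpy as np

    def moving_average(x, m):
        """Moving average over m samples; the output has the same length as the input."""
        kernel = np.ones(m) / m
        return np.convolve(x, kernel, mode="full")[: len(x)]

    step = np.concatenate([np.zeros(5), np.ones(10)])   # signal jump at the input
    print(moving_average(step, 4))   # the output rises over 4 samples instead of jumping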

The time constant of an averaging filter is the number of averages m times the sampling interval T (to a very good approximation from about 100 averages). The cut-off frequency Fg of the filter is related to this time constant: Fg = 1/(2 π m T).

At an internal sampling frequency of e.g. 10,000 Hz and a configured data frequency of e.g. 100 Hz, the cut-off frequency is about 16 Hz. Rule of thumb: the cut-off frequency is about 1/6 of the (set) data frequency.
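The example figures above, written out as a sketch:

    import math

    sampling_frequency = 10_000.0   # Hz, internal sampling rate
    data_frequency = 100.0          # Hz, configured data frequency

    m = sampling_frequency / data_frequency   # number of averaged samples (100)
    T = 1.0 / sampling_frequency              # sampling interval in seconds

    fg = 1.0 / (2 * math.pi * m * T)          # cut-off frequency of the averaging filter
    print(fg)                                 # ~15.9 Hz, about 1/6 of the data frequency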

https://www.me-systeme.de/en/technology-first/electronics/basics/filter

https://www.me-systeme.de/en/technology-first/electronics/basics/comparison

https://www.me-systeme.de/en/technology-first/electronics/gsv-6/aufloesung-gsv-6bt