The basic technical terms below are often used to describe the quality and quantity of the readings displayed by instruments. Here we explain each term so that its significance and meaning are clear.
Span:
The algebraic difference between the upper and lower range values is defined as the span of an instrument.
Range 0 to 150 N, therefore span = 150 N.
Range 20 to 220 °F, therefore span = 200 °F.
Range 20 to 150 psi, therefore span = 130 psi.
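The arithmetic above can be sketched as a small helper (a minimal sketch; the function name `span` is our own, not from any standard):

```python
def span(lower_range_value, upper_range_value):
    """Span is the algebraic difference between the upper and lower range values."""
    return upper_range_value - lower_range_value

print(span(0, 150))    # 150 (N)
print(span(20, 220))   # 200 (°F)
print(span(20, 150))   # 130 (psi)
```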
Elevated Zero Range:
A range in which the zero value of the measured variable or measured signal is greater than the lower range value is called an elevated zero range.
Example: -25 to 50 psi.
Suppressed Zero Range:
A range in which the zero value of the measured variable is less than the lower range value is called a suppressed zero range.
Example: 20 to 100 psi.
Measured Variable:
Another basic technical term, used when measuring the quantity, size, weight, distance, or capacity of a substance against a designated standard. It is sometimes referred to as the measurand, which is defined as a quantity intended to be measured. Examples: temperature, pressure, rate of flow.
Measured Signal:
When an electrical, mechanical, pneumatic, or other variable is applied to the input of a device to be measured, that variable becomes the analog of the measured variable, usually produced by a transducer.
Example: In a thermocouple thermometer, the measured signal is an EMF which is the electrical analog of the temperature applied to the thermocouple.
In a flow meter, the measured signal may be a differential pressure which is the analog of the rate of flow through an orifice.
In an electric tachometer system, the measured signal may be a voltage which is the electric analog of the speed of rotation of the part coupled to the tachometer generator.
Output Signal:
An output signal is the variable produced by a device, element, or system in response to an input signal.
Accuracy:
The accuracy of an instrument indicates the deviation of its reading from a known true value, and is typically expressed in one of three ways:
1. Percentage of full scale reading (upper range value).
Example: A 100 kPa pressure gauge with an accuracy of ±1 % of full scale would be accurate within ±1 kPa over the entire range of the instrument.
2. Percentage of span.
Example: A pressure gauge has a span of 200 kPa and an accuracy of ±0.5 % of span.
For a recorded reading of 150 kPa, the true value lies within 150 ± (0.5/100 × 200) = 150 ± 1 kPa, i.e. between 149 kPa and 151 kPa.
3. Percentage of actual reading. Example: for a reading of 2 volts on a voltmeter with an accuracy of ±2 % of reading, the inaccuracy would be ±0.04 volts.
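The three ways of expressing accuracy can be sketched as follows (a minimal illustration; the helper names are our own, not from any instrument library):

```python
def error_full_scale(accuracy_pct, upper_range_value):
    """Error band for accuracy expressed as % of full scale (upper range value)."""
    return accuracy_pct / 100 * upper_range_value

def error_span(accuracy_pct, span):
    """Error band for accuracy expressed as % of span."""
    return accuracy_pct / 100 * span

def error_reading(accuracy_pct, reading):
    """Error band for accuracy expressed as % of the actual reading."""
    return accuracy_pct / 100 * reading

# 1. ±1 % of a 100 kPa full scale -> ±1 kPa
print(error_full_scale(1, 100))

# 2. ±0.5 % of a 200 kPa span, reading 150 kPa -> true value between 149 and 151 kPa
e = error_span(0.5, 200)
print(150 - e, 150 + e)

# 3. ±2 % of a 2 V reading -> ±0.04 V
print(error_reading(2, 2))
```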
Accuracy Vs Precision:
Although the words accuracy and precision sound similar, there are fundamental differences between the two.
Precision (a basic technical term) refers to the fineness and reproducibility with which a quantity can be determined. It can be improved by making multiple independent measurements. Precision, however, says nothing about how the value of a quantity compares to a "known" or "established" value.
Accuracy, another basic technical term, does make this comparison: the accuracy of a measurement refers to how closely the measured value agrees with a "known" or "established" standard.
Precision – When you measure the length of an object with a ruler and are confident that the value, in metres, is correct to three decimal places, you can state that the measurement has a precision of 1 mm.
Accuracy – When you measure an object against a standard whose length is known absolutely (because it is a standard), the difference between the measured value and the known correct value indicates the accuracy of your measurement.
Repeatability:
Repeatability is the ability of an instrument to reproduce the same measurement each time the same set of conditions is applied. This does not imply that the measurement is correct, only that it is the same each time.
Example: The concept of accuracy and repeatability in measurements can be illustrated by the throw of darts.
a. Poor repeatability means poor accuracy.
b. Good accuracy means good repeatability.
c. Good repeatability does not necessarily mean good accuracy.
d. Repeatability does not include hysteresis.
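The dart-board idea can be illustrated numerically: repeatability is the spread of repeated readings, while accuracy is how far their average sits from the known value. This is a minimal sketch with made-up readings; using the standard deviation as the measure of spread is one common choice, and the helper names are our own:

```python
import statistics

def repeatability(readings):
    """Spread of repeated readings taken under identical conditions."""
    return statistics.pstdev(readings)

def accuracy_error(readings, true_value):
    """How far the average reading sits from the known true value."""
    return abs(statistics.mean(readings) - true_value)

true_value = 100.0
# Tight cluster of darts, but far from the bullseye:
# good repeatability, poor accuracy (case c above).
tight_but_offset = [104.9, 105.0, 105.1]

print(repeatability(tight_but_offset))            # small spread
print(accuracy_error(tight_but_offset, true_value))  # large offset from true value
```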
Sensitivity:
Sensitivity is the change in an instrument's or transducer's output per unit change in the measured quantity. A sensitive instrument produces clearly visible changes in response to small variations in the measured quantity. Typically, an instrument with higher sensitivity will also have better repeatability and higher accuracy.
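As a formula, sensitivity is simply the ratio of output change to input change. A minimal sketch (the figure of roughly 0.041 mV per °C is typical of a type K thermocouple and is used here only as an illustration):

```python
def sensitivity(delta_output, delta_input):
    """Change in output per unit change in the measured quantity."""
    return delta_output / delta_input

# A hypothetical thermocouple whose output rises 0.41 mV over a 10 °C change:
print(sensitivity(0.41, 10))  # ≈ 0.041 mV/°C
```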
Resolution:
Resolution is the smallest increment of change in the measured value that can be determined from the instrument's display scale.
Dead Band:
In process instrumentation, dead band is the range through which an input signal may vary, upon reversal of direction, without initiating an observable change in the output signal. Dead band is usually expressed as a percentage of span.
Hysteresis:
An instrument is said to exhibit hysteresis when its readings differ depending on whether the measured value is approached from above or from below. Hysteresis usually results from an inelastic quality of an element or device; it may be caused by mechanical friction, magnetic effects, elastic deformation, or thermal effects. Hysteresis is expressed as a percentage of span, and it includes dead band.
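Expressed as a percentage of span, hysteresis can be computed from the upscale and downscale readings taken at the same true value. A minimal sketch with made-up numbers (the function name is our own):

```python
def hysteresis_pct_of_span(upscale_reading, downscale_reading, span):
    """Difference between readings approached from below (upscale) and from
    above (downscale), expressed as a percentage of span."""
    return abs(upscale_reading - downscale_reading) / span * 100

# Hypothetical gauge with a 200 kPa span that reads 149.0 kPa on the way up
# and 151.0 kPa on the way down at the same true pressure:
print(hysteresis_pct_of_span(149.0, 151.0, 200))  # 1.0 (% of span)
```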