We have a transducer rated for 0-2000 psi which outputs 0-5 Volts. The specification document states that the accuracy is a combined LHR of +/- 0.5 %FS. How would we check this?
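As a starting point, the spec can be turned into a tolerance band. This is a minimal sketch, assuming a linear 0 psi -> 0 V, 2000 psi -> 5 V scaling; the test pressures and voltages in the example are illustrative only.

```python
# Tolerance band for a 0-2000 psi transducer with a 0-5 V output and a
# combined LHR spec of +/- 0.5 %FS (assumes a linear, zero-based scaling).

FULL_SCALE_PSI = 2000.0
FULL_SCALE_V = 5.0
LHR_PCT_FS = 0.5

tol_psi = FULL_SCALE_PSI * LHR_PCT_FS / 100.0   # +/- 10 psi
tol_v = FULL_SCALE_V * LHR_PCT_FS / 100.0       # +/- 0.025 V

def expected_output_v(pressure_psi):
    """Ideal output voltage for an applied pressure."""
    return pressure_psi / FULL_SCALE_PSI * FULL_SCALE_V

def within_spec(pressure_psi, measured_v):
    """True if the measured voltage lies inside the +/- 0.5 %FS band."""
    return abs(measured_v - expected_output_v(pressure_psi)) <= tol_v

print(within_spec(1000.0, 2.51))   # True  (0.01 V error, band is 0.025 V)
print(within_spec(1000.0, 2.53))   # False (0.03 V error)
```

In practice the check would apply this band to readings taken at several points with both rising and falling pressure, against a reference standard of suitably better accuracy.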
Measurement & Instrumentation Guides
Guidance and reference articles for sourcing, setting up and using measurement equipment.
What is the difference between zero offset and zero drift?
Zero Offset relates to the zero setting tolerance during manufacture and Zero Drift relates to the expected maximum change in zero over time.
Hysteresis
Hysteresis is the difference between two separate measurements taken at the same point, one while the physical quantity is increasing and the other while it is decreasing.
g Effect
The g Effect is a change in performance of a pressure measuring device that is caused by a change in its orientation. Typically a pressure sensing device will have some form of flat diaphragm which will generate a change in output when flexed by a change in pressure. If a pressure sensing diaphragm is orientated […]
LHR – Linearity, Hysteresis and Repeatability
Linearity, Hysteresis and Repeatability (LHR) is often used to describe the room temperature precision of a pressure sensor, excluding all zero & span offsets, temperature errors and long term stability. Also see non-linearity and pressure hysteresis.
PPM – Parts Per Million
Parts Per Million (PPM) is a ratio used to describe the maximum measurement error or resolution of pressure measurement equipment.
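Since 1 % equals 10,000 ppm, converting between the two notations is a one-line calculation; the sketch below is illustrative.

```python
# Convert between percentage and parts-per-million error notation.
# 1 % = 10,000 ppm.

def pct_to_ppm(percent):
    """Express a percentage error in parts per million."""
    return percent * 10_000

def ppm_to_pct(ppm):
    """Express a parts-per-million error as a percentage."""
    return ppm / 10_000

print(pct_to_ppm(0.01))  # a 0.01 %FS error is 100 ppm of full scale
print(ppm_to_pct(50))    # a 50 ppm error is 0.005 %
```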
Secondary Pressure Standard
Secondary Pressure Standards are pressure calibration instruments which have to be checked by primary pressure standards on a regular basis.
Threshold
Threshold is the amount of measurement change required before a measuring instrument reacts with a change in measurement output or produces a specified result.
TEB – Total Error Band
Total Error Band (TEB) is the difference between the most negative and most positive deviation from the true measurement, determined from the combination of all known errors for a sensing device, within the constraints of the measurement and operating temperature range. Typically in the case of a pressure measurement device for example, the total error […]
TSS – Thermal Span or Sensitivity Shift
Thermal Span or Sensitivity Shift (TSS) signifies the maximum amount by which the span may change at any measurement point within the compensated temperature range.
TZS – Thermal Zero Shift
Thermal Zero Shift (TZS) is the maximum amount the output or reading at the null measurement point might deviate over the compensated temperature range.
Thermal Hysteresis
Thermal Hysteresis is the measured change in output or reading at a specific measurement point taken during a sequence of increasing and decreasing temperature.
TSL – Terminal Straight Line
Terminal Straight Line (TSL) is a straight line drawn between the measurement output at zero and at full scale.
TEB – Temperature Error Band
Temperature Error Band (TEB) is the error derived from the most positive and negative deviation of all measurement points within a measurement range over the operating or compensated temperature range.
Temperature Error
Temperature Error is the deviation of a measurement reading caused by a change in media or environmental temperature.
Temperature Compensation
Temperature Compensation is a correction applied to a measurement instrument to reduce errors attributed to temperature changes in the process media being measured or in the surrounding environment in which the instrument is being used.
Repeatability
Repeatability is the amount of change in a measured reading at the same measurement point after a defined number of cycles over the measurement range or within a set of environmental limits.
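One common way to express repeatability is the worst-case spread of readings taken at the same point over several cycles, as a percentage of full scale. A minimal sketch with illustrative readings and an assumed 2000 psi range:

```python
# Repeatability as the worst-case spread of readings taken at the same
# measurement point over several pressure cycles (illustrative data).

FULL_SCALE = 2000.0  # psi, assumed range

# Readings at the same applied pressure, one per cycle
readings = [999.8, 1000.1, 1000.0, 999.9, 1000.2]

spread = max(readings) - min(readings)            # 0.4 psi
repeatability_pct_fs = spread / FULL_SCALE * 100  # 0.02 %FS

print(round(spread, 3))
print(round(repeatability_pct_fs, 3))
```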
RTE – Referred Temperature Error
Referred Temperature Error (RTE) is the maximum deviation, expressed as a +/- %FS, from measurements taken at a defined reference temperature.
Accuracy
Accuracy of a measurement instrument defines how much a measured value may deviate from the true value.
Precision
Precision is a measure of how closely a set of readings groups around a reference line that passes through the middle of all the points.
NL – Non-Linearity
Non-Linearity is the deviation error derived from the straightness of a set of recorded measurements when compared to a straight line, such as the BSL or TSL, drawn through all the results. The maximum non-linearity error is normally expressed as a percentage of full scale.
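The calculation can be sketched against a terminal straight line: draw the line between the zero and full-scale readings, take the worst deviation from it, and express that as a percentage of span. The calibration points below are illustrative.

```python
# Maximum non-linearity (%FS) relative to a terminal straight line drawn
# between the zero and full-scale readings (illustrative data).

pressures = [0, 500, 1000, 1500, 2000]            # psi, assumed test points
outputs   = [0.000, 1.260, 2.515, 3.760, 5.000]   # volts

span = outputs[-1] - outputs[0]

def tsl(p):
    """Terminal straight line: output predicted from the two end points."""
    return outputs[0] + (p - pressures[0]) / (pressures[-1] - pressures[0]) * span

max_dev = max(abs(v - tsl(p)) for p, v in zip(pressures, outputs))
nl_pct_fs = max_dev / span * 100

print(round(max_dev, 3))    # worst deviation from the TSL, in volts
print(round(nl_pct_fs, 2))  # the same deviation as a %FS figure
```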
Long Term Stability/Drift
Long Term Stability or Long Term Drift is the amount of change of a measured reading at exactly the same pressure and ambient conditions over a given period of time, typically quoted as an annual figure.
Pressure Hysteresis
Pressure Hysteresis is the difference between two separate measurements taken at the same pressure but one where the pressure was increasing and the other where the pressure was decreasing. The hysteresis is caused by the natural reluctance of a pressure sensing material such as a diaphragm to return to its original position, shape or form […]
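Numerically, pressure hysteresis is usually taken as the worst difference between rising and falling readings at the same test points. A minimal sketch with illustrative readings and an assumed 0-5 V span:

```python
# Pressure hysteresis from readings taken at the same test points, once
# with rising pressure and once with falling pressure (illustrative data).

points_psi = [500, 1000, 1500]
rising_v   = [1.248, 2.497, 3.749]   # readings while pressure increases
falling_v  = [1.252, 2.503, 3.753]   # readings at the same points, decreasing

SPAN_V = 5.0  # assumed 0-5 V full-scale output

hysteresis_v = max(abs(up - down) for up, down in zip(rising_v, falling_v))
hysteresis_pct_fs = hysteresis_v / SPAN_V * 100

print(round(hysteresis_v, 3))       # worst rising/falling difference, in volts
print(round(hysteresis_pct_fs, 2))  # the same difference as a %FS figure
```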
Digital Compensation
Digital compensation is the process of correcting measurement signals using look-up tables or mathematical formulae.
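A simple form of look-up-table compensation interpolates linearly between stored calibration points. The table values below are hypothetical, mapping a raw 0-5 V reading onto a corrected 0-2000 psi output.

```python
# Digital compensation via a look-up table with linear interpolation
# between calibration points (hypothetical correction values).

from bisect import bisect_right

# Raw sensor reading (V) -> corrected pressure (psi), from a calibration run
TABLE = [(0.00, 0.0), (1.26, 500.0), (2.51, 1000.0), (3.76, 1500.0), (5.00, 2000.0)]

def compensate(raw_v):
    """Linearly interpolate the correction table at a raw reading."""
    xs = [x for x, _ in TABLE]
    i = bisect_right(xs, raw_v) - 1
    i = max(0, min(i, len(TABLE) - 2))          # clamp to the table range
    (x0, y0), (x1, y1) = TABLE[i], TABLE[i + 1]
    return y0 + (raw_v - x0) / (x1 - x0) * (y1 - y0)

print(round(compensate(2.51), 1))    # a stored point maps straight through
print(round(compensate(1.885), 1))   # a reading between points is interpolated
```

Real devices often use denser tables, polynomial corrections, or two-dimensional tables indexed by both pressure and temperature.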
BSL – Best Straight Line
Best Straight Line (BSL) is the mathematically derived straight line which runs through the middle of a set of readings in such a way as to achieve the smallest error across all the results. The measurement precision of a device is often specified in relation to the maximum deviation from the best straight line.
Compensated Temperature Range
The compensated temperature range defines the limits of operation for a specified measurement accuracy. e.g. A pressure sensor has an accuracy of 0.25% full scale over a compensated temperature range of -20 to +80 degC. Since temperature errors are often significant for many measurement devices, a manufacturer will incorporate digital or analogue temperature compensation. If the measurement device is […]
Determining calibration error of Bourdon tube pressure gauge
How do you calculate maximum gauge error in pressure measurement with a Bourdon gauge instrument?
Shunt resistor calibration explanation
What is the method of shunt calibration in relation to an indicator connected to a pressure transducer?
Measurement Accuracy
There are many contributing error factors which go into a total uncertainty calculation. The way accuracy is defined for pressure instruments on technical data sheets can vary significantly across manufacturers and product types.
Pressure Calibration
This guide will answer many of your questions about using & selecting pressure calibration equipment.
What affects the performance of low pressure sensors
By their nature low range pressure sensors are very sensitive instruments, and there are a few factors that need to be considered prior to and during installation.
How does the accuracy of pressure measurement devices change over time
Not even the most accurate pressure instruments will hold their accuracy indefinitely; all are prone to drift over time.
Pressure Sensor Accuracy Specifications
Understand what parameters are included in pressure sensor accuracy specifications and what techniques were used, so that a true comparison can be made.
Choosing calibrator for pressure transmitters
A guide on what to consider when choosing pressure calibration equipment for calibrating pressure transmitters.