One of the most common measurements in the process industry is temperature.
The temperature sensor is the first component in every temperature measurement loop, and it is critical to the overall accuracy of that loop.
The temperature sensor, like any other measurement instrument, must be calibrated regularly if it is to remain accurate. Why would you measure temperature if you aren’t concerned with accuracy?
What is a temperature sensor?
Temperature sensors come in a variety of shapes and sizes, with varying output signals. Some have a resistance output, others a voltage signal, still others a digital signal, and so on.
In practice, the signal from a temperature sensor is typically connected to a temperature transmitter, which converts the signal into a format that is easier to transfer over longer distances to the control system (DCS, SCADA). For decades, the standard 4 to 20 mA signal has been used, because a current signal can be transmitted over longer distances and does not change even though there is some resistance in the wiring.
Nowadays, transmitters with digital or even wireless output signals are also used.
Measuring the temperature sensor output
Because most temperature sensors produce an electrical output, that output must be measured in some way. To measure that output (resistance or voltage, for example), you will need a measurement device.
The measurement device frequently shows an electrical quantity (resistance, voltage) rather than temperature. As a result, you must understand how to convert that electrical signal into a temperature value.
Most standard temperature sensors comply with international standards that outline how to calculate the electrical/temperature conversion using a table or a formula. If you have a non-standard sensor, you may need to contact the sensor manufacturer.
There are measuring devices that can directly display the temperature sensor signal as temperature. These devices also measure the electrical signal (resistance, voltage) and have sensor tables (or polynomials/formulas) built in to convert it to temperature. Temperature calibrators, for example, typically support the most commonly used RTD (resistance temperature detector) and thermocouple (T/C) sensors in the process industry.
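To make the conversion concrete, here is a minimal Python sketch (not any particular device’s built-in implementation) that converts a Pt100 resistance reading to temperature using the Callendar-Van Dusen equation with the standard IEC 60751 coefficients. It only covers the range from 0 °C upwards, and the example reading of 138.51 ohms is purely for illustration.

```python
import math

# Standard IEC 60751 Callendar-Van Dusen coefficients for a Pt100 sensor
R0 = 100.0      # nominal resistance at 0 degC, in ohms
A = 3.9083e-3
B = -5.775e-7

def pt100_resistance_to_temperature(r_ohm):
    """Convert a Pt100 resistance reading (ohms) to temperature (degC).

    Valid from 0 degC upwards; below 0 degC the Callendar-Van Dusen
    equation has an additional term and is usually solved numerically.
    """
    # Solve R = R0 * (1 + A*t + B*t**2) for t and take the physically
    # meaningful root of the quadratic.
    return (-A + math.sqrt(A * A - 4.0 * B * (1.0 - r_ohm / R0))) / (2.0 * B)

# Example: a reading of about 138.51 ohms corresponds to roughly 100 degC
print(pt100_resistance_to_temperature(138.51))
```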
So how to calibrate a temperature sensor?
Before we get into the specifics of calibrating a temperature sensor, let’s first look at the general concept.
To begin, because the temperature sensor measures temperature, you will need a known temperature to calibrate it in. It is not possible to “simulate” temperature; instead, a real temperature must be created using a temperature source.
You can either generate an accurate temperature or measure the generated temperature with a calibrated reference temperature sensor. For example, you could place the reference sensor and the sensor to be calibrated in a liquid bath (preferably one that is stirred) and perform calibration at that temperature. A so-called dry-block temperature source can also be used.
A stirred ice bath, for example, provides fairly good accuracy for the 0 °C (32 °F) point calibration.
Temperature baths or dry-blocks are commonly used for industrial and professional calibration. These can be programmed to heat or cool to a specific temperature.
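To illustrate the comparison idea described above, the following simple Python sketch computes the calibration error at each point as the difference between the reading of the sensor under calibration and the reading of the reference sensor. All the readings below are made-up example values.

```python
# Comparison calibration: the reference sensor gives the "true" temperature
# of the bath or dry-block, and the error of the sensor under calibration
# is its reading minus the reference reading at each calibration point.
# All readings below are made-up example values.
calibration_points = [
    # (reference reading in degC, reading of sensor under calibration in degC)
    (0.02, 0.11),
    (50.01, 50.18),
    (99.97, 100.21),
]

for ref_reading, dut_reading in calibration_points:
    error = dut_reading - ref_reading
    print(f"Reference {ref_reading:7.2f} degC | "
          f"Sensor {dut_reading:7.2f} degC | error {error:+.2f} degC")
```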
It is common practice in some industrial applications to replace temperature sensors at regular intervals rather than calibrate the sensors on a regular basis.
How to calibrate temperature sensors – things to consider
Let’s get started with the actual calibration of temperature sensors and the various factors to consider.
1 - Handling the temperature sensor
Different sensors have different mechanical structures and varying degrees of mechanical robustness.
The most precise SPRT (standard platinum resistance thermometer) sensors, which are used as reference sensors in temperature laboratories, are extremely delicate. According to our temperature calibration laboratory personnel, if an SPRT makes a sound when it touches something, the sensor must be checked before further use.
Fortunately, most industrial temperature sensors are tough and will withstand normal handling. Some industrial sensors are built to be very robust and can withstand a lot of abuse.
However, if you are unsure about the structure of the sensor to calibrate, it is better to be safe than sorry.
It’s never a bad idea to treat any sensor as if it were an SPRT.
Aside from mechanical shocks, a rapid change in temperature can also be a shock to the sensor and may damage it or impair its accuracy.
RTD probes are typically more sensitive to such shocks than thermocouples.
2 - Preparations
There are usually not many preparations, but there are a few things to consider. First, perform a visual inspection to ensure that the sensor has not been bent or damaged and that the wires are in good condition.
External contamination can be a problem, so it’s important to know where the sensor was used and what kind of media it was measuring. You may need to clean the sensor before calibrating it, especially if you intend to calibrate it in a liquid bath.
Prior to calibration, the insulation resistance of an RTD sensor can be measured. This ensures that the sensor is not damaged and that the insulation between the sensor and the chassis is adequate. A decrease in insulation resistance can cause measurement errors and is a sign of sensor damage.
3 - Temperature source
As previously stated, a temperature source is required to calibrate a temperature sensor. Temperature simulation is simply not possible.
A temperature dry-block is the most commonly used industrial tool. It is convenient and portable, and it is usually accurate enough.
A liquid bath can be used for greater precision. It is not typically portable, but it is well suited to a laboratory setting.
A stirred ice bath is frequently used to achieve zero degrees Celsius. It is simple and inexpensive, yet it provides good accuracy for the zero point.
Fixed-point cells are used for the most precise temperatures. These are extremely accurate, but also extremely expensive. These are typically found in precise (and accredited) temperature calibration laboratories.
4 - Reference temperature sensor
Some of the temperature sources mentioned in the previous section are used to generate the temperature, and you obviously need to know that temperature accurately. In dry-blocks and liquid baths, the temperature is measured by an internal reference sensor. For more accurate results, however, you should use a separate, accurate reference temperature sensor inserted into the same temperature source as the sensor(s) to be calibrated. This kind of reference sensor measures the same temperature that the sensor to be calibrated is exposed to, and does so more accurately.
The reference sensor should, of course, have a valid traceable calibration. It is simpler to send a reference sensor for calibration than the entire temperature source (though keep in mind the temperature gradients of the block if you only ever have the reference sensor calibrated and not the block itself).
In terms of thermodynamic properties, the reference sensor should be as similar as possible to the sensor to be calibrated in order for them to behave similarly during temperature changes.
The reference sensor and the sensor to be calibrated should be immersed in the temperature source at the same depth. Typically, all sensors are immersed to the bottom of a dry-block. Very short sensors are more difficult, because they can only be immersed to a limited depth in the temperature source, and you must ensure that your reference sensor is immersed equally deep. In some cases, a dedicated short reference sensor is required.
You don’t need a reference sensor when using fixed-point cells because the temperature is based on physical phenomena and is very accurate by nature.
5 - Measuring the temperature sensor output signal
The electrical output of most temperature sensors (resistance or voltage) must be measured and converted to temperature. As a result, you’ll need some kind of measuring device. Some temperature sources include measurement channels for both the device under test (DUT) and the reference sensor.
When you measure the electrical output, you must convert it to temperature according to the international standards. In most industrial settings, you will use a measurement device that can convert the signal for you, so that you can read it directly in a temperature unit (Celsius or Fahrenheit).
Whatever method you use for measurement, make sure you understand the device’s accuracy and uncertainty and that it has valid traceable calibration.
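As a simplified illustration of how accuracy and uncertainty come into play, the Python sketch below combines a few assumed uncertainty contributions (reference sensor calibration, temperature source stability, readout resolution) by root-sum-square and checks a measured sensor error against an assumed tolerance limit. The values and the simple pass/fail rule are placeholders, not a recommendation for any particular procedure.

```python
import math

# Hypothetical uncertainty contributions at one calibration point, in degC.
# A real uncertainty budget contains more terms and follows the lab's procedure.
contributions = {
    "reference sensor calibration": 0.05,
    "temperature source stability": 0.03,
    "readout resolution": 0.01,
}

# Simple root-sum-square combination of the contributions
combined_uncertainty = math.sqrt(sum(u ** 2 for u in contributions.values()))

measured_error = 0.12   # error of the sensor under calibration, degC (example)
tolerance = 0.30        # assumed acceptance limit for this sensor, degC

print(f"Combined uncertainty: {combined_uncertainty:.3f} degC")
# Rule of thumb used here: the error plus the uncertainty must stay within tolerance
print("Pass" if abs(measured_error) + combined_uncertainty <= tolerance else "Fail")
```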