Basics of Infrared Sensors

This section introduces the basics of infrared sensors, how to select them, and key points in using them.

What Are Infrared Sensors?

Digital infrared temperature sensor FT Series

Your cheek feels warm when you move your palm close to it because the cheek skin senses the infrared rays being emitted from your palm.
All objects emit infrared rays, and the higher the temperature of an object, the stronger the infrared rays it emits.
Radiation thermometers use these infrared rays to measure temperature.

What Are Infrared Rays?

Infrared rays, also referred to as IR light, are a type of light similar to visible light.
However, people cannot see them with the naked eye because their wavelength is longer (and their frequency lower) than that of visible light.
Infrared wavelengths range from approximately 0.7 to 400 μm.
Infrared rays were discovered by British astronomer Sir Frederick William Herschel in 1800.

Features of Radiation Thermometers

The use of infrared temperature sensors offers the following two advantages.

  • The temperature can be measured at high speed
  • The temperature can be measured in a non-contact manner

Infrared temperature sensors are effective for measuring the temperature of moving/rotating objects and objects whose surface temperature changes when coming into contact with a sensor (objects with small heat capacity).
On the other hand, they have some disadvantages, such as the inability to measure the temperature of gases or the inside of an object. It is also necessary to set the emissivity of infrared temperature sensors according to the target object.

Principle of Radiation Thermometers

Infrared rays radiated from an object are collected by a lens into a sensing element called a “thermopile.”
A thermopile is a sensing element that generates electrical signals according to the temperature after absorbing infrared rays emitted from an object and being warmed by the absorption.
These signals are amplified, and the emissivity is corrected to display the temperature.

  1. Thermocouple cold junction
  2. Thermocouple hot junction
  3. Infrared absorbing film (right: view from above)

As shown above, an infrared temperature sensor is configured with many thermocouples connected in series.
The hot junctions of the thermocouples are collected in the center, and the cold junctions are collected on the periphery.
Because the infrared rays collected by the lens only hit the hot junctions, only the hot junctions are heated.
The Seebeck effect generates a voltage difference between hot and cold junctions, thereby enabling temperature measurement.
(An infrared temperature sensor has a built-in thermistor to measure the temperature of cold junctions.)
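The principle above can be sketched numerically. The following is a minimal Python model, assuming the detector output is proportional to the net radiant exchange between the object and the cold junctions under the Stefan-Boltzmann law for total radiation; real sensors are band-limited and factory-calibrated, and the function and variable names here are illustrative, not any vendor's API.

```python
# Hypothetical sketch: recover an object's temperature from a thermopile
# signal plus the cold-junction temperature measured by the built-in
# thermistor. Assumes a total-radiation (Stefan-Boltzmann) response, which
# real band-limited sensors only approximate.

SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def object_temperature(signal_w_m2, cold_junction_k, emissivity):
    """Invert signal = emissivity * SIGMA * (T_obj^4 - T_cj^4)."""
    t4 = signal_w_m2 / (emissivity * SIGMA) + cold_junction_k ** 4
    return t4 ** 0.25

# Round trip: a 100 degC object seen from 25 degC cold junctions, emissivity 0.95
t_obj, t_cj, eps = 373.15, 298.15, 0.95
signal = eps * SIGMA * (t_obj ** 4 - t_cj ** 4)
print(round(object_temperature(signal, t_cj, eps), 2))  # -> 373.15
```

Note that when the signal is zero, the model returns the cold-junction temperature itself: with no net radiant exchange, the object is at the same temperature as the sensor.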

What is Emissivity?

The amount of infrared rays emitted from objects differs, even if they have the same temperature, depending on their materials and surface conditions.
When measuring the temperature using an infrared temperature sensor, you need to correct the ratio of this emission according to the target object.
This ratio is called “emissivity.”
The “emissivity” is a constant specific to each object. A perfect blackbody has an emissivity of “1,” while an object that completely reflects or transmits infrared rays (such as air), the opposite of a blackbody, has an emissivity of “0.”
This means that the emissivity of every object falls between 0 and 1.

Emissivity table (typical)

Object      Emissivity (typical)
Concrete    0.94
Sand        0.90
Clay        0.85~0.90
Brick       0.75~0.95
Plaster     0.80~0.90
Glass       0.75~0.95
Rubber      0.86~0.95
Wood        0.50~0.80
Paper       0.70~0.94
Plastic     0.60~0.85
Water       0.92~0.98
Skin        0.98

When light is incident on an object's surface, the energy is absorbed by the object, reflected by the surface, or transmitted through the object.
Assuming that the incident energy is “1,” the following formula is established.
1 = absorptivity + reflectance + transmittance

Also, according to Kirchhoff’s Law, the absorbed energy is equal to the energy to be radiated from the object. Hence the following formula is established.

Absorptivity = Emissivity
From the above formulas, you can see that the higher an object's absorptivity for incident energy (that is, the lower its reflectance and transmittance), the higher its emissivity.
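The two formulas combine into a one-line calculation. This Python sketch simply encodes the energy balance together with Kirchhoff's law; the function name and the sample reflectance values are illustrative.

```python
# Energy balance: 1 = absorptivity + reflectance + transmittance
# Kirchhoff's law: absorptivity = emissivity

def emissivity_from_balance(reflectance, transmittance):
    """Emissivity of a surface, given what it reflects and transmits."""
    absorptivity = 1.0 - reflectance - transmittance
    return absorptivity  # what is absorbed is also emitted

print(emissivity_from_balance(0.0, 0.0))   # blackbody -> 1.0
print(emissivity_from_balance(0.05, 0.0))  # opaque, slightly reflective -> 0.95
```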

What is a Blackbody?

When it comes to emissivity, you need to understand what a “blackbody” is.
A “blackbody” absorbs all the light incident on its surface, irrespective of wavelength, with no reflection or transmission. Hence, a blackbody is an ideal object for a radiation thermometer.
Since its reflectance and transmittance are both “0,” its absorptivity is “1,” and thus its emissivity is also “1.”

How to Determine the Emissivity

When the emissivity is known
If the emissivity of an object is listed as a physical constant in a reference book or similar material, use that value as it is.
When doing so, take into account the conditions under which the emissivity was measured (such as the surface condition of the object).

When the emissivity is not known
When the emissivity isn’t known, you can determine it by measuring the temperature of an actual object, using one of the following methods.

  • Method using a contact-type thermometer
    Measure the temperature of the same object with both a radiation thermometer and a contact-type thermometer, such as a thermocouple, and set the emissivity on the radiation thermometer so that the two displayed values match.
  • Method using blackbody spray (tape)
    This spray is used to obtain the emissivity of an object.

Step 1

Apply blackbody spray to part of the object.

Step 2

Using an infrared temperature sensor set to the emissivity of the blackbody spray, measure the temperature of the part coated with the spray.

Step 3

Measure the temperature of the part not coated with blackbody spray and set the emissivity so that the displayed value matches the value obtained in Step 2.

Step 4

Use the emissivity set in Step 3 as the emissivity of this object.
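The four steps above can be sketched in Python. This is a simplified total-radiation model that ignores reflected ambient radiation; the spray emissivity, readings, and function names are illustrative, and real instruments perform this adjustment internally through their emissivity setting.

```python
# Hypothetical sketch of the blackbody-spray procedure (Steps 1-4 above).
# Total-radiation model, reflected ambient radiation ignored.

SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def emissivity_from_reference(radiance_uncoated, t_reference_k):
    """Step 3: pick the emissivity that makes the uncoated spot's reading
    agree with the reference temperature obtained in Step 2."""
    return radiance_uncoated / (SIGMA * t_reference_k ** 4)

# Step 2: the coated spot (known high emissivity) reads 150 degC
t_ref = 423.15
# Simulated raw radiance from the uncoated spot of the same object (eps = 0.80)
radiance = 0.80 * SIGMA * t_ref ** 4
# Steps 3-4: the recovered value becomes this object's emissivity
print(round(emissivity_from_reference(radiance, t_ref), 2))  # -> 0.8
```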

How to Select Radiation Thermometers

Selection Based on the Usage Method

Infrared temperature sensors are generally classified into the following two types.

Hand-Held Type

An infrared temperature sensor in which the detector and the converter are integrated rather than separated. Thanks to its compact size and light weight, you can carry it and measure temperatures by hand.

Digital infrared temperature sensor FT Series

Installation Type

An infrared temperature sensor in which a detector and a converter are structured separately and are coupled electrically with a connection cable. The thermometer is fixed in place while the temperature is measured.

Selection Based on Object Size and Measuring Distance

An infrared temperature sensor has a specific, measurable surface area (called the spot diameter) and measuring distance.
To measure the temperature accurately, it is necessary to use the designated spot diameter and measuring distance.

The above figure shows the relationship between the spot diameter and measuring distance of a radiation thermometer.
When selecting an infrared temperature sensor, make sure that the spot diameter is smaller than the target object.
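As a quick selection check, many sensors specify the spot-versus-distance relationship as a distance-to-spot (D:S) ratio. The following Python sketch assumes such a linear D:S ratio; the 10:1 figure and function names are illustrative, so always consult the actual model's spot-diameter chart.

```python
# Hypothetical selection check: does the measuring spot fit the target?
# Assumes the spot grows linearly with distance via a D:S ratio (e.g. 10:1).

def spot_diameter_mm(distance_mm, d_to_s_ratio):
    """Spot diameter at a given measuring distance for a D:S-rated sensor."""
    return distance_mm / d_to_s_ratio

def spot_fits(distance_mm, d_to_s_ratio, target_diameter_mm):
    """True if the spot is smaller than the target object."""
    return spot_diameter_mm(distance_mm, d_to_s_ratio) < target_diameter_mm

print(spot_diameter_mm(500, 10))  # 500 mm away with a 10:1 ratio -> 50.0 mm spot
print(spot_fits(500, 10, 80))     # 80 mm target -> True
print(spot_fits(500, 10, 30))     # 30 mm target -> False (move closer)
```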

Key Points in Using Radiation Thermometers

Emissivity Setting

Measurement errors occur when the setting of the emissivity is different from the emissivity specific to the target object.
Since the relationship between the emissivity and the object temperature is not linear, temperatures measured with an incorrect setting cannot be corrected afterward.
(A 1% error in the emissivity setting does not simply produce a 1% error in the temperature.)

Relationship between Emissivity Setting Errors and Temperature Measurement Errors (Typical Examples)

Object temperature   Emissivity setting error (°C/°F)
                     1%              5%               10%
0°C (32°F)           0.5°C (0.9°F)   1.5°C (2.7°F)    2.5°C (4.5°F)
100°C (212°F)        0.6°C (1.08°F)  3.0°C (5.4°F)    6.0°C (10.8°F)
200°C (392°F)        1.5°C (2.7°F)   6.5°C (11.7°F)   12.0°C (21.6°F)
300°C (572°F)        2.0°C (3.6°F)   9.5°C (17.1°F)   18.0°C (32.4°F)
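To see where the non-linearity comes from, here is a hedged Python sketch using a simple total-radiation (Stefan-Boltzmann) model: if the emissivity is set too low, the instrument infers a higher temperature from the same radiance. Real sensors respond in a limited wavelength band, so these numbers will not reproduce the typical values tabulated above; the function name and emissivity values are illustrative.

```python
# Hypothetical model of emissivity mis-setting. The displayed temperature
# satisfies set_eps * T_shown^4 = true_eps * T_true^4 (total radiation).

def indicated_temperature_k(true_temp_k, true_eps, set_eps):
    """Temperature the instrument would display with a mis-set emissivity."""
    return true_temp_k * (true_eps / set_eps) ** 0.25

true_t = 373.15  # a 100 degC object
for err in (0.01, 0.05, 0.10):  # emissivity set too low by 1%, 5%, 10%
    shown = indicated_temperature_k(true_t, 0.95, 0.95 * (1 - err))
    print(f"{err:.0%} setting error -> {shown - true_t:+.2f} degC displayed error")
```

The printed errors grow faster than the setting error itself, which is why a mis-set measurement cannot simply be rescaled afterward.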

Spot Diameter and Target Object

To stably measure the temperature of an object, make sure that approximately 1.5 times the spot diameter fits inside the object.

If the spot is larger than the target, error is generated because the area outside the object is also detected: the sensor reads an average of the object and the background.
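This averaging effect can be illustrated with a short Python sketch. It assumes the reading is a radiance-weighted mix of object and background over the spot area, using a total-radiation model; the fill fraction and temperatures are illustrative.

```python
# Hypothetical sketch: an oversized spot sees a radiance-weighted mix of
# the object and the background (total-radiation model, for illustration).

SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def mixed_reading_k(t_object_k, t_background_k, object_fill_fraction):
    """Apparent temperature when the object fills only part of the spot."""
    radiance = (object_fill_fraction * SIGMA * t_object_k ** 4
                + (1 - object_fill_fraction) * SIGMA * t_background_k ** 4)
    return (radiance / SIGMA) ** 0.25

# A 150 degC object filling only 60% of the spot against a 25 degC background:
# the reading lands well below 150 degC.
print(round(mixed_reading_k(423.15, 298.15, 0.6) - 273.15, 1))
```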

During High-Temperature Measurement

When measuring a hot object, the infrared rays emitted from the object heat the radiation thermometer body. This not only prevents accurate temperature display but, in the worst case, may damage the thermometer. In such cases, shield the infrared rays unnecessary for measurement as shown below.

  1. Radiation thermometer
  2. Spot diameter
  3. Shield plate (aluminum, etc.)
  4. Target object (hot)

Wiring to an Instrument (Recorder)

Measurement of 4 to 20 mA Output

The Measurement Method Using an Instrument Equipped with a 4 to 20 mA Input

Satisfy the relationship of “maximum load resistance of the 4 to 20 mA output > load resistance of the 4 to 20 mA input.”
If this relationship is not satisfied, a measurement error will occur.

Measurement Method Using a Shunt Resistor to Convert Current to Voltage

The current that flows through a shunt resistor is converted to voltage through Ohm’s Law (E = I × R).
The converted voltage can be measured with an instrument that has a voltage input range.

Satisfy the relationship of “maximum load resistance of the 4 to 20 mA output > resistance value of the shunt resistor.”
If this relationship is not satisfied, a measurement error will occur.
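The shunt-resistor conversion can be written out directly. This Python sketch applies Ohm's law and then maps the 4 to 20 mA span back to a temperature; the 0 to 500°C output range and the 250 Ω shunt value are illustrative assumptions, not values from any particular sensor.

```python
# Hypothetical sketch: read a 4-20 mA output through a shunt resistor
# (Ohm's law, E = I x R) and scale the current back to a temperature.

def shunt_voltage(current_a, shunt_ohms):
    """Ohm's law: voltage developed across the shunt resistor."""
    return current_a * shunt_ohms

def current_to_temperature(current_a, t_min_c=0.0, t_max_c=500.0):
    """Map 4 mA -> t_min_c and 20 mA -> t_max_c linearly (assumed range)."""
    return t_min_c + (current_a - 0.004) / 0.016 * (t_max_c - t_min_c)

print(shunt_voltage(0.012, 250))      # 12 mA across a 250-ohm shunt -> 3.0 V
print(current_to_temperature(0.012))  # mid-scale current -> 250.0 degC
```

Note that the shunt value itself must still satisfy the load-resistance condition above; a larger shunt gives a larger voltage but may exceed the output's maximum load resistance.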

Method Using a Signal Converter

By using a signal converter, you can measure 4 to 20 mA output with an instrument that has a voltage input range.

Is It Possible to Wire a 4 to 20 mA Output in Parallel?

Yes, it is possible.

The Measurement Method Using Voltage Input

When the 4 to 20 mA output device is already connected to another device's 4 to 20 mA input, an instrument with a voltage input range can measure directly.
It measures the voltage developed across the load resistance of the other 4 to 20 mA input device as the current flows through it.

Method Using an Instrument Equipped with a 4 to 20 mA Input

Simultaneous measurement is possible through wiring in series.

It is necessary to satisfy the relationship of “maximum load resistance of the 4 to 20 mA output > total load resistance of the two 4 to 20 mA inputs.” Also, note that a potential difference occurs between the − terminals of the two inputs because their load resistances are connected in series. Confirm that this potential difference causes no problems in the circuit.

Measurement of Analog Voltage Output

Measurement is possible through direct connection.
Adjust the input range according to the output voltage.

How Does an Infrared Temperature Sensor Work?

In simplest terms, infrared temperature sensors work by detecting and measuring the infrared radiation emitted by an object. All objects whose temperature is above absolute zero (0 K, or −273.15°C) radiate infrared energy as a function of their temperature, a principle known as blackbody radiation. The amount of infrared radiation emitted increases with the object's temperature.
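For a sense of scale, the Stefan-Boltzmann law gives the total radiant exitance of an ideal blackbody as M = σT⁴. The short Python sketch below (function name illustrative) shows how steeply emitted power rises with temperature.

```python
# Total power radiated per unit area by an ideal blackbody: M = sigma * T^4.

SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def radiant_exitance(temp_k):
    """Total radiant exitance of an ideal blackbody at temp_k."""
    return SIGMA * temp_k ** 4

# Going from 20 degC to 100 degC more than doubles the emitted power.
print(round(radiant_exitance(293.15)))  # W/m^2 at 20 degC
print(round(radiant_exitance(373.15)))  # W/m^2 at 100 degC
```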

An infrared sensor for temperature uses an optical system to collect infrared radiation from the object whose temperature is being measured and focuses it onto a detector circuit. This system often includes lenses and mirrors, and frequently filters that restrict the radiation to the wavelength band the detector responds to.

The lenses and mirrors focus the infrared energy onto an infrared detector, usually a thermopile, which absorbs the radiation. The absorbed heat then triggers the Seebeck effect within the thermopile, generating a voltage across the thermopile junctions. If the sensor instead uses a pyroelectric detector, the temperature change alters the detector's polarization, generating an electrical charge.

The generated voltage is very small and must be amplified by the electronic circuit within the sensor. In most cases, the sensor also converts the analog signal into a digital one before sending it to the measuring unit.

What Instrument Is Used to Measure Infrared Radiation?

Infrared temperature sensors are input transducers: they capture a physical property of an object and convert it into an electrical signal. In most cases, the signal is amplified and converted from analog to digital within the sensor itself.

Once conditioned, the signal is relayed to a measuring unit, which receives the analog/digital signal from the sensor and interprets it as a temperature reading. The interpretation mostly depends on the type of the measuring unit and the sensor. Some units are designed to be used with a single sensor type, while others might work with voltage signals, resistance, or even current.

Once the signal is interpreted as a temperature reading, it’s sent to the computing unit for further processing and storage, which enables further analysis if necessary. Furthermore, these devices can be programmed to trigger certain operations and processes based on their reading, which makes them particularly useful for in-line operations in manufacturing facilities.

How Accurate Are Infrared Sensors for Temperature?

The accuracy of infrared sensors depends on several factors as well as on the quality of the sensor itself. These factors include the emissivity of the object being measured, the distance-to-target ratio, atmospheric conditions, ambient temperature, calibration, and wavelength. In most common applications, infrared sensor accuracy ranges from ±0.3°C to ±5°C.

However, industrial applications often demand higher accuracy, and high-quality, industrial-grade sensors that have been adequately calibrated can achieve significantly higher accuracies of ±0.1°C.

In conclusion, it is important to select the appropriate sensor for your application, especially in terms of temperature range. With regular maintenance of your measuring equipment and consistent re-calibration (infrared temperature sensors can drift over time), these sensors can achieve accuracies that rival those of resistance temperature detectors.