How does a sharp infrared image work?
Thermal imaging cameras help to make production processes quicker and safer, and to improve the quality of the end products. But how does the camera capture the image and how many pixels does it need for this process?
A lens focuses the infrared radiation of an object onto a sensor, which generates an electric signal proportional to the radiation. The signal is amplified and, through subsequent digital signal processing, converted into an output value which corresponds to the object temperature. The measured value can be shown on a display screen or output as an analogue signal. The core component of a thermal imaging camera, the image sensor, is a 150-nanometer-thick focal plane array (FPA) which can have anywhere between 20,000 and one million pixels. The pixels themselves consist of microbolometers ranging in size from 17 × 17 to 35 × 35 µm², whose resistance changes when they absorb thermal radiation. The change in resistance alters the signal voltage that drops across the bolometer's resistor, and this voltage is subsequently analyzed.
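The read-out principle described above can be sketched in a few lines. This is a minimal illustration only: the resistance, bias voltage, and temperature coefficient below are assumed typical values for a vanadium-oxide microbolometer, not figures for any specific camera.

```python
# Minimal sketch of a microbolometer read-out: absorbed radiation warms
# the bolometer, its resistance changes, and so does the voltage drop in
# a simple bias/load divider circuit. All component values are assumed.

def bolometer_resistance(r0, tcr, delta_t):
    """Resistance after a temperature rise delta_t (kelvin).
    tcr: temperature coefficient of resistance; roughly -0.02/K is a
    typical value for vanadium-oxide bolometers (assumption)."""
    return r0 * (1.0 + tcr * delta_t)

def divider_voltage(v_bias, r_bolo, r_load):
    """Voltage drop across the bolometer in a bias/load divider."""
    return v_bias * r_bolo / (r_bolo + r_load)

r0 = 100e3            # nominal resistance in ohms (assumed)
tcr = -0.02           # per kelvin (assumed typical VOx value)
v_bias, r_load = 2.5, 100e3

v_cold = divider_voltage(v_bias, bolometer_resistance(r0, tcr, 0.0), r_load)
v_warm = divider_voltage(v_bias, bolometer_resistance(r0, tcr, 0.05), r_load)
print(f"signal change: {1e6 * (v_warm - v_cold):.1f} uV")
```

Even a 0.05 K temperature rise only shifts the divider output by a few hundred microvolts, which is why the amplification stage mentioned above is needed.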
In principle, more pixels means more detail. But the laws of physics apply to thermal imaging cameras too, and sensors with a high pixel count suffer from negative side effects. As with digital cameras for photography, ever more pixels are being packed onto the same small sensor surface, so each individual pixel has less and less area with which to capture thermal radiation. The resulting weak signals need to be amplified, which in turn amplifies the noise they contain, giving rise to disruptive pixels and inaccuracies in the temperature measurement. This is counteracted by software-based noise reduction which retouches the captured image; as a result, fine image structures are smoothed away along with the noise. Some higher-resolution infrared cameras try to improve the richness of detail either by interpolation or by overlaying several images produced by mechanical movements of the chip in the sub-pixel range.
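The shrinking-pixel effect can be made concrete with a back-of-the-envelope calculation. Under the simplifying assumption that the collected signal scales with pixel area, moving from the 35 µm to the 17 µm pitch mentioned above costs roughly three quarters of the signal per pixel:

```python
# Back-of-the-envelope sketch (simplified assumption: collected signal
# scales with pixel area, read noise stays roughly constant). Shrinking
# the pitch therefore directly reduces the signal-to-noise ratio.

def relative_signal(pitch_um, ref_pitch_um=35.0):
    """Collected radiation relative to a 35 um pixel (area ratio)."""
    return (pitch_um / ref_pitch_um) ** 2

for pitch in (35.0, 25.0, 17.0):
    print(f"{pitch:>4} um pixel: {relative_signal(pitch):.0%} of the signal")
```

A 17 µm pixel collects only about 24% of what a 35 µm pixel does, so the remaining signal must be amplified about four times as much, along with its noise.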
The Overworked Pixel
As well as the noise, a second problem occurs. Comparable to a glass of water, ever smaller individual pixels can only absorb a certain amount of thermal radiation before they “overflow”. If the exposure is set so that one image region is mapped exactly during this “blooming”, details in other regions can no longer be identified.
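The glass-of-water analogy can be sketched as a simple clipping operation. This is an illustrative model only; the radiation levels and well capacity below are arbitrary.

```python
# Sketch of pixel saturation ("blooming"): once a pixel's well is full,
# any additional radiation is clipped, and the detail it carried is lost.

def read_pixel(absorbed, well_capacity=1.0):
    """Clamp the absorbed radiation at the pixel's well capacity."""
    return min(absorbed, well_capacity)

scene = [0.2, 0.8, 1.5, 3.0]   # illustrative radiation levels
image = [read_pixel(x) for x in scene]
print(image)  # the two hottest points become indistinguishable
```

The two hottest points in the scene both read as 1.0 after clipping, so any temperature difference between them is irrecoverable.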
But the oft-neglected quality of the camera lens also plays a decisive role. What use is a sensor with the maximum number of pixels if the lens cannot relay the infrared energy radiated from the measurement object to the image sensor as loss-free as possible? If the smallest spot the lens can resolve is bigger than an individual pixel of the FPA, more than one pixel at a time is exposed, and the result is obvious blurring. Only when the interaction between lens and sensor is taken into account can the actual resolution be determined.
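The lens-sensor mismatch can be estimated with a simple ratio. The pixel pitch and blur-spot figures below are assumed for illustration, not measured values from a real lens.

```python
# Sketch: if the smallest spot the lens can resolve is wider than the
# sensor's pixel pitch, each object point is smeared over several pixels,
# and the effective resolution is set by the lens, not the pixel count.

import math

def effective_pixels(sensor_pixels, pitch_um, lens_spot_um):
    """Pixels that actually carry independent detail along one axis."""
    smear = max(1.0, lens_spot_um / pitch_um)
    return math.floor(sensor_pixels / smear)

# Assumed values: 640-pixel row, 17 um pitch, 34 um lens blur spot
print(effective_pixels(640, 17.0, 34.0))  # → 320, half the nominal count
```

With a blur spot twice the pixel pitch, only half the nominal pixels per row carry independent detail, which is the "actual resolution" referred to above.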
Every pixel needs time and storage capacity
The higher the resolution of a thermal imaging camera, the more unpleasant side effects arise alongside the impacts on image quality. The flood of data produced during recording needs to be processed before it can be saved. Interfaces with limited live data-transfer rates are the first hurdle: the transfer takes a certain amount of time and slows the sampling rate of the video function. The large amount of space that thermal images occupy on the computer and the connected storage media should also be considered.
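How quickly the data flood grows can be estimated from resolution, frame rate, and bit depth. The 16-bit raw format and 32 Hz frame rate below are assumptions for illustration:

```python
# Rough data-rate estimate (assumed: 16-bit raw values per pixel, 32 Hz).
# Higher resolutions quickly approach interface bandwidth limits, which
# is what slows the video sampling rate mentioned in the text.

def data_rate_mb_s(width, height, fps, bits_per_px=16):
    """Uncompressed live data rate in megabytes per second."""
    return width * height * fps * bits_per_px / 8 / 1e6

for w, h in ((80, 80), (640, 480), (1024, 768)):
    print(f"{w}x{h} @ 32 Hz: {data_rate_mb_s(w, h, 32):.1f} MB/s")
```

An 80 × 80 sensor streams well under 1 MB/s, while a megapixel-class sensor needs roughly 50 MB/s uncompressed, before any storage overhead.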
Correct usage is a similarly important topic. Thermal imaging cameras, just like normal digital cameras, have a field of view (FOV), which can cover angles from 6° for a telephoto lens, through 26° for a standard lens, up to 90° for a wide-angle lens. The further you get from the object, the larger the captured image region, and with it the area of the object that each individual pixel covers, which reduces the level of detail.
The optical resolution of the measuring device must be selected depending on the size of the measurement object and the distance between it and the sensor. In the chart on the left, because the measuring spot is too large, the thermal radiation of the considerably cooler circuit board has been included which results in a significantly distorted temperature measurement. For this reason the measuring spot of the camera must not be bigger than the size of the measurement object.
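The measurement-spot size per pixel can be estimated from the FOV, the pixel count, and the distance. The sketch below uses a small-angle approximation; the 26° standard lens and 640-pixel row are taken from the figures above, while the 1 m distance is an assumed example:

```python
# Sketch of the per-pixel measurement spot: the object area one pixel
# sees grows with distance, so the spot must stay smaller than the
# measurement object for an accurate reading.

import math

def spot_size_mm(distance_mm, fov_deg, pixels):
    """Side length of the object area imaged onto one pixel
    (small-angle approximation)."""
    ifov_rad = math.radians(fov_deg) / pixels
    return distance_mm * math.tan(ifov_rad)

# Assumed example: 26 deg standard lens, 640-pixel row, 1 m distance
print(f"spot per pixel: {spot_size_mm(1000.0, 26.0, 640):.2f} mm")
```

At 1 m a single pixel already covers roughly 0.7 mm of the object, and the spot doubles with every doubling of the distance; anything smaller than the spot blends with its background.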
Because of this, for very small measurement objects or for large distances between the thermal imaging camera and the measurement object, high resolutions are vital. In a trial conducted by Optris, two different resolutions were used to measure the temperature of a wire at an identical distance and with identical environmental conditions. While a hotspot of 70.4 °C was accurately detected at 640 × 480 pixels, the measurement at a resolution of 80 × 80 pixels gave a reading of only half that amount.
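The underestimate in the wire trial can be understood as area averaging: a spot that is too large mixes the wire's radiation with that of the cooler background. The sketch below simplifies the physics to a linear temperature average (real mixing happens in radiance), and the 30% fill fraction and 20 °C background are assumptions chosen to illustrate the effect:

```python
# Simplified sketch of spot averaging: when the measurement spot covers
# more than the wire, the reading is an area-weighted mix of wire and
# background. Linear temperature mixing is an illustrative simplification;
# a real camera averages radiance, not temperature.

def measured_temp(t_obj, t_bg, fill_fraction):
    """Area-weighted average of object and background temperatures."""
    return fill_fraction * t_obj + (1.0 - fill_fraction) * t_bg

# Assumed: wire fills 30% of the spot, background at 20 C
print(f"reading: {measured_temp(70.4, 20.0, 0.3):.1f} C")
```

With the wire filling only 30% of an oversized spot, the reading lands near 35 °C, about half the true 70.4 °C hotspot, which matches the order of magnitude of the error in the trial.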