Sensor Technology: Cooled vs. Uncooled Systems

The performance of a thermal imaging system depends on how the sensor translates infrared energy into an electrical signal. Thermal cameras rely on two broad classes of detection mechanism: thermal detectors, which change a physical property (such as electrical resistance) when heated, and quantum detectors, which interact directly with individual photons.

Choosing the right technology is a balance between the required thermal sensitivity, the speed of the event being captured, and the operational environment of the camera. By examining the distinct architectures of microbolometers, cooled MWIR systems, and quantum detectors, we can understand how these engineering choices dictate a camera’s effectiveness in both high-speed research and long-term industrial monitoring.

Microbolometers

A microbolometer is a specific type of uncooled thermal sensor that functions as a tiny, grid-based thermometer. Each pixel in the array is a “bridge” of vanadium oxide (VOx) or amorphous silicon (a-Si) suspended over a silicon substrate. These bridges heat up when struck by infrared radiation, causing a measurable change in their electrical resistance. This change is then processed by the camera’s electronics to create a digital image. Because these sensors operate at ambient temperatures, they do not require bulky cooling hardware, making them the standard for portable industrial thermal imaging tools and permanent “set-it-and-forget-it” installations.

The primary physical constraint of a microbolometer is its thermal time constant. Because the sensor relies on the physical heating and cooling of the pixel material, there is a slight delay in its response time. For stationary objects or slow-moving processes, this is negligible. However, if the target is moving rapidly, the pixels cannot reset their temperature fast enough, leading to “smearing” or motion blur. This makes microbolometers ideal for thermography where the goal is to identify steady-state temperature differences rather than capturing high-speed transients.
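The lag described above can be sketched as a first-order system: the pixel approaches the scene temperature exponentially, governed by its thermal time constant. The 10 ms time constant below is an illustrative value, not a spec from the text.

```python
import math

def bolometer_response(delta_t_scene, tau_ms, exposure_ms):
    """Fraction of a scene temperature step (in deg C) that a
    microbolometer pixel registers after `exposure_ms`, modeled as a
    first-order system with thermal time constant `tau_ms`."""
    return delta_t_scene * (1 - math.exp(-exposure_ms / tau_ms))

# Hypothetical pixel with a 10 ms time constant viewing a 1.0 deg C step:
tau = 10.0
for exposure in (5, 10, 30, 50):
    seen = bolometer_response(1.0, tau, exposure)
    print(f"{exposure:>3} ms exposure -> pixel registers {seen:.2f} deg C of the step")
```

A slowly varying scene gives the pixel many time constants to settle, which is why steady-state thermography is unaffected while fast transients smear.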

Despite their slower response compared to quantum sensors, modern microbolometers have achieved remarkable sensitivity levels. Advances in vacuum packaging and reductions in pixel pitch (the distance between pixel centers) have allowed these sensors to detect temperature differences as small as 0.03 °C (30 mK). For the majority of research and industrial inspection tasks—such as finding a failing fuse or identifying heat loss in a building envelope—the microbolometer offers the best possible ratio of performance to cost and reliability.
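The 30 mK figure acts as a noise floor: a temperature difference is only resolvable once it exceeds the sensor's noise by some margin. A minimal sketch of that detectability check (the SNR margin is an assumption for illustration):

```python
NETD_MK = 30  # sensitivity floor from the text, in millikelvin

def detectable(delta_t_mk, snr=1.0):
    """True when a temperature difference (mK) clears the sensor's
    noise floor by the required signal-to-noise margin."""
    return delta_t_mk >= snr * NETD_MK

print(detectable(100))  # a 0.1 deg C hotspot -> True
print(detectable(10))   # a 0.01 deg C difference -> False
```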

Cooled IR (MWIR)

Cooled infrared systems operate on a fundamentally different principle by utilizing an integrated cryocooler to bring the sensor temperature down to approximately 77 Kelvin (-196°C). This extreme cooling is necessary to suppress “dark current,” which is the background electronic noise generated by the sensor’s own heat. By eliminating this noise, the camera can achieve a vastly higher signal-to-noise ratio, allowing it to “see” minute thermal signatures that would be completely buried in the noise floor of an uncooled system.
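The payoff of cryocooling can be sketched with a simplified Arrhenius model, in which thermally generated dark current falls exponentially as the sensor gets colder. The activation energy below is an illustrative assumption, not a value from the text, but the qualitative result holds: cooling to 77 K suppresses dark current by many orders of magnitude.

```python
import math

K_B = 8.617e-5  # Boltzmann constant, eV/K

def relative_dark_current(temp_k, activation_ev=0.12):
    """Relative thermally generated dark current under a simplified
    Arrhenius model; `activation_ev` is an illustrative value."""
    return math.exp(-activation_ev / (K_B * temp_k))

ratio = relative_dark_current(300) / relative_dark_current(77)
print(f"Cooling from 300 K to 77 K cuts dark current by ~{ratio:.0e}x")
```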

While most uncooled sensors operate in the Long-Wave Infrared (LWIR) band, cooled cameras are frequently designed for the Mid-Wave Infrared (MWIR) spectrum. The MWIR band is particularly useful for imaging through certain gases or for high-temperature applications like glass and metal manufacturing. Because the atmosphere has a specific transmission window in the Mid-Wave band, these cameras are also the gold standard for long-range surveillance and maritime applications where high humidity or salt spray might attenuate the signal of a standard LWIR camera.

The most significant advantage of a cooled MWIR system is its integration time, or electronic shutter speed. Unlike a microbolometer, which takes milliseconds to change temperature, a cooled quantum sensor can capture an image in microseconds. This allows researchers to “freeze” high-speed motion, such as a turbine blade spinning at thousands of RPM or an explosion’s thermal expansion, without any motion blur. This speed, combined with superior sensitivity, makes cooled technology the only viable choice for advanced ballistics, aerospace testing, and gas leak detection.
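The turbine example above can be made concrete: blur is just how far the target moves during one exposure. The RPM, blade radius, and integration times below are hypothetical values chosen to contrast the millisecond and microsecond regimes the text describes.

```python
import math

def blur_distance_mm(rpm, radius_m, integration_s):
    """Distance a blade tip travels during a single exposure."""
    tip_speed = 2 * math.pi * radius_m * rpm / 60  # m/s
    return tip_speed * integration_s * 1000        # mm

# Hypothetical 10,000 RPM turbine with a 0.5 m blade radius:
for name, t in [("microbolometer-scale, ~10 ms", 10e-3),
                ("cooled MWIR, ~10 us", 10e-6)]:
    print(f"{name}: tip moves {blur_distance_mm(10_000, 0.5, t):.1f} mm")
```

The thousandfold shorter exposure shrinks the smear from meters to millimeters, which is the difference between an unusable streak and a sharp blade image.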

Quantum Detectors

Quantum detectors represent the pinnacle of infrared sensing technology, utilizing materials like indium antimonide (InSb) or mercury cadmium telluride (MCT). Unlike bolometers, which are “thermal” detectors, these are “photo” detectors. They work by absorbing infrared photons and converting them directly into free electrons. This is a direct quantum interaction: an incoming photon kicks an electron into a higher energy state, creating an instantaneous electrical pulse. This process is nearly 100 times faster than the thermal response of an uncooled sensor.
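The "kick" described above only happens when the photon carries enough energy to bridge the detector material's band gap, which is what sets a photodetector's cutoff wavelength. A minimal sketch, using an approximate band gap for InSb near 77 K:

```python
HC_EV_UM = 1.2398  # h*c expressed in eV*um

def can_detect(wavelength_um, bandgap_ev):
    """A photon excites an electron only if its energy h*c/lambda
    meets or exceeds the detector material's band gap."""
    return HC_EV_UM / wavelength_um >= bandgap_ev

INSB_GAP_EV = 0.23  # approximate InSb band gap at cryogenic temperature
print(can_detect(4.0, INSB_GAP_EV))   # MWIR photon -> True
print(can_detect(10.0, INSB_GAP_EV))  # LWIR photon -> False
```

This is also why InSb detectors are a natural fit for the MWIR band: longer-wavelength LWIR photons simply lack the energy to trigger the transition.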

These sensors are highly tunable and can be engineered to be sensitive to very specific, narrow slices of the infrared spectrum. This “spectral selectivity” is crucial for applications like optical gas imaging (OGI). By placing a cold filter in front of a quantum detector, a camera can be tuned to the exact absorption frequency of a gas like methane or carbon dioxide. The gas then appears as “smoke” on the camera screen, allowing for the detection of leaks that are invisible to the naked eye and undetectable by standard thermal imagers.
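The cold-filter tuning described above amounts to an interval-overlap check: the camera only "sees" the gas if its filter passband overlaps the gas's absorption band. The band edges below are approximate illustrative values, not manufacturer specifications.

```python
def filter_passes_gas(filter_band, gas_band):
    """True if the cold filter's passband overlaps the gas's
    absorption band (both given as (low, high) in micrometers)."""
    return filter_band[0] <= gas_band[1] and gas_band[0] <= filter_band[1]

METHANE_BAND = (3.2, 3.4)  # approximate methane absorption band, um

print(filter_passes_gas((3.2, 3.5), METHANE_BAND))   # OGI-tuned filter -> True
print(filter_passes_gas((8.0, 14.0), METHANE_BAND))  # broadband LWIR -> False
```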

The trade-off for this extreme performance is complexity and maintenance. The cryocooler is a mechanical component with a finite lifespan (typically 10,000 to 15,000 hours of operation) and requires a few minutes to cool down before the camera can be used. Furthermore, quantum detectors are subject to a phenomenon called non-uniformity, where individual pixels may drift in sensitivity over time. This requires more frequent calibration and sophisticated image processing to maintain the high-fidelity radiometric data required for laboratory environments and precision scientific research.
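The rated hours above translate into very different service intervals depending on duty cycle, which is worth working out when budgeting for a cooled system. A quick sketch of that arithmetic:

```python
def lifespan_years(rated_hours, duty_hours_per_day):
    """Convert a cryocooler's rated operating hours into calendar
    years at a given daily duty cycle."""
    return rated_hours / (duty_hours_per_day * 365)

for rated in (10_000, 15_000):
    print(f"{rated} h rating: {lifespan_years(rated, 24):.1f} years continuous, "
          f"{lifespan_years(rated, 8):.1f} years at 8 h/day")
```

Run continuously, even the upper rating is under two years, whereas an 8-hour inspection shift stretches it to roughly five.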