Getting the Most From Your IR Camera

For best results, IR camera users must think carefully about the types of measurements they need to make and then be proactive in the camera’s calibration process. The first step is selecting a camera with the appropriate features and software for the application. An understanding of the differences between thermographic and radiometric measurements is very helpful in this regard.

Thermography is a type of infrared imaging in which IR cameras detect radiation in the electromagnetic spectrum with wavelengths from roughly 900 nm to 14,000 nm (0.9 µm to 14 µm) and produce images of that radiation. Typically, this imaging is used to measure temperature variations across an object or scene, which can be expressed in degrees Celsius, Fahrenheit, or Kelvin.

Radiometry is the measurement of radiant electromagnetic energy, especially that associated with the IR spectrum. It can be more simply defined as an absolute measurement of radiant flux.

The typical unit of measure for imaging radiometry is radiance, expressed in watts per steradian per square centimeter (W/(sr·cm²)). The steradian (sr) is the dimensionless unit of solid angle: one steradian is the solid (conical) angle that encloses a portion of a sphere’s surface equal in area to the square of the sphere’s radius.

In simple terms, you can think of thermography as how hot an object is while radiometry is how much energy the object emits. These two concepts are related but not the same thing.

IR cameras inherently measure the radiant energy arriving at the detector, not temperature, but thermography stems from that radiance measurement. When you thermographically calibrate an IR system, you are making measurements based on effective blackbody radiance and temperature. Accordingly, the emissivity of the target object you are measuring is vital to achieving accurate temperatures. Emissivity, or emittance, describes how efficiently an object radiates relative to a perfect blackbody.

Entry-level IR cameras with microbolometer detectors operate according to nonquantum principles. The detectors respond to radiant energy in a way that causes a change of state in the bulk material, such as its resistance or capacitance. Calibration software in these cameras is oriented toward thermographic imaging and temperature measurements.

High-end IR cameras with photon detectors operate according to quantum physics principles. Although they also provide high-quality images, their software typically is more sophisticated, allowing accurate measurements of both radiance and temperature.

Key Physical Relationships in Camera Operation

There are five basic steps in producing radiometric and thermographic measurements with an IR camera system:

  • The target object has a certain energy signature that is collected by the IR camera through its lens.
  • This involves the collection of photons in the case of a photon detector or collection of heat energy with a thermal detector, such as a microbolometer.
  • The collected energy causes the detector to produce a signal voltage that results in a digital count through the system’s A/D converter; for example, a 14-bit dynamic range that creates count values ranging from 0 to 16,383 proportional to the incident energy.
  • When the camera is properly calibrated, digital counts are transformed into radiance values.
  • Finally, the calibrated camera electronics convert radiance values to temperature using the known or measured emissivity of the target object.
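
As a rough illustration of the count-to-radiance step described above, the conversion can be pictured as evaluating a stored curve fit. The Python sketch below uses made-up calibration points and a simple polynomial; it is not any vendor’s actual algorithm.

import numpy as np

# Hypothetical factory calibration points: A/D counts recorded while the
# camera views blackbody sources of known in-band radiance (W/(sr*cm^2)).
counts_cal   = np.array([1200, 3400, 6100, 9800, 14200])
radiance_cal = np.array([0.8e-4, 2.3e-4, 4.1e-4, 6.6e-4, 9.5e-4])

# Low-order curve fit mapping digital counts to radiance.
coeffs = np.polyfit(counts_cal, radiance_cal, deg=2)

def counts_to_radiance(counts):
    # Counts range from 0 to 16,383 for a 14-bit system.
    return np.polyval(coeffs, counts)

print(counts_to_radiance(7500))  # estimated radiance for a mid-scale reading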

Every IR camera designed for serious measurements is calibrated at the factory by taking a number of blackbody measurements at known temperatures, radiance levels, emissivities, and distances. This creates a table of values based on the A/D counts from the temperature/radiance measurements entered into the calibration software and passed through an in-band radiance curve-fit algorithm. The result is a series of calibration curves, one of which is illustrated in Figure 1.

Figure 1. System Response Signal for a Blackbody Radiator at a Particular Temperature

The calibration curves are stored in the camera system’s memory as a series of numeric curve-fit tables that relate radiance values to blackbody temperatures. When the system makes a measurement, it takes the digital value of the signal at a given moment, goes into the appropriate calibration table, and calculates temperature.
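
A stored curve-fit table can be pictured as paired radiance and blackbody-temperature values that the camera interpolates at measurement time. The sketch below uses invented table entries purely to show the lookup; real tables are generated from the factory blackbody measurements described above.

import numpy as np

# Hypothetical calibration table: effective blackbody radiance vs. temperature.
radiance_table = np.array([0.8e-4, 2.3e-4, 4.1e-4, 6.6e-4, 9.5e-4])  # W/(sr*cm^2)
temperature_C  = np.array([10.0, 40.0, 70.0, 100.0, 130.0])

def lookup_temperature(radiance):
    # Interpolate between the stored calibration points.
    return np.interp(radiance, radiance_table, temperature_C)

print(lookup_temperature(5.0e-4))  # falls between the 70 C and 100 C entries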

Ambient Drift Compensation
Another important consideration in the calibration process is how to deal with the radiation caused by the heating and cooling of the camera itself. Any swings in the camera’s internal temperature caused by changes in environment or the heating and cooling of camera electronics will affect the radiation intensity at the detector. The radiation that results directly from the camera is called parasitic radiation and can cause inaccuracies in camera measurement output, especially with thermographically calibrated cameras.

Some IR cameras have internal sensors that monitor changes in camera temperature. Correction factors for different temperatures are stored in the camera, which are used for real-time corrections.

Ultimately, the camera must calculate an object’s temperature based on its emission, reflected emission from ambient sources, and emission from the atmosphere using the Total Radiation Law. The total radiation power received by the camera can be expressed as:

Wtot = ε · τ · W(Tobj) + (1 − ε) · τ · W(Tamb) + (1 − τ) · W(Tatm)

where: ε = the object emissivity
τ = the transmission through the atmosphere
Tobj = the temperature of the object
Tamb = the effective temperature of the object surroundings or the reflected ambient temperature
Tatm = the temperature of the atmosphere
W(T) = the radiant power received from a blackbody source at temperature T
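
Given the measured total radiation and values for emissivity, atmospheric transmission, and the ambient and atmospheric temperatures, the object term can be isolated and converted back to a temperature. The Python sketch below simply rearranges the equation above; for brevity it uses the Stefan-Boltzmann law as a stand-in for the camera’s in-band radiance curve, so the numbers are illustrative rather than what a calibrated camera would compute.

SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2*K^4)

def bb_power(T_kelvin):
    # Simplified stand-in for the camera's in-band blackbody radiance curve.
    return SIGMA * T_kelvin ** 4

def object_temperature(W_tot, emissivity, tau, T_amb, T_atm):
    # Solve the Total Radiation Law for the object term, then invert to kelvin.
    W_obj = (W_tot
             - (1 - emissivity) * tau * bb_power(T_amb)
             - (1 - tau) * bb_power(T_atm)) / (emissivity * tau)
    return (W_obj / SIGMA) ** 0.25

# Round trip: a 350 K object viewed with emissivity 0.9 and transmission 0.95.
W = (0.9 * 0.95 * bb_power(350.0)
     + (1 - 0.9) * 0.95 * bb_power(295.0)
     + (1 - 0.95) * bb_power(293.0))
print(object_temperature(W, 0.9, 0.95, 295.0, 293.0))  # ~350 K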

The best results are obtained when you are diligent in entering known values for all the pertinent variables into the camera software. Emissivity tables are available for a wide variety of common substances. However, when in doubt, measurements should be made to obtain the correct values.

Calibration and analysis software tools are not always contained onboard the camera; some cameras use external software that runs on a PC. Even high-end cameras are connected to PCs to expand their internal calibration, correction, and analysis capabilities. For example, FLIR’s ThermaCAM RTools™ Software can serve a variety of functions from real-time image acquisition to post-acquisition analysis.

Whether the software is on the camera or an external PC, it should allow you to easily modify calibration variables. For instance, you should be able to enter and modify emissivity, atmospheric conditions, distances, and other ancillary data needed to calculate and represent the exact temperature of the object, both live and through saved data. The software also must provide post-measurement analysis capability so you can further modify these variables as needed.

Typical Camera Measurement Functions

IR cameras have various operating modes to ensure correct temperature measurements under different application conditions. Typical measurement functions include spotmeter, area, profile, isotherm, temperature range, and color or gray-scale settings.

Cursor functions allow easy selection of an area of interest, such as the crosshairs of the spot readings in Figure 2. In addition, the cursor may be able to select circle, square, and irregularly shaped polygon areas or create a line for a temperature profile. Once an area is selected, it can be frozen so that the camera can take a snapshot of that area. Alternatively, the camera image can remain live for observation of changes in temperature.

Figure 2. IR Image of a PCB Indicating Three Spot Temperature Readings. Image colors correspond to the temperature scale on the right.

The spotmeter finds the temperature at a particular point. Depending on the camera, this function may allow 10 or more movable spots, one or more of which may automatically find the hottest point in the image.

The area function isolates a selected area of an object or scene and finds the maximum, minimum, and average temperatures inside that area. The isotherm function makes it possible to portray the temperature distribution of a hot area. Multiple isotherms may be allowed. The line profile is a way to visualize the temperature along some part of the object, which also may be shown as a graph.
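
On a calibrated temperature image, the spotmeter, area, and profile functions reduce to simple operations over pixels. The short sketch below uses a synthetic NumPy array as a stand-in for a camera frame; it illustrates only the arithmetic, not any particular camera’s software.

import numpy as np

# Synthetic 512 x 640 temperature frame in degrees C, standing in for a
# calibrated image; a hot component is added near the center.
frame = 25.0 + 10.0 * np.random.rand(512, 640)
frame[200:220, 300:330] += 40.0

print(frame[210, 315])                   # spotmeter: temperature at one pixel

roi = frame[190:230, 290:340]            # area function over a rectangle
print(roi.max(), roi.min(), roi.mean())  # max, min, and average in the area

profile = frame[210, :]                  # line profile along one row, plottable as a graph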

The temperature measurement range typically is selected by the user. This is a valuable feature when a scene has a temperature range narrower than a camera’s full-scale range.

Setting a narrower range allows better resolution of the images and higher accuracy in the measured temperatures. For that reason, images will better illustrate smaller temperature differences. On the other hand, a broader scale and higher maximum temperature range may be needed to prevent saturation of the portion of the image at the highest temperature.

As an adjunct to the temperature range selection, most cameras allow you to set up a color scale or gray scale to optimize the camera image. In Figure 2, an iron scale was used for a color rendering. In a manner similar to the gray scale, the hottest temperatures can be rendered as either lighter colors or darker colors.

Another possibility is rendering images with a rainbow scale (Figure 3). In some color images, gray will be used to indicate areas where the camera detector has become saturated.

Figure 3. Rainbow Scale Showing Lower Temperatures Toward the Blue End of the Spectrum

While choice of color scale often is a matter of personal preference, there may be times when one type of scale is better than another for illustrating the range of temperatures in a scene. In the case of isotherm measurements, areas with the same thermal radiance are highlighted. If we use a color scale with 10 colors, we will get 10 isotherms in the image. Such a scale sometimes makes it easier to see the temperature distribution over an object (Figure 4).

Figure 4. Isotherm Color Scale With Each Color Having an Isotherm Width of 2°C

An isothermal temperature-scale rendering will not be accurate unless all of the highlighted area has the same emissivity and the ambient temperatures are the same for all objects within the area. This points out common problems for IR camera users. Often, emissivity varies across an object or scene, along with variations in ambient temperatures, accompanied by atmospheric conditions that don’t match a camera’s default values. This is why IR cameras include measurement correction and calibration functions.

Emissivity Corrections

In most applications, the emissivity of an object is assumed based on values found in a table. Although camera software may include an emissivity table, you usually have the capability of inputting emissivity values for an object ranging from 0.1 to 1.0. Many cameras also provide automatic corrections based on user input for reflected ambient temperature, viewing distance, relative humidity, atmospheric transmission, and external optics.

The IR camera calculates a temperature based on a radiance measurement and the object’s emissivity. However, when the emissivity value is unknown or uncertain, the reverse process can be applied. Knowing the object’s temperature, emissivity can be calculated. This usually is done when exact emissivity values are needed. There are two common methods of doing this:

Equalization Box
The first method establishes a known temperature by using an equalization box. This is essentially a tightly controlled temperature chamber with circulating hot air.

The length of time in the box must be sufficient to allow the whole object to be at a uniform temperature. In addition, it is absolutely necessary that the object stabilize at a temperature different from the surroundings where the actual measurements will take place. Usually, the object is heated to a temperature at least 10°C above the surroundings to ensure that the thermodynamics of the measurements are valid.

Once the object has reached the set temperature, the lid is drawn off, and a thermogram of the object is captured. The camera and the software for processing thermograms can be used to get the emissivity value.

Adjacent Spot
The adjacent-spot method is much simpler but still gives reasonably exact values of emissivity. The object is adjusted so the area with unknown emissivity is very close to an area of known emissivity.

The distance separating these areas must be so small that it can be safely assumed they have the same temperature. From this temperature measurement, the unknown emissivity can be calculated. However, it is never possible to measure the emissivity of an object whose temperature is the same as the ambient temperature reflected from its surroundings.
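
In either method, the emissivity follows from rearranging the Total Radiation Law once the object temperature is known. The sketch below shows that rearrangement; as before, the Stefan-Boltzmann law is used as a simplified stand-in for the camera’s in-band radiance curve, and all inputs are assumed example values.

SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2*K^4)

def bb_power(T_kelvin):
    # Simplified stand-in for the camera's in-band blackbody radiance curve.
    return SIGMA * T_kelvin ** 4

def emissivity_from_known_temperature(W_tot, T_obj, T_amb, T_atm, tau):
    # Rearranged Total Radiation Law with T_obj known (for example, from an
    # adjacent spot of known emissivity or an equalization box).
    numerator = W_tot - tau * bb_power(T_amb) - (1 - tau) * bb_power(T_atm)
    denominator = tau * (bb_power(T_obj) - bb_power(T_amb))
    # The denominator goes to zero when the object sits at the reflected
    # ambient temperature, which is why emissivity cannot be measured then.
    return numerator / denominator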

Generally, you can input other variables that are needed to correct for ambient conditions. These include factors for ambient temperatures and atmospheric attenuation around the target object.

Using Camera FOV Specifications

When considering IR camera performance, most users are interested in how small an object or area can be detected and accurately measured at a given distance. Knowing a camera’s field of view (FOV) specifications helps provide the answer.

Field of View
FOV depends on the camera lens and focal plane dimensions and is expressed in degrees such as 35.5° x 28.7°. For a given viewing distance, this determines the dimensions of the total surface area seen by the instrument; for instance, 0.64 meter x 0.51 meter at a distance of 1 meter.
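
The footprint quoted above follows from simple trigonometry: each linear dimension is twice the viewing distance times the tangent of half the corresponding FOV angle. A quick check in Python:

import math

def fov_footprint(h_deg, v_deg, distance_m):
    # Linear dimensions of the area seen at the given distance.
    width = 2.0 * distance_m * math.tan(math.radians(h_deg) / 2.0)
    height = 2.0 * distance_m * math.tan(math.radians(v_deg) / 2.0)
    return width, height

print(fov_footprint(35.5, 28.7, 1.0))  # approximately (0.64, 0.51) meters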

Instantaneous Field of View
The instantaneous FOV (IFOV) is a measure of the spatial resolution of a camera’s focal plane array (FPA) detector and is stated in terms of individual detector elements or pixels. For example, an FPA with 640 x 512 individual detectors has a total of 327,680 pixels. For a given viewing distance, the area covered by an individual pixel is found by dividing each linear FOV dimension by the number of pixels along that dimension:

Linear FOV Dimension/Number of Pixels Along That Dimension

For an FOV of 0.64 meter x 0.51 meter at 1 meter and a 640 x 512 FPA, the IFOV of one pixel is an area of about 1.0 mm x 1.0 mm at that distance (0.64 m/640 by 0.51 m/512).
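
The same figures fall out directly when each footprint dimension is divided by the pixel count along it:

width_m, height_m = 0.64, 0.51   # FOV footprint at 1 meter (from above)
pixels_h, pixels_v = 640, 512    # FPA dimensions

print(1000.0 * width_m / pixels_h)   # ~1.0 mm per pixel horizontally
print(1000.0 * height_m / pixels_v)  # ~1.0 mm per pixel vertically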

It is important to know the IFOV relative to the size of a target object to ensure that areas outside that target are not measured. Consider the pixel IFOV relative to the target object size in Figure 5. In the left view of this figure, the area of the object to be measured covers the IFOV completely. As a result, the pixel will receive radiation only from the object, and its temperature can be measured correctly.

Figure 5. IFOV (Red Squares) Relative to Object Size

In the right view of Figure 5, the pixel covers more than the target object area and will pick up radiation from extraneous objects. If the object is hotter than the objects beside or behind it, the temperature reading will be too low and vice versa. For that reason, it is important to estimate the size of the target object compared to the IFOV in each measurement situation.

Spot Size Ratio
At the start of a measurement session, the distance between the camera and the target object should be considered explicitly. For cameras that do not have a calibrated spot size, the spot size ratio (SSR) method can be used to optimize measurement results.

SSR is a number that tells how far the camera can be from a target object of a given size to get a good temperature measurement. A typical figure might be 1,000:1. This can be interpreted as follows: At 1,000 mm from a target, the camera will measure a temperature averaged over a 1-mm square.

SSR is not just for targets far away. It can be just as important for close-up work. However, the camera’s minimum focal distance also must be considered. For shorter target distances, some manufacturers offer close-up lenses.

For any application and camera/lens combination, the following equation applies:

D:S = SSR

where: D = the distance from the camera to the target
S = smallest target dimension of interest

The units of D and S must be the same.
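
The relationship is easy to apply in either direction, as the short sketch below shows for the 1,000:1 example.

def max_distance(ssr, smallest_target_dimension):
    # D = SSR x S, with D and S in the same units.
    return ssr * smallest_target_dimension

# With a 1,000:1 SSR, a 5-mm feature should be measured from no farther
# than 5 meters away.
print(max_distance(1000, 0.005))  # 5.0 (meters)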

When selecting a camera, keep in mind that IFOV is a good figure of merit to use. The smaller the IFOV, the better the camera for a given total FOV.

Other Tools for Camera Users

IR cameras are calibrated at the factory, and field calibration is not practical. However, some cameras have a built-in blackbody to allow a quick calibration check. These checks should be done periodically to ensure valid measurements.

Availability of compatible or bundled software is an important consideration when selecting an IR camera for an application or work environment. Bundled and optional data acquisition software programs allow easy data capture, viewing, analysis, and storage.

Software functions may include real-time radiometric output of radiance, radiant intensity, temperature, and target length/area. Optional software modules offer spatial and spectral radiometric calibration.

In addition, IR camera software and firmware provide other user inputs to refine the accuracy of temperature measurements. One of the most important functions is nonuniformity correction (NUC) of the detector FPA. This type of correction is needed because each individual detector in the camera’s FPA has a slightly different gain and zero offset. To create a useful thermographic image, the different gains and offsets must be corrected to a normalized value.

This multistep NUC process is performed by camera software. However, some software allows you to specify the manner in which NUC is performed using a list of menu options. For example, you may be able to specify either a one-point or a two-point correction. A one-point correction only deals with pixel offset. Two-point corrections perform both a gain and offset normalization of pixel-to-pixel nonuniformity.
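
A two-point correction can be pictured as deriving a per-pixel gain and offset from two uniform (flat-field) reference views at different temperatures. The following Python sketch is a simplified illustration with synthetic data, not a vendor’s NUC implementation.

import numpy as np

def two_point_nuc(cold_frame, hot_frame, target_cold, target_hot):
    # Per-pixel gain and offset chosen so both reference frames map to
    # uniform target values; a one-point correction would adjust offset only.
    gain = (target_hot - target_cold) / (hot_frame - cold_frame)
    offset = target_cold - gain * cold_frame
    return gain, offset

def apply_nuc(raw_frame, gain, offset):
    return gain * raw_frame + offset

# Synthetic flat-field frames (counts) from two uniform blackbody views.
rng = np.random.default_rng(0)
cold = 4000.0 + 50.0 * rng.standard_normal((512, 640))
hot = 9000.0 + 80.0 * rng.standard_normal((512, 640))

gain, offset = two_point_nuc(cold, hot, cold.mean(), hot.mean())
corrected = apply_nuc(hot, gain, offset)  # nearly uniform after correction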

An important NUC consideration is how this function deals with the imperfections that most FPAs have as a result of semiconductor wafer processing. Some of these imperfections are manifested as bad pixels that produce no output signals or outputs far outside of a correctable range. Ideally, the NUC process identifies bad pixels and replaces them using a nearest-neighbor replacement algorithm. Bad pixels are identified when their response or offset level falls outside user-defined limits around the array’s mean response and offset.
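
Bad-pixel handling can be sketched in the same spirit: flag pixels whose gain or offset deviates too far from the array mean, then substitute the median of their good neighbors. The thresholds below are arbitrary placeholders for the user-defined limits mentioned above.

import numpy as np

def find_bad_pixels(gain, offset, gain_tol=0.5, offset_tol=500.0):
    # Flag pixels whose gain deviates by more than gain_tol (as a fraction of
    # the mean gain) or whose offset is more than offset_tol counts from the
    # mean offset.
    bad_gain = np.abs(gain - gain.mean()) > gain_tol * abs(gain.mean())
    bad_offset = np.abs(offset - offset.mean()) > offset_tol
    return bad_gain | bad_offset

def replace_bad_pixels(frame, bad):
    # Replace each flagged pixel with the median of its good immediate
    # neighbors (a simple nearest-neighbor replacement).
    fixed = frame.astype(float).copy()
    for r, c in zip(*np.nonzero(bad)):
        r0, r1 = max(r - 1, 0), min(r + 2, frame.shape[0])
        c0, c1 = max(c - 1, 0), min(c + 2, frame.shape[1])
        good = frame[r0:r1, c0:c1][~bad[r0:r1, c0:c1]]
        if good.size:
            fixed[r, c] = np.median(good)
    return fixed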

Conclusions

Recent advances in IR cameras have made them much easier to use. Camera firmware has made setup and operation as simple as using a conventional video camera. Onboard and PC-based software provide measurement and analysis tools.

Nevertheless, for accurate results, you need an understanding of IR camera optical principles and calibration methods. At the very least, the emissivity of a target object should be entered into the camera database if not already available as a table entry.

About the Author
Dave Bursell is director of the Science Segment at FLIR Systems, where he is responsible for science grade product sales, marketing, support, and development. He has 10 years of experience in IR thermography, including the development of the Science Segment Group within the FLIR Systems Thermography Division. Mr. Bursell earned a bachelor’s degree in mechanical engineering from Brigham Young University. FLIR Systems, 866-837-3243, e-mail: [email protected]


December 2007
