It happens to all of us. We read or hear something from an authoritative figure and pass the information along. Of course, this person may simply be someone we perceive as more knowledgeable about the subject, not necessarily someone with experimentally based facts to support their statements. Regardless, we pass the information along.
As infrared photography is a passion of mine, I have noted a number of statements made in this field that do not include all the facts. This leads to a misunderstanding of the subject, and when such statements are repeated, there is substantial room for error.
In this vein, I have looked for evidence to support or refute three statements that, without sufficient background information, are at best confusing and at worst wrong.
Let's look at each one.
This statement can be read a number of ways, and depending upon how you read it, may be true or false. A basic understanding of how film is manufactured gives us a clue to the truth of this remark.
Silver halide, the component in film that creates the image, has an inherent sensitivity to blue and violet light. It cannot "see" red light, which is why orthochromatic film (and photographic paper) can be handled under an orange or red safelight. We do want to record the color red, however, so film manufacturers add sensitizing dyes to the emulsion of black and white film to extend its sensitivity. Depending upon the exact film, these dyes extend sensitivity to 630-660 nanometers (nm).
Konica and Ilford use dyes to extend the sensitivity of their infrared films in the neighborhood of 720-740nm. This film is occasionally referred to as "extended-red." The dyes in Kodak's infrared film extend this sensitivity to 900nm, resulting in a more dramatic difference from common black and white film.
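The sensitization limits described above can be summarized in a short sketch. The cutoff figures are the upper ends of the ranges quoted in the text; the dictionary keys and the helper function are purely illustrative, not part of any real library or manufacturer specification.

```python
# Approximate long-wavelength sensitivity limits, in nanometers, taken
# from the text (upper ends of the quoted ranges). Illustrative only.
SENSITIVITY_LIMIT_NM = {
    "conventional panchromatic": 660,      # dyes extend to ~630-660nm
    "extended-red (Konica/Ilford)": 740,   # ~720-740nm
    "Kodak infrared": 900,                 # ~900nm
}

def can_record(film: str, wavelength_nm: float) -> bool:
    """Return True if the given film can respond to this wavelength."""
    return wavelength_nm <= SENSITIVITY_LIMIT_NM[film]

# A Wratten #87 filter passes roughly 800nm and longer, so of these
# three films only Kodak's infrared emulsion records its output:
for film in SENSITIVITY_LIMIT_NM:
    print(film, can_record(film, 800))
```

This is why the "more dramatic" look of Kodak's film matters in practice: the deeper the sensitization, the more of the invisible spectrum the emulsion can use.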
So light, be it in the visible or infrared range, is still necessary to form an image on infrared film. This film, however, has been used for surveillance in areas that are unlit or poorly lit, as well as behavioral studies of nocturnal animals. The most famous infrared image is that of people in a dark theater watching a movie with 3-D glasses, published in an issue of Life magazine. So, what is going on?
The answer is simple - infrared flash. By placing a filter over a flash unit that blocks visible light but passes infrared (such as a Wratten #87 or #89), pictures can be taken in the dark. This does not mean there is no light; it means there is no visible light. So, depending upon how one reads the initial statement, it can be either true or false.
Chlorophyll is the pigment that gives leaves their green color. It enables the leaf to convert absorbed carbon dioxide into starch, and is as necessary to the plant as our lungs are to us. It is therefore natural to assume that chlorophyll is responsible for the way healthy plant leaves are rendered on infrared film. This is true, but not in the way you might think.
The fact is that chlorophyll is transparent to infrared radiation. When light enters the leaf, it is absorbed, reflected, and transmitted. Healthy green leaves reflect most strongly in the 540-560nm (green) range and least in the blue and deep-red ranges. Chlorophyll a and b (the two types of chlorophyll in green plants) absorb light in the 400-480nm (blue) and 620-680nm (red) ranges, allowing infrared light to pass. Infrared radiation is not actinic in the photosynthetic process.
Other pigments, such as carotene and xanthophyll, are also transparent to infrared radiation. The result is that the leaf cells, not the chlorophyll, reflect the infrared light. But why so strongly?
When light passes through the epidermis and palisade cells, it is scattered in the parenchyma, because air fills the spaces between these cells. This is the reason for the strong reflection. An analogy is newly fallen snow, which appears brighter than compacted snow because of the airy spaces between the snow crystals, where light has an opportunity to bounce around. Similarly, light that bounces around within a healthy leaf and returns to the camera makes the leaf appear lighter. So you might consider chlorophyll's role in infrared photography in terms of what it does not do, rather than what it does.
Dr. Antje Pokorny Almeida from the Universidade de Coimbra responds:
What you say about IR light not being absorbed/actinic in photosynthesis is, of course, 100% correct. This, however, is not the problem or the reason you see an effect on IR film (and I insist that you do) - chlorophyll FLUORESCES strongly in the near infrared (around 700nm), every plant loses a certain amount of absorbed energy due to fluorescence (why this is so people in the field have been discussing for at least 30 yrs.).
The effect is generally more pronounced in young or dying leaves (more chlorophyll *uncoupled* from the photosynthetic apparatus). An active, healthy plant will emit the least fluorescence (but it will always emit some). The fact that dying leaves are more fluorescent than healthy ones has led to the development of remote sensing devices (by the DLR, the German equivalent of NASA) that detect the amount of fluorescence emitted from, say, a tree after excitation with a short actinic laser pulse. Every tree (leaf) has a *fluorescence signature*, and you can immediately tell from the fluorescence spectrum whether the plant is happy or not - long before it starts showing normal signs of unhappiness, like wilting or plain dying.
Of course, whether you see chlorophyll fluorescence or not will depend on the sensitivity of your film, as the IR-reflectance will completely dominate the spectrum at longer wavelengths (under regular sunlight).
Thank you for your insights and shared knowledge, Dr. Almeida!
I have heard this statement again and again, and it is the result of a misunderstanding of the characteristics of light. This incorrect information was reinforced in the March 1996 issue of Shutterbug, where the banner of the first article stated, "Organic elements give off heat which is readily detected by the film. People give off heat too." Organic elements give off heat, and heat can be detected by film, but organic elements do not give off heat that this film can detect.
There are four categorizations of heat sources that give off infrared radiation.
All objects emit energy - the hotter the object, the greater the width and intensity of the emitted spectrum. Objects in the glowing range emit enough energy to display visible light with a dark-red color and can be recorded by ordinary panchromatic film. Objects in the hot-object range (heated soldering irons, etc.) emit wavelengths that may be recorded on infrared film. In the calorific range, the infrared radiation lies at 1,400nm and beyond, and the human body radiates in the 9,000-13,000nm region. As it is impossible to sensitize silver halide beyond 1,300nm, there is no way a human will emit infrared radiation in a recordable range - unless, of course, they are on fire.
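The body-heat figures above can be checked with Wien's displacement law, a standard physics formula (not from the original article) that gives the wavelength at which a blackbody at temperature T emits most strongly:

```python
# Wien's displacement law: lambda_max = b / T,
# where b is Wien's displacement constant.
WIEN_B = 2.898e-3  # meter-kelvins

def peak_wavelength_nm(temp_kelvin: float) -> float:
    """Peak blackbody emission wavelength, in nanometers."""
    return WIEN_B / temp_kelvin * 1e9

body = peak_wavelength_nm(310)   # human body, roughly 37 C
print(round(body))               # ~9,350nm: squarely in the 9,000-13,000nm region
print(body > 1300)               # far beyond any sensitizable silver halide

# Conversely, for a blackbody to peak at 900nm - the limit of Kodak's
# infrared film - it would need to be about 3,200 K: literally glowing hot.
print(round(WIEN_B / 900e-9))
```

The numbers bear out the article's point: body heat peaks roughly an order of magnitude beyond the 1,300nm sensitization limit, so no silver halide emulsion can record it.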
Are there other misunderstandings due to inaccurately interpreted information? Of course, but knowledge of a subject makes one less prone to misunderstanding. It is always important to ensure that the statements we hear are eventually backed up with facts.