Near-Infrared Camera
Introduction
Vineyards, large farms, and NASA all use near-infrared photography for assessing plant health, usually by mounting expensive sensors on airplanes and satellites. At Public Lab, we've developed a Do-It-Yourself way to bring this technology to everyday people, enabling us to monitor our environment through quantifiable data.
We are currently running a Kickstarter for a version of this camera we call the Infragram. Read more about it here »
What is it good for?
- Take pictures to examine plant health in backyard gardens, farms, parks, and nearby wetlands
- Monitor your household plants
- Teach students about plant growth and photosynthesis
- Create exciting science fair projects
- Generate verifiable, open environmental data
- Check progress of environmental restoration projects
- Document unhealthy areas of your local ecology (for instance, algal blooms)
Here's an example of what one of our "Infragram" cameras sees (left) and the post-processing analysis which shows photosynthetic activity, or plant health (right). This picture was taken from a commercial airplane flight:
Background: satellite infrared imaging
The study of Earth's environment from space got its start in 1972 when the first Landsat satellite was launched. The multispectral scanner it carried, like the scanners on all subsequent Landsat satellites, recorded images with both visible and near infrared light. Remote sensing scientists quickly learned that by combining visible and infrared data, they could reveal critical information about the health of vegetation. For example, the normalized difference vegetation index (NDVI) highlights the difference between the red and infrared wavelengths that are reflected from vegetation. Because red light is used by plants for photosynthesis but infrared light is not, NDVI allows scientists to estimate the amount of healthy foliage in every satellite image. Thousands of scientists, including landscape ecologists, global change biologists, and habitat specialists have relied on these valuable satellite-based NDVI images for decades.
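For reference, NDVI is computed pixel by pixel from the near-infrared (NIR) and red brightness values using the standard formula:

NDVI = (NIR - Red) / (NIR + Red)

The result ranges from -1 to 1; healthy foliage reflects strongly in the near infrared and absorbs red light, so it falls toward the high end of that range.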
There are public sources of infrared photography for the US available through the Department of Agriculture -- NAIP and Vegscape -- but this imagery is not collected at the times, as frequently, or at a scale that is usable by individuals managing small plots.
Caption: Normal color photo (top) and normalized difference vegetation index (NDVI) image (bottom). The NDVI image was derived from two color channels in a single photo taken with a camera modified with a special infrared filter. Note that tree trunks, brown grass, and rocks have very low NDVI values because they are not photosynthetic. Healthy plants typically have NDVI values between 0.1 and 0.9. Images by Chris Fastie; visit his gallery of high-res images.
Point & shoot infrared photography
The goal of Public Lab's Infragram project is to bring the power of NDVI and other infrared vegetation imaging back down to earth, so that anyone can take close-up images of plants or landscapes and quickly assess their health and vigor.
Chris Fastie's infrared/visible camera prototype
We are able to modify a single camera to capture near-infrared, green, and blue light. This allows us to estimate how much of the available light plants are converting into sugars via photosynthesis. We do this by filtering out red light and reading near-infrared in its place, using a carefully chosen "NGB" (near-infrared/green/blue) filter. Read more about the development of this technique here. You can also learn more about how digital camera image sensors detect colors at this great tutorial by Bigshot.
How we do it
Basically, we remove the infrared-blocking filter from a conventional digital camera and replace it with a carefully chosen "infrablue" filter. This lets the camera read infrared and visible light at the same time, but in different color channels.
While we used to use a two-camera system, research by Chris Fastie and other Public Lab contributors has led to a single-camera setup that can image in both infrared and visible light simultaneously. The infrablue filter is just a piece of carefully chosen theater gel, selected by examining it with a DIY spectrometer. You can use this filter to turn most webcams or cheap point-and-shoot cameras into an infrared/visible camera.
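As a rough illustration of the idea (not Public Lab's actual code), here is a minimal Python sketch. It assumes an "infrablue" JPEG in which the red channel records near-infrared and the blue channel records visible blue light; the file names and the use of numpy and Pillow are just for the example:

```python
import numpy as np
from PIL import Image

# Load an "infrablue" photo; the file name is a placeholder.
img = np.asarray(Image.open("infrablue.jpg"), dtype=float)

nir = img[:, :, 0]   # with the infrablue filter, the red channel records near-infrared
blue = img[:, :, 2]  # the blue channel records visible blue light

# NDVI-style index using blue in place of red; the small epsilon avoids division by zero.
ndvi = (nir - blue) / (nir + blue + 1e-6)

# Scale from [-1, 1] to 8-bit grayscale for quick viewing.
gray = ((ndvi + 1.0) / 2.0 * 255.0).astype(np.uint8)
Image.fromarray(gray).save("ndvi.png")
```

Dark areas in the resulting grayscale image correspond to non-photosynthetic surfaces, while bright areas indicate healthy vegetation.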
How to process your images:
We're working on an easy process to generate composite infrared + visible images that reveal new details of plant health and photosynthesis. There are several approaches:
- The easiest way is to process your images online at the free, open source Infragram.org
- Ned Horning's PhotoMonitoring plugin
- Manual processing
- Using MapKnitter.org (deprecated)
- Command-line processing of single images and rendering of movies using a Python script (a rough sketch of this kind of batch processing follows this list). Source code is here
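Purely as an illustration of the command-line approach listed above (this is not the linked script), here is a hedged Python sketch that loops over a hypothetical "photos/" folder of infrablue JPEGs and writes numbered grayscale NDVI frames that could later be assembled into a movie:

```python
import glob

import numpy as np
from PIL import Image

# Process every infrablue JPEG in a (hypothetical) "photos/" folder and write
# numbered grayscale NDVI frames for a timelapse or movie.
for i, path in enumerate(sorted(glob.glob("photos/*.jpg"))):
    img = np.asarray(Image.open(path), dtype=float)
    nir, blue = img[:, :, 0], img[:, :, 2]      # assumes the infrablue channel layout
    ndvi = (nir - blue) / (nir + blue + 1e-6)
    gray = ((ndvi + 1.0) / 2.0 * 255.0).astype(np.uint8)
    Image.fromarray(gray).save("ndvi_frame_%04d.png" % i)
```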
Note: Older versions of this page have been kept at the following wiki page: http://publiclab.org/wiki/near-infrared-camera-history