Public Lab is an open community which collaboratively develops accessible, open source, Do-It-Yourself technologies for investigating local environmental health and justice issues.
Revision 104 (current) | warren | August 31, 2016
The Infragram Kickstarter video, a great introduction to the project.

Introduction

Vineyards, large farms, and NASA all use near-infrared photography to assess plant health, usually by mounting expensive sensors on airplanes and satellites. At Public Lab, we've developed a Do-It-Yourself way to take these kinds of photos, enabling us to monitor our environment through quantifiable data. Our technique uses a modified digital camera to capture near-infrared and blue light in the same image, but in different color channels. We then post-process the image (using Infragram.org) to estimate how much the vegetation in it is photosynthesizing. This lets us better understand and quantify how much of the available light plants are metabolizing into sugar via photosynthesis.
We ran a Kickstarter for a version of this camera we call the Infragram. Read more about it here » Here's the video from the Kickstarter, which offers a nice visual explanation of the technique:

What is it good for?

Multispectral or infrared/visible photography has seen a variety of applications in the decades since it was developed. We have focused on the following uses:
Notable uses include this photograph of an unidentified plume of material in the Gowanus Canal (and a writeup by TechPresident) and a variety of projects at a small farm in New Hampshire at the annual iFarm event. The Louisiana Universities Marine Consortium has also collaborated with Public Lab contributors to measure wetlands loss following the Deepwater Horizon oil disaster. Here's an example of what one of our "Infragram" cameras sees (left) and the post-processing analysis, which shows photosynthetic activity, or plant health (right). This picture was taken from a commercial airplane flight:

How does it work?

Camera modification: We've worked on several different techniques, from dual-camera systems to the current single-camera technique. This involves removing the infrared-blocking filter from almost any digital camera and adding a specific blue filter: a piece of carefully chosen "NGB" or "infrablue" filter material that blocks red light so the sensor measures infrared light in its place. Read more about the development of this technique here. You can also learn more about how digital camera image sensors detect colors in this great tutorial by Bigshot.

Post-processing: Once you take a multispectral photograph with a modified camera, you must post-process it, compositing the infrared and visible data to generate a new image which (if it works) displays healthy, photosynthetically active areas as bright regions. An in-depth article on the technique by Chris Fastie (albeit using red instead of blue for visible light) can be found here.

History of the project: While we used to use a two-camera system, research by Chris Fastie and other Public Lab contributors has led to the use of a single camera which can image in both infrared and visible light simultaneously. The infrablue filter is just a piece of carefully chosen theater gel which was examined using a DIY spectrometer.
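The post-processing step described above boils down to per-pixel channel arithmetic. As a rough sketch (this is not the Infragram.org implementation; the function name and the toy pixel values are our own), an NDVI-like index can be computed from an infrablue photo with NumPy, treating the red channel as near-infrared and the blue channel as visible light:

```python
import numpy as np

def infrablue_ndvi(image):
    """Compute an NDVI-like index from an 'infrablue' photo.

    Assumes a float array of shape (H, W, 3) with values in [0, 1],
    where the red channel recorded near-infrared light (because the
    camera's IR-blocking filter was removed and a blue filter added)
    and the blue channel recorded visible blue light.
    """
    nir = image[..., 0].astype(float)   # red channel holds NIR
    vis = image[..., 2].astype(float)   # blue channel holds visible light
    denom = nir + vis
    denom[denom == 0] = 1e-6            # avoid division by zero
    return (nir - vis) / denom          # values fall in [-1, 1]

# Toy 1x2 image: a "leafy" pixel (high NIR, low blue) and a "rocky" one
img = np.array([[[0.8, 0.0, 0.2], [0.3, 0.0, 0.3]]])
ndvi = infrablue_ndvi(img)              # leafy pixel scores 0.6, rocky 0.0
```

Bright regions in the resulting index image correspond to pixels where the camera saw much more near-infrared than visible light, which (if the filter and exposure behaved) indicates photosynthetically active vegetation.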
You can use this filter to turn most webcams or cheap point-and-shoots into an infrared/visible camera.

Background: satellite infrared imaging

The study of Earth's environment from space got its start in 1972, when the first Landsat satellite was launched. The multispectral scanner it carried, like the scanners on all subsequent Landsat satellites, recorded images with both visible and near-infrared light. Remote sensing scientists quickly learned that by combining visible and infrared data, they could reveal critical information about the health of vegetation. For example, the normalized difference vegetation index (NDVI) highlights the difference between the red and infrared wavelengths reflected from vegetation. Because red light is used by plants for photosynthesis but infrared light is not, NDVI allows scientists to estimate the amount of healthy foliage in every satellite image. Thousands of scientists, including landscape ecologists, global change biologists, and habitat specialists, have relied on these valuable satellite-based NDVI images for decades. There are public sources of infrared photography for the US available through the Department of Agriculture -- NAIP and VegScape -- but this imagery is not collected when, as often, or at a usable scale for individuals who are managing small plots.
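To make the NDVI definition above concrete, here is a small worked example. The reflectance values are illustrative, typical-magnitude numbers we made up for the sketch, not measured data; healthy foliage reflects far more near-infrared than red light, so its index lands well above zero:

```python
def ndvi(nir, red):
    """Normalized difference vegetation index: (NIR - red) / (NIR + red)."""
    return (nir - red) / (nir + red)

# Illustrative reflectance fractions (0-1):
leaf = ndvi(nir=0.50, red=0.08)  # strong NIR, little red -> NDVI ~ 0.72
rock = ndvi(nir=0.30, red=0.25)  # NIR and red similar -> NDVI near zero
```

This is why, in the caption below, tree trunks and rocks score near zero while healthy plants fall roughly between 0.1 and 0.9.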
Caption: Normal color photo (top) and normalized difference vegetation index (NDVI) image (bottom). The NDVI image was derived from two color channels in a single photo taken with a camera modified with a special infrared filter. Note that tree trunks, brown grass, and rocks have very low NDVI values because they are not photosynthetic. Healthy plants typically have NDVI values between 0.1 and 0.9. Images by Chris Fastie. Visit the gallery of high-res images by Chris Fastie.

Frequently Asked Questions

Ask a question about infrared imaging: [notes:question:infragram]

How to process your images

(This section has moved to and is updated at http://publiclab.org/wiki/near-infrared-imaging.) We're working on an easy process to generate composite, infrared + visible images that will reveal new details of plant health and photosynthesis. There are several approaches:
Note: Older versions of this page have been kept at the following wiki page: http://publiclab.org/wiki/near-infrared-camera-history
103 | warren |
August 26, 2016 17:16
| about 8 years ago
The Infragram Kickstarter video, a great introduction to the project. IntroductionVineyards, large farms, and NASA all use near-infrared photography for assessing plant health, usually by mounting expensive sensors on airplanes and satellites. At Public Lab, we've developed a Do-It-Yourself way to take these kinds of photos, enabling us to monitor our environment through quantifiable data. Our technique uses a modified digital camera to capture near-infrared and blue light in the same image, but in different color channels. We then post-process the image (using Infragram.org) to attempt to infer how much it is photosynthesizing. This allows us to better understand and quantify how much of the available light plants are metabolizing into sugar via photosynthesis.
We ran a Kickstarter for a version of this camera we call the Infragram. Read more about it here » Here's the video from the Kickstarter, which offers a nice visual explanation of the technique: What is it good for?Multispectral or infrared/visible photography has seen a variety of applications in the decades since it was developed. We have focused on the following uses:
Notable uses include this photograph of an unidentified plume of material in the Gowanus Canal (and writeup by TechPresident) and a variety of projects at a small farm in New Hampshire at the annual iFarm event. The Louisiana Universities Marine Consortium has also collaborated with Public Lab contributors to measure wetlands loss following the Deepwater Horizon oil disaster. Here's an example of what one of our "Infragram" cameras sees (left) and the post-processing analysis which shows photosynthetic activity, or plant health (right). This picture was taken from a commercial airplane flight: How does it work?Camera modification: We've worked on several different techniques, from dual camera systems to the current, single-camera technique. This involves removing the infrared-blocking filter from almost any digital camera, and adding a specific blue filter. This filters out the red light, and measures infrared light in its place using a piece of carefully chosen "NGB" or "infrablue" filter. Read more about the development of this technique here. You can also learn more about how digital camera image sensors detect colors at this great tutorial by Bigshot. Post-processing: Once you take a multispectral photograph with a modified camera, you must post-process it, compositing the infrared and visible data to generate a new image which (if it works) displays healthy, photosynthetically active areas as bright regions. An in-depth article on the technique by Chris Fastie (albeit using red instead of blue for visible light) can be found here. History of the project: While we used to use a two-camera system, research by Chris Fastie and other Public Lab contributors have led to the use of a single camera which can image in both infrared and visible light simultaneously. The Infrablue filter is just a piece of carefully chosen theater gel which was examined using a DIY spectrometer. 
You can use this filter to turn most webcams or cheap point-and-shoots into an infrared/visible camera. Background: satellite infrared imagingThe study of Earth's environment from space got its start in 1972 when the first Landsat satellite was launched. The multispectral scanner it carried, like the scanners on all subsequent Landsat satellites, recorded images with both visible and near infrared light. Remote sensing "scientists" quickly learned that by combining visible and infrared data, they could reveal critical information about the health of vegetation. For example, the normalized difference vegetation index (NDVI) highlights the difference between the red and infrared wavelengths that are reflected from vegetation. Because red light is used by plants for photosynthesis but infrared light is not, NDVI allows "scientists" to estimate the amount of healthy foliage in every satellite image. Thousands of "scientists", including landscape ecologists, global change biologists, and habitat specialists have relied on these valuable satellite-based NDVI images for decades. There are public sources of infrared photography for the US available through the Department of Agriculture -- NAIP and Vegscape -- but this imagery is not collected when, as often, or at useable scale for individuals who are managing small plots.
Caption: Normal color photo (top) and normalized difference vegetation index (NDVI) image. NDVI image was derived from two color channels in a single photo taken with a camera modified with a special infrared filter. Note that tree trunks, brown grass, and rocks have very low NDVI values because they are not photosynthetic. Healthy plants typically have NDVI values between 0.1 and 0.9. Images by Chris Fastie. Visit the gallery of high-res images by Chris Fastie Frequently Asked Questions[notes:question:infrared] How to process your images(this section is moved to and updated at http://publiclab.org/wiki/near-infrared-imaging) We're working on an easy process to generate composite, infrared + visible images that will reveal new details of plant health and photosynthesis. There are several approaches:
Note: Older versions of this page have been kept at the following wiki page: http://publiclab.org/wiki/near-infrared-camera-history |
Revert | |
102 | liz |
August 08, 2016 15:34
| over 8 years ago
The Infragram Kickstarter video, a great introduction to the project. IntroductionVineyards, large farms, and NASA all use near-infrared photography for assessing plant health, usually by mounting expensive sensors on airplanes and satellites. At Public Lab, we've developed a Do-It-Yourself way to take these kinds of photos, enabling us to monitor our environment through quantifiable data. Our technique uses a modified digital camera to capture near-infrared and blue light in the same image, but in different color channels. We then post-process the image (using Infragram.org) to attempt to infer how much it is photosynthesizing. This allows us to better understand and quantify how much of the available light plants are metabolizing into sugar via photosynthesis.
We ran a Kickstarter for a version of this camera we call the Infragram. Read more about it here » Here's the video from the Kickstarter, which offers a nice visual explanation of the technique: What is it good for?Multispectral or infrared/visible photography has seen a variety of applications in the decades since it was developed. We have focused on the following uses:
Notable uses include this photograph of an unidentified plume of material in the Gowanus Canal (and writeup by TechPresident) and a variety of projects at a small farm in New Hampshire at the annual iFarm event. The Louisiana Universities Marine Consortium has also collaborated with Public Lab contributors to measure wetlands loss following the Deepwater Horizon oil disaster. Here's an example of what one of our "Infragram" cameras sees (left) and the post-processing analysis which shows photosynthetic activity, or plant health (right). This picture was taken from a commercial airplane flight: How does it work?Camera modification: We've worked on several different techniques, from dual camera systems to the current, single-camera technique. This involves removing the infrared-blocking filter from almost any digital camera, and adding a specific blue filter. This filters out the red light, and measures infrared light in its place using a piece of carefully chosen "NGB" or "infrablue" filter. Read more about the development of this technique here. You can also learn more about how digital camera image sensors detect colors at this great tutorial by Bigshot. Post-processing: Once you take a multispectral photograph with a modified camera, you must post-process it, compositing the infrared and visible data to generate a new image which (if it works) displays healthy, photosynthetically active areas as bright regions. An in-depth article on the technique by Chris Fastie (albeit using red instead of blue for visible light) can be found here. History of the project: While we used to use a two-camera system, research by Chris Fastie and other Public Lab contributors have led to the use of a single camera which can image in both infrared and visible light simultaneously. The Infrablue filter is just a piece of carefully chosen theater gel which was examined using a DIY spectrometer. 
You can use this filter to turn most webcams or cheap point-and-shoots into an infrared/visible camera. Background: satellite infrared imagingThe study of Earth's environment from space got its start in 1972 when the first Landsat satellite was launched. The multispectral scanner it carried, like the scanners on all subsequent Landsat satellites, recorded images with both visible and near infrared light. Remote sensing "scientists" quickly learned that by combining visible and infrared data, they could reveal critical information about the health of vegetation. For example, the normalized difference vegetation index (NDVI) highlights the difference between the red and infrared wavelengths that are reflected from vegetation. Because red light is used by plants for photosynthesis but infrared light is not, NDVI allows "scientists" to estimate the amount of healthy foliage in every satellite image. Thousands of "scientists", including landscape ecologists, global change biologists, and habitat specialists have relied on these valuable satellite-based NDVI images for decades. There are public sources of infrared photography for the US available through the Department of Agriculture -- NAIP and Vegscape -- but this imagery is not collected when, as often, or at useable scale for individuals who are managing small plots.
Caption: Normal color photo (top) and normalized difference vegetation index (NDVI) image. NDVI image was derived from two color channels in a single photo taken with a camera modified with a special infrared filter. Note that tree trunks, brown grass, and rocks have very low NDVI values because they are not photosynthetic. Healthy plants typically have NDVI values between 0.1 and 0.9. Images by Chris Fastie. Visit the gallery of high-res images by Chris Fastie How to process your images: (this section is moved to and updated at http://publiclab.org/wiki/near-infrared-imaging)We're working on an easy process to generate composite, infrared + visible images that will reveal new details of plant health and photosynthesis. There are several approaches:
Note: Older versions of this page have been kept at the following wiki page: http://publiclab.org/wiki/near-infrared-camera-history |
Revert | |
101 | warren |
June 25, 2015 17:21
| over 9 years ago
The Infragram Kickstarter video, a great introduction to the project. IntroductionVineyards, large farms, and NASA all use near-infrared photography for assessing plant health, usually by mounting expensive sensors on airplanes and satellites. At Public Lab, we've developed a Do-It-Yourself way to take these kinds of photos, enabling us to monitor our environment through quantifiable data. Our technique uses a modified digital camera to capture near-infrared and blue light in the same image, but in different color channels. We then post-process the image (using Infragram.org) to attempt to infer how much it is photosynthesizing. This allows us to better understand and quantify how much of the available light plants are metabolizing into sugar via photosynthesis.
We ran a Kickstarter for a version of this camera we call the Infragram. Read more about it here » Here's the video from the Kickstarter, which offers a nice visual explanation of the technique: What is it good for?Multispectral or infrared/visible photography has seen a variety of applications in the decades since it was developed. We have focused on the following uses:
Notable uses include this photograph of an unidentified plume of material in the Gowanus Canal (and writeup by TechPresident) and a variety of projects at a small farm in New Hampshire at the annual iFarm event. The Louisiana Universities Marine Consortium has also collaborated with Public Lab contributors to measure wetlands loss following the Deepwater Horizon oil disaster. Here's an example of what one of our "Infragram" cameras sees (left) and the post-processing analysis which shows photosynthetic activity, or plant health (right). This picture was taken from a commercial airplane flight: How does it work?Camera modification: We've worked on several different techniques, from dual camera systems to the current, single-camera technique. This involves removing the infrared-blocking filter from almost any digital camera, and adding a specific blue filter. This filters out the red light, and measures infrared light in its place using a piece of carefully chosen "NGB" or "infrablue" filter. Read more about the development of this technique here. You can also learn more about how digital camera image sensors detect colors at this great tutorial by Bigshot. Post-processing: Once you take a multispectral photograph with a modified camera, you must post-process it, compositing the infrared and visible data to generate a new image which (if it works) displays healthy, photosynthetically active areas as bright regions. An in-depth article on the technique by Chris Fastie (albeit using red instead of blue for visible light) can be found here. History of the project: While we used to use a two-camera system, research by Chris Fastie and other Public Lab contributors have led to the use of a single camera which can image in both infrared and visible light simultaneously. The Infrablue filter is just a piece of carefully chosen theater gel which was examined using a DIY spectrometer. 
You can use this filter to turn most webcams or cheap point-and-shoots into an infrared/visible camera. Background: satellite infrared imagingThe study of Earth's environment from space got its start in 1972 when the first Landsat satellite was launched. The multispectral scanner it carried, like the scanners on all subsequent Landsat satellites, recorded images with both visible and near infrared light. Remote sensing "scientists" quickly learned that by combining visible and infrared data, they could reveal critical information about the health of vegetation. For example, the normalized difference vegetation index (NDVI) highlights the difference between the red and infrared wavelengths that are reflected from vegetation. Because red light is used by plants for photosynthesis but infrared light is not, NDVI allows "scientists" to estimate the amount of healthy foliage in every satellite image. Thousands of "scientists", including landscape ecologists, global change biologists, and habitat specialists have relied on these valuable satellite-based NDVI images for decades. There are public sources of infrared photography for the US available through the Department of Agriculture -- NAIP and Vegscape -- but this imagery is not collected when, as often, or at useable scale for individuals who are managing small plots.
Caption: Normal color photo (top) and normalized difference vegetation index (NDVI) image. NDVI image was derived from two color channels in a single photo taken with a camera modified with a special infrared filter. Note that tree trunks, brown grass, and rocks have very low NDVI values because they are not photosynthetic. Healthy plants typically have NDVI values between 0.1 and 0.9. Images by Chris Fastie. Visit the gallery of high-res images by Chris Fastie How to process your images:We're working on an easy process to generate composite, infrared + visible images that will reveal new details of plant health and photosynthesis. There are several approaches:
Note: Older versions of this page have been kept at the following wiki page: http://publiclab.org/wiki/near-infrared-camera-history |
Revert | |
100 | Shannon |
September 03, 2014 21:11
| about 10 years ago
The Infragram Kickstarter video, a great introduction to the project. IntroductionVineyards, large farms, and NASA all use near-infrared photography for assessing plant health, usually by mounting expensive sensors on airplanes and satellites. At Public Lab, we've developed a Do-It-Yourself way to take these kinds of photos, enabling us to monitor our environment through quantifiable data. Our technique uses a modified digital camera to capture near-infrared and blue light in the same image, but in different color channels. We then post-process the image to attempt to infer how much it is photosynthesizing. This allows us to better understand and quantify how much of the available light plants are metabolizing into sugar via photosynthesis.
We ran a Kickstarter for a version of this camera we call the Infragram. Read more about it here » Here's the video from the Kickstarter, which offers a nice visual explanation of the technique: What is it good for?Multispectral or infrared/visible photography has seen a variety of applications in the decades since it was developed. We have focused on the following uses:
Notable uses include this photograph of an unidentified plume of material in the Gowanus Canal (and writeup by TechPresident) and a variety of projects at a small farm in New Hampshire at the annual iFarm event. The Louisiana Universities Marine Consortium has also collaborated with Public Lab contributors to measure wetlands loss following the Deepwater Horizon oil disaster. Here's an example of what one of our "Infragram" cameras sees (left) and the post-processing analysis which shows photosynthetic activity, or plant health (right). This picture was taken from a commercial airplane flight: How does it work?Camera modification: We've worked on several different techniques, from dual camera systems to the current, single-camera technique. This involves removing the infrared-blocking filter from almost any digital camera, and adding a specific blue filter. This filters out the red light, and measures infrared light in its place using a piece of carefully chosen "NGB" or "infrablue" filter. Read more about the development of this technique here. You can also learn more about how digital camera image sensors detect colors at this great tutorial by Bigshot. Post-processing: Once you take a multispectral photograph with a modified camera, you must post-process it, compositing the infrared and visible data to generate a new image which (if it works) displays healthy, photosynthetically active areas as bright regions. An in-depth article on the technique by Chris Fastie (albeit using red instead of blue for visible light) can be found here. History of the project: While we used to use a two-camera system, research by Chris Fastie and other Public Lab contributors have led to the use of a single camera which can image in both infrared and visible light simultaneously. The Infrablue filter is just a piece of carefully chosen theater gel which was examined using a DIY spectrometer. 
You can use this filter to turn most webcams or cheap point-and-shoots into an infrared/visible camera. Background: satellite infrared imagingThe study of Earth's environment from space got its start in 1972 when the first Landsat satellite was launched. The multispectral scanner it carried, like the scanners on all subsequent Landsat satellites, recorded images with both visible and near infrared light. Remote sensing scientists quickly learned that by combining visible and infrared data, they could reveal critical information about the health of vegetation. For example, the normalized difference vegetation index (NDVI) highlights the difference between the red and infrared wavelengths that are reflected from vegetation. Because red light is used by plants for photosynthesis but infrared light is not, NDVI allows scientists to estimate the amount of healthy foliage in every satellite image. Thousands of scientists, including landscape ecologists, global change biologists, and habitat specialists have relied on these valuable satellite-based NDVI images for decades. There are public sources of infrared photography for the US available through the Department of Agriculture -- NAIP and Vegscape -- but this imagery is not collected when, as often, or at useable scale for individuals who are managing small plots.
Caption: Normal color photo (top) and normalized difference vegetation index (NDVI) image. NDVI image was derived from two color channels in a single photo taken with a camera modified with a special infrared filter. Note that tree trunks, brown grass, and rocks have very low NDVI values because they are not photosynthetic. Healthy plants typically have NDVI values between 0.1 and 0.9. Images by Chris Fastie. Visit the gallery of high-res images by Chris Fastie How to process your images:We're working on an easy process to generate composite, infrared + visible images that will reveal new details of plant health and photosynthesis. There are several approaches:
Note: Older versions of this page have been kept at the following wiki page: http://publiclab.org/wiki/near-infrared-camera-history |
Revert | |
99 | gonzoearth |
May 05, 2014 18:28
| over 10 years ago
The Infragram Kickstarter video, a great introduction to the project. IntroductionVineyards, large farms, and NASA all use near-infrared photography for assessing plant health, usually by mounting expensive sensors on airplanes and satellites. At Public Lab, we've developed a Do-It-Yourself way to take these kinds of photos, enabling us to monitor our environment through quantifiable data. Our technique uses a modified digital camera to capture near-infrared and blue light in the same image, but in different color channels. We then post-process the image to attempt to infer how much it is photosynthesizing. This allows us to better understand and quantify how much of the available light plants are metabolizing into sugar via photosynthesis.
We recently ran a Kickstarter for a version of this camera we call the Infragram. Read more about it here » Here's the video from the Kickstarter, which offers a nice visual explanation of the technique: What is it good for?Multispectral or infrared/visible photography has seen a variety of applications in the decades since it was developed. We have focused on the following uses:
Notable uses include this photograph of an unidentified plume of material in the Gowanus Canal (and writeup by TechPresident) and a variety of projects at a small farm in New Hampshire at the annual iFarm event. The Louisiana Universities Marine Consortium has also collaborated with Public Lab contributors to measure wetlands loss following the Deepwater Horizon oil disaster. Here's an example of what one of our "Infragram" cameras sees (left) and the post-processing analysis which shows photosynthetic activity, or plant health (right). This picture was taken from a commercial airplane flight: How does it work?Camera modification: We've worked on several different techniques, from dual camera systems to the current, single-camera technique. This involves removing the infrared-blocking filter from almost any digital camera, and adding a specific blue filter. This filters out the red light, and measures infrared light in its place using a piece of carefully chosen "NGB" or "infrablue" filter. Read more about the development of this technique here. You can also learn more about how digital camera image sensors detect colors at this great tutorial by Bigshot. Post-processing: Once you take a multispectral photograph with a modified camera, you must post-process it, compositing the infrared and visible data to generate a new image which (if it works) displays healthy, photosynthetically active areas as bright regions. An in-depth article on the technique by Chris Fastie (albeit using red instead of blue for visible light) can be found here. History of the project: While we used to use a two-camera system, research by Chris Fastie and other Public Lab contributors have led to the use of a single camera which can image in both infrared and visible light simultaneously. The Infrablue filter is just a piece of carefully chosen theater gel which was examined using a DIY spectrometer. 
You can use this filter to turn most webcams or cheap point-and-shoots into an infrared/visible camera. Background: satellite infrared imagingThe study of Earth's environment from space got its start in 1972 when the first Landsat satellite was launched. The multispectral scanner it carried, like the scanners on all subsequent Landsat satellites, recorded images with both visible and near infrared light. Remote sensing scientists quickly learned that by combining visible and infrared data, they could reveal critical information about the health of vegetation. For example, the normalized difference vegetation index (NDVI) highlights the difference between the red and infrared wavelengths that are reflected from vegetation. Because red light is used by plants for photosynthesis but infrared light is not, NDVI allows scientists to estimate the amount of healthy foliage in every satellite image. Thousands of scientists, including landscape ecologists, global change biologists, and habitat specialists have relied on these valuable satellite-based NDVI images for decades. There are public sources of infrared photography for the US available through the Department of Agriculture -- NAIP and Vegscape -- but this imagery is not collected when, as often, or at useable scale for individuals who are managing small plots.
Caption: Normal color photo (top) and normalized difference vegetation index (NDVI) image. NDVI image was derived from two color channels in a single photo taken with a camera modified with a special infrared filter. Note that tree trunks, brown grass, and rocks have very low NDVI values because they are not photosynthetic. Healthy plants typically have NDVI values between 0.1 and 0.9. Images by Chris Fastie. Visit the gallery of high-res images by Chris Fastie How to process your images:We're working on an easy process to generate composite, infrared + visible images that will reveal new details of plant health and photosynthesis. There are several approaches:
Note: Older versions of this page have been kept at the following wiki page: http://publiclab.org/wiki/near-infrared-camera-history
How to process your images:We're working on an easy process to generate composite, infrared + visible images that will reveal new details of plant health and photosynthesis. There are several approaches:
Note: Older versions of this page have been kept at the following wiki page: http://publiclab.org/wiki/near-infrared-camera-history |
Revert | |
91 | warren |
August 06, 2013 21:18
| over 11 years ago
IntroductionVineyards, large farms, and NASA all use near-infrared photography for assessing plant health, usually by mounting expensive sensors on airplanes and satellites. At Public Lab, we've developed a Do-It-Yourself way to bring this technology to everyday people, enabling us to monitor our environment through quantifiable data. We are currently running a Kickstarter for a version of this camera we call the Infragram. Read more about it here » What is it good for?
Here's an example of what one of our "Infragram" cameras sees (left) and the post-processing analysis which shows photosynthetic activity, or plant health (right). This picture was taken from a commercial airplane flight: Background: satellite infrared imagingThe study of Earth's environment from space got its start in 1972 when the first Landsat satellite was launched. The multispectral scanner it carried, like the scanners on all subsequent Landsat satellites, recorded images with both visible and near infrared light. Remote sensing scientists quickly learned that by combining visible and infrared data, they could reveal critical information about the health of vegetation. For example, the normalized difference vegetation index (NDVI) highlights the difference between the red and infrared wavelengths that are reflected from vegetation. Because red light is used by plants for photosynthesis but infrared light is not, NDVI allows scientists to estimate the amount of healthy foliage in every satellite image. Thousands of scientists, including landscape ecologists, global change biologists, and habitat specialists have relied on these valuable satellite-based NDVI images for decades. There are public sources of infrared photography for the US available through the Department of Agriculture -- NAIP and Vegscape -- but this imagery is not collected when, as often, or at useable scale for individuals who are managing small plots.
Caption: Normal color photo (top) and normalized difference vegetation index (NDVI) image. NDVI image was derived from two color channels in a single photo taken with a camera modified with a special infrared filter. Note that tree trunks, brown grass, and rocks have very low NDVI values because they are not photosynthetic. Healthy plants typically have NDVI values between 0.1 and 0.9. Images by Chris Fastie. Visit the gallery of high-res images by Chris Fastie Point & shoot infrared photographyThe goal of Public Lab's Infragram project is to bring the power of NDVI and other infrared vegetation imaging back to earth where everyone can now take close-up images of plants or landscapes and instantly learn about their health and vigor. Chris Fastie's infrared/visible camera prototype We are able to tweak a single camera to capture near-infrared, green, and blue light. This allows us to try to understand and quantify how much of the available light plants are metabolizing into sugar via photosynthesis. We do this by filtering out the red light, and reading infrared in its place using a piece of carefully chosen "NGB" filter. Read more about the development of this technique here. You can also learn more about how digital camera image sensors detect colors at this great tutorial by Bigshot. How we do itBasically, we remove the infrared blocking filter from a conventional digital camera and replace it with a carefully chosen "infrablue" filter. This lets the camera read infrared and visible light at the same time, but in different color channels. While we used to use a two-camera system, research by Chris Fastie and other Public Lab contributors have led to the use of a single camera which can image in both infrared and visible light simultaneously. The Infrablue filter is just a piece of carefully chosen theater gel which was examined using a DIY spectrometer. You can use this filter to turn most webcams or cheap point-and-shoots into an infrared/visible camera. 
How to process your images:We're working on an easy process to generate composite, infrared + visible images that will reveal new details of plant health and photosynthesis. Currently there are several approaches:
Processing overviewWe're working on a cleaner, easier way to process images, and hope to have a web app up soon which will work something like this:
Note: Older versions of this page have been kept at the following wiki page: http://publiclab.org/wiki/near-infrared-camera-history |
Revert | |
90 | cfastie |
July 20, 2013 17:17
| over 11 years ago
IntroductionVineyards, large farms, and NASA all use near-infrared photography for assessing plant health, usually by mounting expensive sensors on airplanes and satellites. At Public Lab, we've developed a Do-It-Yourself way to bring this technology to everyday people, enabling us to monitor our environment through quantifiable data. We are currently running a Kickstarter for a version of this camera we call the Infragram. Read more about it here » What is it good for?
Here's an example of what one of our "Infragram" cameras sees (left) and the post-processing analysis which shows photosynthetic activity, or plant health (right). This picture was taken from a commercial airplane flight: Background: satellite infrared imagingThe study of Earth's environment from space got its start in 1972 when the first Landsat satellite was launched. The multispectral scanner it carried, like the scanners on all subsequent Landsat satellites, recorded images with both visible and near infrared light. Remote sensing scientists quickly learned that by combining visible and infrared data, they could reveal critical information about the health of vegetation. For example, the normalized difference vegetation index (NDVI) highlights the difference between the red and infrared wavelengths that are reflected from vegetation. Because red light is used by plants for photosynthesis but infrared light is not, NDVI allows scientists to estimate the amount of healthy foliage in every satellite image. Thousands of scientists, including landscape ecologists, global change biologists, and habitat specialists have relied on these valuable satellite-based NDVI images for decades. There are public sources of infrared photography for the US available through the Department of Agriculture -- NAIP and Vegscape -- but this imagery is not collected when, as often, or at useable scale for individuals who are managing small plots.
Caption: Normal color photo (top) and normalized difference vegetation index (NDVI) image. NDVI image was derived from two color channels in a single photo taken with a camera modified with a special infrared filter. Note that tree trunks, brown grass, and rocks have very low NDVI values because they are not photosynthetic. Healthy plants typically have NDVI values between 0.1 and 0.9. Images by Chris Fastie. Visit the gallery of high-res images by Chris Fastie Point & shoot infrared photographyThe goal of Public Lab's Infragram project is to bring the power of NDVI and other infrared vegetation imaging back to earth where everyone can now take close-up images of plants or landscapes and instantly learn about their health and vigor. Chris Fastie's infrared/visible camera prototype We are able to tweak a single camera to capture near-infrared, green, and blue light. This allows us to photograph the "secret life of plants". We do this by filtering out the red light, and reading infrared in its place using a piece of carefully chosen "NGB" filter. Read more about the development of this technique here. How we do itBasically, we remove the infrared blocking filter from a conventional digital camera and replace it with a carefully chosen "infrablue" filter. This lets the camera read infrared and visible light at the same time, but in different color channels. While we used to use a two-camera system, research by Chris Fastie and other Public Lab contributors have led to the use of a single camera which can image in both infrared and visible light simultaneously. The Infrablue filter is just a piece of carefully chosen theater gel which was examined using a DIY spectrometer. You can use this filter to turn most webcams or cheap point-and-shoots into an infrared/visible camera. How to process your images:We're working on an easy process to generate composite, infrared + visible images that will reveal new details of plant health and photosynthesis. 
Currently there are several approaches:
Processing overviewWe're working on a cleaner, easier way to process images, and hope to have a web app up soon which will work something like this:
Note: Older versions of this page have been kept at the following wiki page: http://publiclab.org/wiki/near-infrared-camera-history |
Revert | |
89 | warren |
July 02, 2013 23:32
| over 11 years ago
IntroductionVineyards, large farms, and NASA all use near-infrared photography for assessing plant health, usually by mounting expensive sensors on airplanes and satellites. At Public Lab, we've developed a Do-It-Yourself way to bring this technology to everyday people, enabling us to monitor our environment through quantifiable data. We are currently running a Kickstarter for a version of this camera we call the Infragram. Read more about it here » What is it good for?
Here's an example of what one of our "Infragram" cameras sees (left) and the post-processing analysis which shows photosynthetic activity, or plant health (right). This picture was taken from a commercial airplane flight: Background: satellite infrared imagingThe study of Earth's environment from space got its start in 1972 when the first Landsat satellite was launched. The multispectral scanner it carried, like the scanners on all subsequent Landsat satellites, recorded images with both visible and near infrared light. Remote sensing scientists quickly learned that by combining visible and infrared data, they could reveal critical information about the health of vegetation. For example, the normalized difference vegetation index (NDVI) highlights the difference between the red and infrared wavelengths that are reflected from vegetation. Because red light is used by plants for photosynthesis but infrared light is not, NDVI allows scientists to estimate the amount of healthy foliage in every satellite image. Thousands of scientists, including landscape ecologists, global change biologists, and habitat specialists have relied on these valuable satellite-based NDVI images for decades. There are public sources of infrared photography for the US available through the Department of Agriculture -- NAIP and Vegscape -- but this imagery is not collected when, as often, or at useable scale for individuals who are managing small plots.
Caption: Normal color photo (top) and normalized difference vegetation index (NDVI) image. NDVI image was derived from two color channels in a single photo taken with a camera modified with a special infrared filter. Note that tree trunks, brown grass, and rocks have very low NDVI values because they are not photosynthetic. Healthy plants typically have NDVI values between 0.1 and 0.9. Images by Chris Fastie. Visit the gallery of high-res images by Chris Fastie Point & shoot infrared photographyThe goal of Public Lab's Infragram project is to bring the power of NDVI and other infrared vegetation imaging back to earth where everyone can now take close-up images of plants or landscapes and instantly learn about their health and vigor. Chris Fastie's infrared/visible camera prototype We are able to tweak a single camera to capture near-infrared, green, and blue light. This allows us to photograph the "secret life of plants". We do this by filtering out the red light, and reading infrared in its place using a piece of carefully chosen "NGB" filter. Read more about the development of this technique here. How we do itBasically, we remove the infrared blocking filter from a conventional digital camera and replace it with a carefully chosen "infrablue" filter. This lets the camera read infrared and visible light at the same time, but in different color channels. While we used to use a two-camera system, research by Chris Fastie and other Public Lab contributors have led to the use of a single camera which can image in both infrared and visible light simultaneously. The Infrablue filter is just a piece of carefully chosen theater gel which was examined using a DIY spectrometer. You can use this filter to turn most webcams or cheap point-and-shoots into an infrared/visible camera. How to process your images:We're working on an easy process to generate composite, infrared + visible images that will reveal new details of plant health and photosynthesis. 
Currently there are several approaches:
Processing overviewWe're working on a cleaner, easier way to process images, and hope to have a web app up soon which will work something like this:
Note: Older versions of this page have been kept at the following wiki page: http://publiclab.org/wiki/near-infrared-camera-history |
Revert | |
88 | warren |
July 02, 2013 23:30
| over 11 years ago
IntroductionVineyards, large farms, and NASA all use near-infrared photography for assessing plant health, usually by mounting expensive sensors on airplanes and satellites. At Public Lab, we've developed a Do-It-Yourself way to bring this technology to everyday people, enabling us to monitor our environment through quantifiable data. We are currently running a Kickstarter for a version of this camera we call the Infragram. Read more about it here » What is it good for?
Here's an example of what one of our "Infragram" cameras sees (left) and the post-processing analysis which shows photosynthetic activity, or plant health (right). This picture was taken from a commercial airplane flight: Background: satellite infrared imagingThe study of Earth's environment from space got its start in 1972 when the first Landsat satellite was launched. The multispectral scanner it carried, like the scanners on all subsequent Landsat satellites, recorded images with both visible and near infrared light. Remote sensing scientists quickly learned that by combining visible and infrared data, they could reveal critical information about the health of vegetation. For example, the normalized difference vegetation index (NDVI) highlights the difference between the red and infrared wavelengths that are reflected from vegetation. Because red light is used by plants for photosynthesis but infrared light is not, NDVI allows scientists to estimate the amount of healthy foliage in every satellite image. Thousands of scientists, including landscape ecologists, global change biologists, and habitat specialists have relied on these valuable satellite-based NDVI images for decades. There are public sources of infrared photography for the US available through the Department of Agriculture -- NAIP and Vegscape -- but this imagery is not collected when, as often, or at useable scale for individuals who are managing small plots.
Caption: Normal color photo (top) and normalized difference vegetation index (NDVI) image. NDVI image was derived from two color channels in a single photo taken with a camera modified with a special infrared filter. Note that tree trunks, brown grass, and rocks have very low NDVI values because they are not photosynthetic. Healthy plants typically have NDVI values between 0.1 and 0.9. Images by Chris Fastie. Visit the gallery of high-res images by Chris Fastie Point & shoot infrared photographyThe goal of Public Lab's Infragram project is to bring the power of NDVI and other infrared vegetation imaging back to earth where everyone can now take close-up images of plants or landscapes and instantly learn about their health and vigor. Chris Fastie's infrared/visible camera prototype We are able to tweak a single camera to capture near-infrared, green, and blue light. This allows us to photograph the "secret life of plants". We do this by filtering out the red light, and reading infrared in its place using a piece of carefully chosen "NGB" filter. Read more about the development of this technique here. How we do itBasically, we remove the infrared blocking filter from a conventional digital camera and replace it with a carefully chosen "infrablue" filter. This lets the camera read infrared and visible light at the same time, but in different color channels. While we used to use a two-camera system, research by Chris Fastie and other Public Lab contributors have led to the use of a single camera which can image in both infrared and visible light simultaneously. The Infrablue filter is just a piece of carefully chosen theater gel which was examined using a DIY spectrometer. You can use this filter to turn most webcams or cheap point-and-shoots into an infrared/visible camera. How to process your images:We're working on an easy process to generate composite, infrared + visible images that will reveal new details of plant health and photosynthesis. 
Currently there are several approaches:
Processing overviewWe're working on a cleaner, easier way to process images, and hope to have a web app up soon which will work something like this:
Note: Older versions of this page have been kept at the following wiki page: http://publiclab.org/wiki/near-infrared-camera-history |
Revert | |
87 | warren |
July 02, 2013 23:27
| over 11 years ago
IntroductionVineyards, large farms, and NASA all use near-infrared photography for assessing plant health, usually by mounting expensive sensors on airplanes and satellites. At Public Lab, we've developed a Do-It-Yourself way to bring this technology to everyday people, enabling us to monitor our environment through quantifiable data. We are currently running a Kickstarter for a version of this camera we call the Infragram. Read more about it here » What is it good for?
Here's an example of what one of our "Infragram" cameras sees (left) and the post-processing analysis which shows photosynthetic activity, or plant health (right). This picture was taken from a commercial airplane flight: Background: satellite infrared imagingThe study of Earth's environment from space got its start in 1972 when the first Landsat satellite was launched. The multispectral scanner it carried, like the scanners on all subsequent Landsat satellites, recorded images with both visible and near infrared light. Remote sensing scientists quickly learned that by combining visible and infrared data, they could reveal critical information about the health of vegetation. For example, the normalized difference vegetation index (NDVI) highlights the difference between the red and infrared wavelengths that are reflected from vegetation. Because red light is used by plants for photosynthesis but infrared light is not, NDVI allows scientists to estimate the amount of healthy foliage in every satellite image. Thousands of scientists, including landscape ecologists, global change biologists, and habitat specialists have relied on these valuable satellite-based NDVI images for decades. There are public sources of infrared photography for the US available through the Department of Agriculture -- NAIP and Vegscape -- but this imagery is not collected when, as often, or at useable scale for individuals who are managing small plots.
Caption: Normal color photo (top) and normalized difference vegetation index (NDVI) image. NDVI image was derived from two color channels in a single photo taken with a camera modified with a special infrared filter. Note that tree trunks, brown grass, and rocks have very low NDVI values because they are not photosynthetic. Healthy plants typically have NDVI values between 0.1 and 0.9. Images by Chris Fastie. Visit the gallery of high-res images by Chris Fastie Point & shoot infrared photographyThe goal of Public Lab's Infragram project is to bring the power of NDVI and other infrared vegetation imaging back to earth where everyone can now take close-up images of plants or landscapes and instantly learn about their health and vigor. Chris Fastie's infrared/visible camera prototype We are able to tweak a single camera to capture near-infrared, green, and blue light. This allows us to photograph the "secret life of plants". We do this by filtering out the red light, and reading infrared in its place using a piece of carefully chosen "NGB" filter. Read more about the development of this technique here. How we do itResearch by Chris Fastie and other Public Lab contributors have led to the use of a single camera which can image in both infrared and visible light simultaneously. The filter is just a piece of carefully chosen theater gel which was examined using a DIY spectrometer. You can use this filter to turn your webcam or cheap point-and-shoot into an infrared camera. How to process your images:We're working on an easy process to generate composite, infrared + visible images that will reveal new details of plant health and photosynthesis. Currently there are several approaches:
Processing overviewWe're working on a cleaner, easier way to process images, and hope to have a web app up soon which will work something like this:
Note: Older versions of this page have been kept at the following wiki page: http://publiclab.org/wiki/near-infrared-camera-history |
Revert | |
86 | warren |
July 02, 2013 23:27
| over 11 years ago
IntroductionVineyards, large farms, and NASA all use near-infrared photography for assessing plant health, usually by mounting expensive sensors on airplanes and satellites. At Public Lab, we've developed a Do-It-Yourself way to bring this technology to everyday people, enabling us to monitor our environment through quantifiable data. We are currently running a Kickstarter for a version of this camera we call the Infragram. Read more about it here » What is it good for?
Here's an example of what one of our "Infragram" cameras sees (left) and the post-processing analysis which shows photosynthetic activity, or plant health (right). This picture was taken from a commercial airplane flight: Background: satellite infrared imagingThe study of Earth's environment from space got its start in 1972 when the first Landsat satellite was launched. The multispectral scanner it carried, like the scanners on all subsequent Landsat satellites, recorded images with both visible and near infrared light. Remote sensing scientists quickly learned that by combining visible and infrared data, they could reveal critical information about the health of vegetation. For example, the normalized difference vegetation index (NDVI) highlights the difference between the red and infrared wavelengths that are reflected from vegetation. Because red light is used by plants for photosynthesis but infrared light is not, NDVI allows scientists to estimate the amount of healthy foliage in every satellite image. Thousands of scientists, including landscape ecologists, global change biologists, and habitat specialists have relied on these valuable satellite-based NDVI images for decades. There are public sources of infrared photography for the US available through the Department of Agriculture -- NAIP and Vegscape -- but this imagery is not collected when, as often, or at useable scale for individuals who are managing small plots.
Caption: Normal color photo (top) and normalized difference vegetation index (NDVI) image. NDVI image was derived from two color channels in a single photo taken with a camera modified with a special infrared filter. Note that tree trunks, brown grass, and rocks have very low NDVI values because they are not photosynthetic. Healthy plants typically have NDVI values between 0.1 and 0.9. Images by Chris Fastie. Visit the gallery of high-res images by Chris Fastie Point & shoot infrared photographyThe goal of Public Lab's Infragram project is to bring the power of NDVI and other infrared vegetation imaging back to earth where everyone can now take close-up images of plants or landscapes and instantly learn about their health and vigor. Chris Fastie's infrared/visible camera prototype We are able to tweak a single camera to capture near-infrared, green, and blue light. This allows us to photograph the "secret life of plants". We do this by filtering out the red light, and reading infrared in its place using a piece of carefully chosen "NGB" filter. Read more about the development of this technique here. How we do itResearch by Chris Fastie and other Public Lab contributors have led to the use of a single camera which can image in both infrared and visible light simultaneously. The filter is just a piece of carefully chosen theater gel which was examined using a DIY spectrometer. You can use this filter to turn your webcam or cheap point-and-shoot into an infrared camera. How to process your images:We're working on an easy process to generate composite, infrared + visible images that will reveal new details of plant health and photosynthesis. Currently there are several approaches:
Processing overviewWe're working on a cleaner, easier way to process images, and hope to have a web app up soon which will work something like this:
Note: Older versions of this page have been kept at the following wiki page: http://publiclab.org/wiki/near-infrared-camera-history |
Revert | |
85 | warren |
July 02, 2013 23:26
| over 11 years ago
IntroductionVineyards, large farms, and NASA all use near-infrared photography for assessing plant health, usually by mounting expensive sensors on airplanes and satellites. At Public Lab, we've developed a Do-It-Yourself way to bring this technology to everyday people, enabling us to monitor our environment through quantifiable data. We are currently running a Kickstarter for a version of this camera we call the Infragram. Read more about it here » What is it good for?
Here's an example of what one of our "Infragram" cameras sees (left) and the post-processing analysis which shows photosynthetic activity, or plant health (right). This picture was taken from a commercial airplane flight: Background: satellite infrared imagingThe study of Earth's environment from space got its start in 1972 when the first Landsat satellite was launched. The multispectral scanner it carried, like the scanners on all subsequent Landsat satellites, recorded images with both visible and near infrared light. Remote sensing scientists quickly learned that by combining visible and infrared data, they could reveal critical information about the health of vegetation. For example, the normalized difference vegetation index (NDVI) highlights the difference between the red and infrared wavelengths that are reflected from vegetation. Because red light is used by plants for photosynthesis but infrared light is not, NDVI allows scientists to estimate the amount of healthy foliage in every satellite image. Thousands of scientists, including landscape ecologists, global change biologists, and habitat specialists have relied on these valuable satellite-based NDVI images for decades. There are public sources of infrared photography for the US available through the Department of Agriculture -- NAIP and Vegscape -- but this imagery is not collected when, as often, or at useable scale for individuals who are managing small plots.
Caption: Normal color photo (top) and normalized difference vegetation index (NDVI) image. NDVI image was derived from two color channels in a single photo taken with a camera modified with a special infrared filter. Note that tree trunks, brown grass, and rocks have very low NDVI values because they are not photosynthetic. Healthy plants typically have NDVI values between 0.1 and 0.9. Images by Chris Fastie. Visit the gallery of high-res images by Chris Fastie Point & shoot infrared photographyThe goal of Public Lab's Infragram project is to bring the power of NDVI and other infrared vegetation imaging back to earth where everyone can now take close-up images of plants or landscapes and instantly learn about their health and vigor. Chris Fastie's infrared/visible camera prototype We are able to tweak a single camera to capture near-infrared, green, and blue light. This allows us to photograph the "secret life of plants". We do this by filtering out the red light, and reading infrared in its place using a piece of carefully chosen "NGB" filter. Read more about the development of this technique here. How we do itResearch by Chris Fastie and other Public Lab contributors have led to the use of a single camera which can image in both infrared and visible light simultaneously. The filter is just a piece of carefully chosen theater gel which was examined using a DIY spectrometer. You can use this filter to turn your webcam or cheap point-and-shoot into an infrared camera. How to process your images:We're working on an easy process to generate composite, infrared + visible images that will reveal new details of plant health and photosynthesis. Currently there are several approaches:
Processing overviewWe're working on a cleaner, easier way to process images, and hope to have a web app up soon which will work something like this:
Note: Older versions of this page have been kept at the following wiki page: http://publiclab.org/wiki/near-infrared-camera-history |
Revert |