Question: Will Infragram help with Indoor Hydroponics/Aquaponics?


by ajawitz | January 15, 2014 17:51 | #9946


What I want to do

Monitor and control optimal light conditions for indoor plant growth using infrared imagery and an LED lighting controller.

Hydroponics, aquaponics, and other high-yield indoor cultivation techniques are becoming increasingly popular in urban areas and in northern regions with short growing seasons. "DIY R&D" efforts like those of our.windowfarms.org and HAPI in Ohio are two of the more successful projects marrying indoor cultivation techniques with DIY technology.

Lighting has always been the trickiest (and costliest) issue facing indoor growers, though advances in LED technology may be on the verge of significantly reducing this barrier. A cutting-edge artificial lighting technique known as "pinkhousing" operates under the theory that plant growth can be managed more effectively and efficiently through subtle combinations of red and blue lighting (eliminating green altogether). LED technology allows exactly such lighting interactions to be digitally programmed according to each plant's specific needs.

The problem is that so far, very little is known about what those needs are. This is where an infrared imaging tool like Infragram could come in really handy. One could envision a script that interprets imagery from an Infragram camera and translates it into a precise combination of red-blue LED lighting for a specific area. At the very least, it would give indoor farmers a better grasp of lighting effects on plant health.
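As a rough illustration of the image-interpretation half of that loop, here is a minimal sketch of computing NDVI from an infrablue photo, assuming the usual Infragram convention that the red channel carries mostly NIR and the blue channel carries visible light (the filename is a placeholder):

```python
# Minimal NDVI sketch for an infrablue (blue-filter) Infragram photo.
# Assumes red channel ~ near-infrared and blue channel ~ visible light.
import numpy as np
from PIL import Image

img = np.asarray(Image.open("infrablue_shot.jpg"), dtype=float)
nir = img[:, :, 0]   # red channel: mostly near-infrared
vis = img[:, :, 2]   # blue channel: mostly visible light

# NDVI = (NIR - VIS) / (NIR + VIS); guard against division by zero
ndvi = (nir - vis) / np.maximum(nir + vis, 1e-6)

# A single summary number a lighting controller could react to
print("mean NDVI:", ndvi.mean())
```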

One immediate issue concerns the imaging method Infragram-compatible cameras use in the first place. If the imaging process is based on capturing reflected IR light, would it be nullified by the "red/blue spectrum only" technique? Would it be possible to replicate the IR reflection by introducing artificial IR LEDs? Or would the number of LEDs required render the approach unfeasible?

1st Attempt and Results

I couldn't wait to start testing this hypothesis, so I ordered the DIY Infrablue Filter Pack from the Public Lab Store to make my own camera conversions, while also ordering the new Pi NoIR camera for the Raspberry Pi. As of this writing (1/20/14) I have had the filter pack for about a week and am still waiting for the Pi NoIR to arrive.
I actually have the exact same webcam that appears on the main Public Lab IR cam wiki (infragram2_610x259.png), so I assumed this would be the logical choice for modding, figuring the process would be well documented. Unfortunately, the only other information I could find on this particular camera was another photo showing the lens with the two different filters (Filter-1_1024x1024.jpg). After removing the IR-blocking filter, I couldn't find any documentation about where to actually install the blue filter, so I instead opted to use a more common Logitech webcam, where installation seemed more straightforward (InfraCam.jpg).

The latter attempt seemed to work well enough, and after testing the cam with infragram.org on my laptop, I connected it to my Nexus 7 through an OTG USB adapter cable and an app called SnapexWebcam (which requires root access).

Thus far, the results have been somewhat inconclusive, though the initial images were promising. As the first experiment was performed in the evening, I chose to compare two different artificial lighting sources on a planted aquarium. The first image was taken using the standard tank light, which I believe is a "ZooMed FloraSun" purchased at a standard pet store:

AquariumInfra1.png

Contrasted with a common fluorescent grow light, the "VitaLume Plus Grow" by Sun Leaves:

AquariumInfra2.png

This test yielded clear differences in the green range. However, according to the spectrum reference, the exposure was still far below the norm for what a plant would be reflecting. Also, while there are observable differences between the images, it is unclear whether this actually highlights photosynthesis or simply differences in IR emissions between the two lighting sources, which would reflect off of anything. Daylight tests were even less conclusive. With it being winter and the outdoors largely devoid of broadleaf trees, I took pictures of some houseplants for the next test, and the results were somewhat confusing:

HouseDay.png

The parts of the plants where photosynthesis would be expected actually registered closer to the BLUE end of the spectrum, so something was clearly off in either the post-processing or the camera itself. Hopefully it's not a problem of "bleeding" between the red and blue bands, as a hardware fix is much trickier than a simple change in post-processing. The Pi NoIR should arrive soon and provide a much more stable platform. I will be keeping the images in this Picasa folder for those who are interested.

Questions and next steps

With the Pi NoIR, I hope to first master the process of capturing imagery under standard CFL grow lights. This will give me a reference for experimenting with "red/blue" LED lighting and the required amount of supplemental IR LEDs, as explained so well by commenter "cfastie" below. My pipe dream is to create real-time controllable LED strips akin to [these commercial offerings](http://www.elementalled.com/fertilight-led-grow-light-kit.html), only using open source, digital, individually controllable strips like Adafruit's NeoPixel or a high-density version of BlinkyTape. The latter even includes a GUI allowing the user to adjust the lighting via an RGB slider. Such a function could be added to a possible Raspberry Pi/Beaglebone Infragram client to respond to plant health needs in real time (see the sketch below).
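As a rough sketch of that control loop, the fragment below sets a red:blue mix on a NeoPixel strip from a plant-health score, assuming the Adafruit CircuitPython NeoPixel library on a Raspberry Pi (the pin, strip length, and the score itself are placeholders):

```python
# Hypothetical sketch: map a plant-health score to a red:blue LED mix.
# Assumes the Adafruit CircuitPython NeoPixel library on a Raspberry Pi.
import board
import neopixel

NUM_PIXELS = 30  # placeholder strip length
pixels = neopixel.NeoPixel(board.D18, NUM_PIXELS, auto_write=False)

def set_red_blue_mix(red_fraction):
    """Light the whole strip with the given red:blue proportion."""
    red = int(255 * red_fraction)
    blue = 255 - red
    pixels.fill((red, 0, blue))
    pixels.show()

# e.g. an NDVI-derived score suggesting 60% red, 40% blue
set_red_blue_mix(0.6)
```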

Update 2/18/14

While it's been some time since I've worked on this project specifically, I have incorporated it into a much larger hydroponics investigation with a master wiki at http://publiclab.org/wiki/new-concepts-for-oshw-in-indoor-gardening. However, I recently got hold of two different tools that could yield good results:

  1. the Raspberry Pi NoIR camera module, and

  2. the TSL2561 digital luminosity sensor available from Adafruit.

tsl2561_LRG.jpg

The TSL2561 can measure both infrared and full-spectrum (visible plus infrared) light, and its I2C interface makes it surprisingly easy to use. As I will be using a system similar to the Arduino Yun for my hydroponics setup, my initial tests have used the Yun-to-Google-Spreadsheet example provided by Temboo.com. However, the Raspberry Pi is still the best platform for incorporating video imagery, and if rumors of an RPi app for Infragram turn out to be true, the TSL2561 might be a powerful addition!

As I mentioned earlier, the sensor is relatively straightforward to use, and at less than $5 apiece, it doesn't get more affordable. The only major obstacle will be making sure the numbers it spits out make sense. The basic example sketch provided by Adafruit calculates lux values, i.e. human-perceived brightness. As far as I know, PAR (photosynthetically active radiation) is a much more accurate metric for vegetation monitoring, and I'm not sure how easy it is to convert the TSL2561 output to PAR instead of lux. Even without such conversions, however, a simple program that logs a sensor reading while capturing Infragram imagery would go a long way toward helping us understand how to provide indoor plants with the lighting they need.
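For anyone wanting to try this on the Pi rather than the Yun, here is a minimal sketch of reading the sensor's two raw channels over I2C and applying the piecewise lux approximation from the TSL2561 datasheet; 0x39 is the common default address, and the bus number assumes a standard Raspberry Pi:

```python
# Minimal sketch: read the TSL2561's two raw channels over I2C on a
# Raspberry Pi and apply the datasheet's piecewise lux approximation.
import time
from smbus2 import SMBus

ADDR = 0x39   # default I2C address (ADDR pin floating)
CMD = 0x80    # command bit
WORD = 0x20   # word-read bit

with SMBus(1) as bus:
    bus.write_byte_data(ADDR, CMD | 0x00, 0x03)   # CONTROL: power on
    bus.write_byte_data(ADDR, CMD | 0x01, 0x12)   # TIMING: 16x gain, 402 ms
    time.sleep(0.5)                               # let one integration finish
    ch0 = bus.read_word_data(ADDR, CMD | WORD | 0x0C)  # broadband (vis + IR)
    ch1 = bus.read_word_data(ADDR, CMD | WORD | 0x0E)  # infrared only

# Piecewise lux formula from the datasheet (T package, 16x/402 ms settings)
ratio = ch1 / ch0 if ch0 else 0.0
if ratio <= 0.50:
    lux = 0.0304 * ch0 - 0.062 * ch0 * (ratio ** 1.4)
elif ratio <= 0.61:
    lux = 0.0224 * ch0 - 0.031 * ch1
elif ratio <= 0.80:
    lux = 0.0128 * ch0 - 0.0153 * ch1
elif ratio <= 1.30:
    lux = 0.00146 * ch0 - 0.00112 * ch1
else:
    lux = 0.0

print(f"broadband={ch0} ir={ch1} lux={lux:.1f}")
```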



4 Comments

I think you are correct that if the only source of light is blue and red LEDs that emit little NIR, the NIR channel in an Infragram camera is going to be very dark and will not tell you much. Maybe more importantly, the calculation of NDVI uses the NIR channel as sort of a measure of incoming light. If the light source is the sun, the proportion of visible and NIR wavelengths will be somewhat constant, so the more light there is, the more NIR there will be. Because healthy plant leaves reflect almost all of the NIR, the Infragram NIR channel in an image of a leaf is a good proxy for how bright the scene was. However, healthy plant leaves will not reflect all the visible light, and the reduction in a plant image’s visible band compared to the NIR band is the basis for NDVI.
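For reference, the standard calculation being described is NDVI = (NIR − VIS) / (NIR + VIS), where VIS is the visible band (blue, for an infrablue camera); healthy leaves push the value toward 1 because they reflect NIR strongly while absorbing much of the visible light.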

So adding NIR LEDs (or any light with some NIR) would be possible, but you would have to know the proportion of visible to NIR light impacting the leaves to know how absorption by healthy plant pigments changed that proportion. So the system would have to be calibrated somehow to compare results with other NDVI results. Or even without calibration, if the light was kept constant you could monitor change through time due to plant growth, or relative differences among plant species or treatments. That's really all we have done with Infragram so far anyway. It's even easier to do in a growhouse where the light source is absolutely constant.

An Infragram-like technique might be useful for pinkhousing to monitor the proportion of red and blue being emitted by the LEDs and the proportion being absorbed by plants. For example, if the plants are reflecting away most of the red, then maybe you can reduce the number or brightness of the red LEDs. This could be done with an unmodified camera, submitting the photos to the Infragram sandbox for custom processing.
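A minimal sketch of that monitoring idea, assuming an ordinary RGB photo of the canopy (the filename is a placeholder, and in practice you would mask out non-plant pixels first):

```python
# Sketch: estimate how much red vs. blue light the canopy reflects,
# from an ordinary (unmodified) RGB photo. Filename is a placeholder.
import numpy as np
from PIL import Image

img = np.asarray(Image.open("canopy.jpg"), dtype=float)
red_mean = img[:, :, 0].mean()
blue_mean = img[:, :, 2].mean()

# A high red/blue ratio suggests the plants are absorbing proportionally
# less red, i.e. the red LEDs could perhaps be dimmed.
print(f"red/blue reflectance ratio: {red_mean / blue_mean:.2f}")
```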



Thank you for this helpful advice! This is exactly what I was hoping to hear! I've edited the research note to include my most recent test results and a more detailed explanation of what I'm hoping to accomplish with the RGB LED lighting. I'm well aware that it's a long shot, or at least a very long-term project, but it seems like the emerging field of LED horticulture is ripe for the kind of disruption enabled by open hardware and citizen science. Much remains to be learned about the science of red/blue spectra and their effects on plant growth, but I could see a time when each plant comes with its own software profile on GitHub ;) Want to grow tomatoes? Just grab the source code and apply the lighting configuration, pH/nutrient dosing, and water-flow timing to a Raspberry Pi/Beaglebone (or an Arduino TRE if we're talking about the near future...)



This is a pretty interesting investigation. You are breaking new ground in a few areas, so there will be much experimentation to do. For example, Infragrams of aquarium plants taken through glass and water under special lights will probably require custom processing and interpretation. Also, CMOS webcams vary a lot from one model to the next in the proportion of RGB and NIR that ends up in each color channel (some examples here), so comparing results to previous Infragram NDVI results will require special consideration. The Pi NoIR camera may eventually improve this situation if they ever implement the code to do a custom white balance before shooting photos. Webcams usually do not allow that, so you are at the mercy of the built-in auto white balance algorithm. In one of your photos below, the histogram shows that the blue and red (NIR) channels are almost equally bright for a small patch of plant leaf. If the blue and NIR values for a pixel are similar, computed NDVI will not be very high.

recMon_Jan_20_11_19_56_EST_2014_RGB.jpg
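To put rough numbers on that: if a leaf pixel reads, say, NIR = 200 and blue = 180 (hypothetical values in line with that histogram), then NDVI = (200 − 180)/(200 + 180) ≈ 0.05, far below the values usually associated with healthy vegetation.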

Although white balancing will improve these results, it might be that the blue channel is almost as bright as red (NIR) for leaves because the blue "pixels" are being contaminated with NIR. As you say, hardware problems like that are hard to fix. So reproducing the NDVI values of other devices might be hard, but there is still a lot of plant health information in those images because they record how much NIR is reflecting from leaves. There is a lot of experimentation required to figure out how to get the most plant health information from CMOS Infragram cameras. Classic NDVI might not be the best way to present the results, but we don't yet know what is.



Has there been any movement on the rumored Infragram app for Raspberry Pi? See my recent update about the TSL2561 digital luminosity sensor for something that might make a good addition to such a project. The TSL2561 is compatible with the Raspberry Pi via I2C, which means it should only take a simple Python script to combine the sensor readings with the Pi NoIR images. Also, because it's a 3.3V part compatible with the Raspberry Pi GPIO header, neither the camera nor the sensor would require the USB ports, freeing those up for something else. My biggest issue right now is making any sense of the sensor readings. Perhaps someone with a little more knowledge of things like PAR and lux could figure it out?
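A sketch of what that combining script could look like, assuming a read_tsl2561() helper wrapping the I2C code from the update above; raspistill is the stock Raspberry Pi camera CLI, and the output paths are placeholders:

```python
# Sketch: log a TSL2561 reading alongside each Pi NoIR capture.
# Assumes read_tsl2561() returns (broadband, infrared) raw channel values,
# e.g. wrapping the I2C code shown in the update above.
import csv
import subprocess
import time

def log_capture(read_tsl2561, out_csv="light_log.csv"):
    timestamp = time.strftime("%Y%m%d-%H%M%S")
    image_path = f"infragram_{timestamp}.jpg"
    subprocess.run(["raspistill", "-o", image_path], check=True)
    broadband, infrared = read_tsl2561()
    with open(out_csv, "a", newline="") as f:
        csv.writer(f).writerow([timestamp, image_path, broadband, infrared])
```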


