This is a follow-on to my discussion of Spectrometer Stability and is an attempt to observe and compensate for what appears to be significant "drift and noise" from the camera. Again, the test configuration has been designed to reduce the source and mechanical noise to zero so as to observe only camera noise. There is evidence of both drift and noise, evidence that the noise is primarily Gaussian, and evidence that much of the noise can be diminished through multi-frame averaging. However, drift compensation remains an issue.
When sampling and analyzing drift and noise, it is not always easy to correlate those errors with a specific source. While camera AGC and detector noise remain the most likely causes, there may be other factors yet undiscovered.
In the previous set of stability tests, data was accumulated at only one point per minute; a useful overview, but it missed a lot of detail. The general rule for sampling is the Nyquist criterion, which states that the minimum sample rate needed to detect a periodic signal at a given frequency is 2x that frequency. (Imagine representing a sine wave by just 2 points; one at the peak and one at the valley.) However, two things result: 1) that sample rate will give a poor representation of the signal's waveform and 2) with non-periodic waveforms it will tell us nothing about what happens between those points.
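The two-point sine example is easy to see numerically; a minimal sketch (the 1 Hz frequency and sample phasing are illustrative, chosen so the two Nyquist-rate samples land on the peak and valley):

```python
import numpy as np

# A 1 Hz sine sampled densely versus at the Nyquist rate (2 samples/sec).
f = 1.0                                        # signal frequency, Hz
t_dense = np.linspace(0, 1, 1000, endpoint=False)
dense = np.sin(2 * np.pi * f * t_dense)        # full waveform

# Nyquist-rate sampling, phased onto the peak and the valley:
t_nyq = np.array([0.25, 0.75])                 # seconds
nyq = np.sin(2 * np.pi * f * t_nyq)

# Two points confirm a signal is there, but say nothing about its
# shape or about anything happening between the samples.
print(nyq)
```

The two samples recover the frequency but none of the waveform detail, which is why the new data set below is sampled far faster than one point per minute.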
Given that the camera is capable of 30 frames/sec, I wrote some Matlab code to extract one line of pixel data at ~6 frames/sec and store a total of 15 minutes of raw spectral line data (~10 MB) for later analysis. Again, the same mechanically rigid proto V3 and Solux 4700K lamp were configured exactly as before while collecting the data.
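The original capture was done in Matlab; the bookkeeping can be sketched in Python as below, with a synthetic stand-in for the camera read (the row index, R/G/B levels, and noise sigma are illustrative, not measured values):

```python
import numpy as np

ROW = 240          # illustrative: the pixel row crossing the spectral band
FPS = 6            # extraction rate, frames/sec
MINUTES = 15
n_frames = FPS * 60 * MINUTES                 # 5400 sampled frames

rng = np.random.default_rng(0)

def grab_line(row=ROW):
    """Stand-in for reading a camera frame and slicing out one row.
    With a real capture this would be frame[row] from each grabbed frame."""
    base = np.array([90.0, 150.0, 120.0])     # illustrative R/G/B levels
    return np.clip(base + rng.normal(0, 3, size=(640, 3)), 0, 255)

# One spectral line (640 pixels x 3 channels, 8-bit) per sampled frame.
lines = np.empty((n_frames, 640, 3), dtype=np.uint8)
for i in range(n_frames):
    lines[i] = grab_line()

print(lines.shape, f"{lines.nbytes / 1e6:.1f} MB")
```

Stored as raw 8-bit values, 5400 frames x 640 pixels x 3 channels comes to roughly 10 MB, matching the file size noted above.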
The first plot shows 15 min of R/G/B/S ('S' means the (R+G+B)/3 spectrum curve) data with the sample number as the X-axis units. The data is the extraction of the same pixel's value (620, 550 and 470 nm for R/G/B, and 550 nm for S) as it is recorded over the time period. The Y-axis is the pixel intensity data from the camera.
Note that it is easy to identify both drift and noise in these signals and that R/G/B have different noise levels which do not correlate with their average intensity values. I do not have an explanation for this as yet. The next plot is the same data, just a "zoom-in" on the middle of the plot above so as to see the noise in a bit more detail. Based on the first plot, the same "random" appearance was expected.
First, it would be good to know a bit more about this noise, and one simple method is to just plot its distribution:
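Such a distribution plot is just a histogram of one pixel's time series; a sketch with synthetic data standing in for the recorded channel (the mean, drift, and sigma here are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for one pixel's value over 5400 frames:
# a slow drift plus Gaussian camera noise.
n = 5400
drift = np.linspace(0, 2, n)                  # slow drift, counts
pixel = 112 + drift + rng.normal(0, 3, n)     # illustrative mean/sigma

counts, edges = np.histogram(pixel, bins=40)
centers = 0.5 * (edges[:-1] + edges[1:])      # bin centers for plotting

# For mostly-Gaussian noise the histogram peaks near the mean and
# the sample std estimates the noise level.
print(round(pixel.mean(), 1), round(pixel.std(), 1))
```

A markedly non-Gaussian shape here (skew, multiple peaks) would point away from simple detector noise and toward something like AGC stepping.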
While these distributions are not exactly the same, they are all similar to the Blue channel, which appears reasonably Gaussian. This is helpful because 1) Gaussian noise was expected and 2) averaging the data is a simple and effective method to reduce its effect. To check this, the Blue channel data was processed with a 31-sample running average and the resulting distribution is plotted below:
As another visualization of the effectiveness of this level of averaging, the R/G/B/S plot was re-generated after averaging on each channel.
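The 31-sample running average can be implemented as a simple convolution; a sketch on synthetic Blue-channel data (the sigma is illustrative, not the measured value):

```python
import numpy as np

rng = np.random.default_rng(2)
blue = 112 + rng.normal(0, 3, 5400)        # synthetic pixel time series

N = 31
kernel = np.ones(N) / N                    # 31-sample boxcar
smoothed = np.convolve(blue, kernel, mode='valid')

# Averaging N independent Gaussian samples shrinks sigma by sqrt(N):
# 3 / sqrt(31) ~= 0.54, so the smoothed distribution is much narrower.
print(round(blue.std(), 2), round(smoothed.std(), 2))
```

The same kernel applied per-channel reproduces the averaged R/G/B/S plot; the sqrt(N) narrowing is exactly what the tighter distribution shows.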
These plots, especially the last two, show:
1) Camera noise can be reduced by averaging each pixel (of the selected line of pixels crossing the spectral band) over about 30 frames; roughly 5 sec of recording.
2) Some drift remains but, at least at 550 nm in the combined spectral plot, the error would be reduced to ~+/- 2.5%.
3) Doing no frame data averaging, thus including all the noise, essentially means retaining ~10% error; each "capture" carries the potential for a 10% error, like a "roll of the dice".
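As a sanity check on points 1 and 3, the sqrt-N reduction from 30-frame averaging can be simulated directly (the intensity and sigma are illustrative, with sigma chosen to mimic a ~10% single-frame spread):

```python
import numpy as np

rng = np.random.default_rng(3)
true_value = 200.0                       # illustrative pixel intensity
sigma = 20.0                             # ~10% single-frame noise

# 1000 simulated captures, 30 frames each.
frames = true_value + rng.normal(0, sigma, size=(1000, 30))

single = frames[:, 0]                    # one "roll of the dice" per capture
averaged = frames.mean(axis=1)           # 30-frame average per capture

# sigma / sqrt(30) ~= 3.7 counts, i.e. ~1.8% of the true value; the
# remaining error seen in the plots is this residual noise plus the
# uncompensated drift.
print(round(single.std(), 1), round(averaged.std(), 1))
```

The simulation agrees with the plots: averaging buys roughly a 5x reduction in noise, leaving drift as the dominant remaining error.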