We’ll have to keep waiting… Been told about the “incoming” documentation for months…
It may not always hold, but in this case there seems to be a fairly straightforward relationship between CH0 counts and pixel values.
@pk123, your experiment results are interesting. However, I have a few questions.
I believe you had to acquire images whose sunshine sensor values (CH0 counts) spanned the whole range. How did you do that?
Did you use just one reflectance panel, with the mean digital number for that panel plotted against the sensor's CH0 count?
I took photos over 5 hours outside, with the Sequoia on a tripod facing straight down and the sunshine sensor facing straight up (i.e. not normal flight conditions).
Yes, one reflectance panel (75% reflectance); each data point is the average pixel value of the entire panel.
I should mention that I kept ISO and shutter speed constant throughout.
no, these are the raw pixel values, no adjustment.
The panel was centered in the same location in all shots, so I wouldn't expect vignetting to affect the trends significantly.
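One quick way to check how "straightforward" the relationship is would be an ordinary least-squares fit of the panel's mean DN against CH0. A minimal sketch with made-up numbers (the real values are in pk123's data, not shown here):

```python
# Hypothetical data standing in for the tripod experiment described above:
# sunshine-sensor CH0 counts vs. mean panel digital numbers (invented values).
ch0 = [1200.0, 1850.0, 2400.0, 3100.0, 3600.0]
dn = [14000.0, 21500.0, 28100.0, 36200.0, 42100.0]

# Ordinary least-squares line: dn ~= slope * ch0 + intercept
n = len(ch0)
mx, my = sum(ch0) / n, sum(dn) / n
slope = sum((x - mx) * (y - my) for x, y in zip(ch0, dn)) / \
        sum((x - mx) ** 2 for x in ch0)
intercept = my - slope * mx

# R^2 tells you how linear the relationship really is
pred = [slope * x + intercept for x in ch0]
ss_res = sum((y - p) ** 2 for y, p in zip(dn, pred))
ss_tot = sum((y - my) ** 2 for y in dn)
r2 = 1.0 - ss_res / ss_tot
print(f"slope={slope:.2f} intercept={intercept:.1f} R^2={r2:.4f}")
```

An R² close to 1 over points spread across the whole CH0 range would support the linear-relationship claim; a systematic curve in the residuals would not.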
@pk123 given the 5hr interval the sun’s elevation would’ve changed. Did you not apply Lambert’s Cosine Law to the CH0 counts to take into account the varying angle?
@seanmcleod, no. Since the sunshine sensor is not facing the sun (it points straight up, and it was winter so the sun was never directly overhead), I wouldn't expect the Lambert's Law correction to work. It is possible that because the sunshine sensor receives the sunlight at an angle (the same as the ground), it already accounts for the varying sun position during the day. I keep meaning to try this again and see if I get the same relationships.
@pk123 but even if the sun's elevation was never at 90 degrees to the sunshine sensor, over a 5hr period the angle would still have changed, which would change the amount of irradiance measured by the sunshine sensor.
For example, assume a 10 degree change in solar elevation from 40 to 50 degrees over the 5hr period. The angle of incidence on a horizontal sensor is 90° minus the elevation, so that's the difference between cos(50°) ≈ 0.64 and cos(40°) ≈ 0.77, roughly a 20% change. So I would expect to see a difference in the CH0 counts over the interval.
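A small sketch of that cosine-law arithmetic for an upward-facing sensor (the 40° and 50° elevations are just the example above, not measured values):

```python
import math

def horizontal_irradiance_factor(elevation_deg):
    # Lambert's cosine law for a horizontal, upward-facing sensor:
    # the angle of incidence from the sensor normal is 90 deg minus
    # the solar elevation.
    return math.cos(math.radians(90.0 - elevation_deg))

low = horizontal_irradiance_factor(40.0)   # ~0.643
high = horizontal_irradiance_factor(50.0)  # ~0.766
print(f"elev 40: {low:.3f}, elev 50: {high:.3f}, "
      f"relative change: {(high / low - 1) * 100:.0f}%")
```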
Now the Sequoia has a GPS and IMUs (one for the sunshine sensor and one for the RGB & multispectral sensor), so in theory they could work out the sun's angle relative to the attitude of the sunshine sensor in real time, but I'm pretty sure they don't. Instead they simply record the attitude info in the IrradianceList tag so that you can use it in post-processing.
For example in Pix4d you’ll notice the following info comments during post processing:
Info: cos(sun, irradiance sensor) 0.573237 direct sunlight fraction 0.933341 correction 0.853415
Questions about Radiometric Calibration and data correction
@seanmcleod, I wasn't saying that the sun angle wasn't changing, nor that the CH0 count didn't change during the 5h interval. I'm only suggesting that since the sunshine sensor receives the sunlight at the same angle as the ground does, it may not be necessary to do an angle adjustment. The linear relationship I observed seems to support that idea.
@pk123 thinking about it a bit more I agree that in your setup both sensors (irradiance sensor and multispectral sensors) are seeing the same scale factor based on Lambert’s Cosine Law.
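To make the cancellation concrete, here is a toy sketch (the irradiance scale and both sensor gains are invented for illustration): with the panel and the irradiance sensor both horizontal, both signals pick up the same cos(90° − elevation) factor, so the DN/CH0 ratio comes out independent of elevation.

```python
import math

def horizontal_factor(elev_deg):
    # Lambert's cosine law for an upward-facing surface.
    return math.cos(math.radians(90.0 - elev_deg))

E0 = 1000.0               # hypothetical irradiance scale
rho = 0.75                # panel reflectance (from the experiment above)
k_cam, k_sun = 3.2, 0.9   # hypothetical camera / sunshine-sensor gains

for elev in (30.0, 40.0, 50.0):
    f = horizontal_factor(elev)
    ch0 = k_sun * E0 * f          # sunshine-sensor counts
    dn = k_cam * rho * E0 * f     # mean panel DN seen by the camera
    # The cosine factor divides out: DN/CH0 = k_cam * rho / k_sun.
    print(f"elev={elev:4.1f}  DN/CH0={dn / ch0:.4f}")
```

This only holds because both surfaces share the same orientation; in normal flight, with the camera looking at tilted vegetation, the two cosine factors would no longer be identical.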
I had originally only focused on this part of your earlier comments: “Since the sunshine sensor is not facing the sun (it point straight up and it was winter so the sun was never straight above), I wouldn’t expect the Lambert’s Law correction to work.”
The camera itself doesn’t do much in terms of computing and data processing.
When it comes to sun angle calibration, to the best of my knowledge, only Pix4D knows how to do it and actually does it.
see Radiometric Processing and Calibration
Allows the users to calibrate and correct the image reflectance, taking the illumination and sensor influence into consideration. It is possible to choose the type of radiometric correction to be done:
- No correction: no radiometric correction will be done.
- Camera only: corrections will be applied for the parameters that are written in the EXIF metadata and relate to the camera (vignetting, dark current, ISO, etc.).
- Camera and Sun Irradiance: corrections will be applied for the camera parameters above as well as for the sun irradiance information written in the XMP.Camera.Irradiance EXIF tag.
- Camera, Sun Irradiance and Sun angle: corrections will be applied to take into account the sun position, as well as the camera information and the irradiance data. This option should only be chosen for flights that were done in clear-sky conditions.
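As a rough mental model of the three correction levels described above (the actual Pix4D formulas are not public, so every term and its placement here is an assumption, not the real pipeline):

```python
import math

def corrected_reflectance(dn, dark, vignette_gain, iso, exposure_s,
                          irradiance, cos_sun_sensor):
    # "Camera only": subtract dark current, undo vignetting,
    # normalise by exposure settings (all illustrative placements).
    signal = (dn - dark) * vignette_gain / (iso * exposure_s)
    # "Camera and Sun Irradiance": divide by the measured irradiance.
    # "...and Sun angle": rescale the irradiance by a sun/sensor geometry
    # factor, in the spirit of Pix4D's "cos(sun, irradiance sensor)" log line.
    return signal / (irradiance * cos_sun_sensor)

# Illustrative numbers only; none of these come from a real Sequoia frame.
r = corrected_reflectance(dn=30000, dark=200, vignette_gain=1.05,
                          iso=100, exposure_s=0.001, irradiance=6.0e5,
                          cos_sun_sensor=math.cos(math.radians(55.0)))
print(f"reflectance ~ {r:.3f}")
```

The point is only the structure: each option in the list adds one more divisor/correction stage on top of the previous one.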
I don't know exactly what you expect from your measurements in terms of accuracy and precision, but the sun angle correction seems to be the ultimate step when you want to have dead-on measurements.
We recommend that the camera and the sunshine sensor be rigidly mounted on the drone in order to maximize the accuracy of the measurements. It is best when the two modules keep a constant angle between them.
I hope this helps
@clement.fallet: I’m curious, do you think that doing a sun angle correction on the data presented in the graph above would have improved the results?
And I stand by what @domenzain said previously: we are working on a document that will explain how to make sense of the results you get.
I know it’s been a long time but we want to make sure that we/you have all the data to understand how things work.
I do believe it will be worth the wait
@clement.fallet, since I created that topic, I’m not surprised that they are related
The Sequoia's been out for how long now? And you don't have all the data to understand how it works…
In a few words, Pix4D has been a partner of Parrot for a long time, and they have access to information that is not shared publicly either.
We are working on a processing model/workflow that would enable you to make sense of the data without going too much into the modeling of the sensor itself.
Does it make more sense now?
Maybe this discussion can be moved to a more appropriate channel (since it doesn't really concern the irradiance list tags) or continued privately.
@clement.fallet please don’t make the discussion private, there are a lot of users who have been waiting and been trying to get enough information from Parrot regarding the Sequoia sensor for many months.
@seanmcleod: fair enough, we’ll keep this discussion public.
I have to warn you, though, that we may not answer all questions regarding Sequoia, since some of the information you require may be subject to confidentiality agreements.
As I said previously, some companies have partnerships with Parrot and thus have access to more than we are allowed to share publicly.
@clement.fallet as a starting point before you even produce the workflow document I think Parrot needs to make very clear asap what users without a confidentiality agreement can expect to achieve with the Sequoia unit once the relevant documentation has been produced.
For example, without using Pix4d (or having a confidentiality agreement with Parrot), what sorts of corrections/calibrations can users expect to perform on their multispectral data, with and without a ground reflectance target?
Given the marketing and advertising material for the multispectral sensors with an irradiance sensor, a lot of users may have quite different expectations compared to the warning you're now giving in terms of "we may not answer all questions regarding Sequoia…"