About radiometric calibration

#1

Hi!

I'm currently trying to perform radiometric calibration on my Sequoia images.

I use the Parrot Sequoia ground-based, 2 meters above the vegetation cover (without a drone).
The image sensor points perpendicularly down at the surface (nadir); the sunshine sensor points in the opposite direction (zenith).

I'm now able to compute:
- Image sensor irradiance Isq (see Parrot Announcement - Release of application notes for more information).
- Sunshine sensor irradiance Iss, thanks to this very clear document: https://onedrive.live.com/?authkey=!ACzNLk1ORe37aRQ&cid=C34147D823D8DFEF&id=C34147D823D8DFEF!15414&parId=C34147D823D8DFEF!106&o=OneUp
The methodology is also discussed in the topic "Reflectance estimation", post 2 by @domenzain.
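
For illustration, here is the kind of computation I mean, as a minimal Python sketch. The function names and the simplified model (dark-level subtraction, normalisation by exposure, gain and aperture) are my own assumptions; the exact formula and per-camera coefficients come from the application note above.

```python
# Minimal sketch, assuming a generic sensor model. The exact Sequoia
# formula and its coefficients are in the Parrot application note.

def image_irradiance(dn, dark_level, exposure_s, gain, f_number):
    """Relative image-sensor irradiance Isq from a raw pixel value."""
    # Normalise by the capture settings so Isq is comparable across shots.
    return (dn - dark_level) * f_number**2 / (exposure_s * gain)

def sunshine_irradiance(count, exposure_s, gain):
    """Relative sunshine-sensor irradiance Iss from one IrradianceList entry."""
    return count / (exposure_s * gain)
```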

I know there is a relation between Isq, Iss, and the reflectance R: R * cos(theta) = K * Isq / Iss.
In my case theta = 0, so cos(theta) = 1, because the sensor is perpendicular to the surface, without any viewing angle. Is that true? Or does theta mean another angle?
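
In code, the relation I'm using looks like this (a minimal sketch; the cos_theta argument is only there in case theta turns out to be nonzero):

```python
def reflectance(isq, iss, k, cos_theta=1.0):
    # R * cos(theta) = K * Isq / Iss  =>  R = K * Isq / (Iss * cos_theta)
    return k * isq / (iss * cos_theta)
```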

K is a constant that depends on the sensors' responses.
To obtain K, we can use a calibration target of known reflectance. That's not a problem for me, but there is something I don't understand: K should be constant, so we should only need to use the calibration target once, and not every time we want to use the Sequoia, right?
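
Solving the same relation for K with a target of known reflectance would look something like this (a sketch; averaging over the target pixels is my own assumption):

```python
import numpy as np

def estimate_k(isq_target_pixels, iss, r_target):
    """Estimate K from an image of a calibration target of known reflectance."""
    # Camera at nadir over the target, so cos(theta) ~= 1 and
    # R_target = K * mean(Isq) / Iss  =>  K = R_target * Iss / mean(Isq).
    return r_target * iss / np.mean(isq_target_pixels)
```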

One other thing: I read in the MicaSense tutorial (https://micasense.github.io/imageprocessing/MicaSense%20Image%20Processing%20Tutorial%203.html) that the sun angle and the type of irradiance (direct or diffuse) should also be taken into account for radiometric calibration. Is that really necessary to obtain acceptable results if we use the sunshine sensor? If so, how do we apply these corrections?

I saw @pk123's nice experiment in this topic (Details of Irradiance List tag for Sunshine sensor in exif data of Sequoia).


I would just like to know if this linear relation between the sunshine and image sensor irradiances still holds when the sun is overhead (zenith) or in cloudy conditions…

Before posting, I read a lot on this forum and in the MicaSense tutorials. If my questions have already been answered somewhere, don't hesitate to redirect me to the answers and I'll close the topic.
If you have questions about the methodology I presented above, don't hesitate either.

Bye, have a good day

#2

Hi, did you find the answers to these questions? Can we get in touch?

#3

Hi,

I didn't find explicit answers to these questions. For the moment I'm working from a few hypotheses; I'm currently writing an R program and will see whether it works experimentally.
If you have any questions about my methodology, I will try to answer them.
Have a good day

#4

Hi @JBlefebvre,

Sorry for the delay.
Here are some answers to your questions.

You are correct. Depending on the material you image, the quality of the approximation will be better or worse towards the image edges.
A very waxy wheat will have a very specular response as plants go.

As I mentioned in the Reflectance estimation thread, that is correct and will mostly hold for several flights under similar light conditions.
In practice, the K factor as I described it also hides away many subtle effects that will bias the measurement, including:

  • Blue shift of the bands towards the edges of the images, due to interference filters at off-normal incidence.
  • Out-of-band performance of the actual filters.
  • Temperature of the sensor.

Parrot recommends a calibration with each flight. The Sequoia calibration target uses ArUco tags, which are very straightforward to detect automatically…
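
A minimal detection sketch with OpenCV (the dictionary below is a placeholder, so check which one the Sequoia target actually uses; the API shown is the pre-4.7 aruco module, newer versions use cv2.aruco.ArucoDetector):

```python
import cv2

img = cv2.imread("sequoia_band.tif", cv2.IMREAD_GRAYSCALE)

# Placeholder dictionary: verify which dictionary the Sequoia target uses.
dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
parameters = cv2.aruco.DetectorParameters_create()
corners, ids, _ = cv2.aruco.detectMarkers(img, dictionary, parameters=parameters)

if ids is not None:
    # The reflectance panel can then be located relative to the tag corners.
    print("Detected tag IDs:", ids.ravel())
```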

Not really necessary for most people (and researchers).
It really depends on what you need in terms of uncertainty.
You can use the FOV specification to get a pretty good approximation for the Theta of each pixel by linearly interpolating from zero at the optical center to half the DFOV at the edge, but it is likely overkill.
If you need extreme precision, be aware that distortion is not linear.
OpenCV has lots of tools to deal with camera distortion, but again you will likely not need to correct for this.
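
A sketch of that linear approximation (the optical centre is assumed to coincide with the image centre, ignoring distortion; take the DFOV value from the Sequoia spec sheet):

```python
import numpy as np

def theta_map(width, height, dfov_deg):
    """Approximate viewing angle theta (radians) for every pixel.

    Linearly interpolates from 0 at the assumed optical centre (the image
    centre) to dfov_deg / 2 at the image corner.
    """
    cx, cy = (width - 1) / 2.0, (height - 1) / 2.0
    xx, yy = np.meshgrid(np.arange(width) - cx, np.arange(height) - cy)
    r = np.hypot(xx, yy)              # radial distance of each pixel (px)
    r_corner = np.hypot(cx, cy)       # distance from centre to a corner (px)
    return np.radians(dfov_deg / 2.0) * r / r_corner
```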

Yes, it does, as long as the same irradiance shines on the irradiance sensor and on the imaged surface element.
If you have scattered clouds, this will not hold (imagine the Sun through the clouds hitting the irradiance sensor at an angle that does not appear in the images).
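
A crude way to flag such captures, assuming you log the Iss time series (the 5% default threshold is an arbitrary assumption to tune against your own data):

```python
import numpy as np

def flag_unstable_frames(iss_series, rel_tol=0.05):
    """Flag captures whose Iss changed by more than rel_tol since the
    previous capture (a crude scattered-cloud detector)."""
    iss = np.asarray(iss_series, dtype=float)
    rel_change = np.abs(np.diff(iss)) / iss[:-1]
    return np.concatenate([[False], rel_change > rel_tol])
```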

Best,

#5

Hi @domenzain,

Thank you for your answers!

About target calibration: I think we will do it frequently at first. If we find that K doesn't change much relative to our precision requirements, we will calibrate less often, or only when environmental conditions are drastically different.

Okay, for the moment we'll try to keep it simple. We have very changeable weather conditions in our experimental areas, so we'll be able to see if the relation still holds when the sun angle or the type of irradiance varies, even if the sunshine sensor receives the same irradiance value…

I will keep you informed if we get interesting results on these topics!

Have a good day.