The Parrot Sequoia's sensor has a maximum bit depth of 10 bits; however, all TIFF images captured at 10 bits are saved as 16-bit TIFF files.
Could someone please explain the implications of this? How does the raw 10-bit data get affected when it is converted to a 16-bit file?
How does this change the digital numbers (DNs) of what the sensor is capturing?
In terms of radiometric calibration, what formula needs to be applied to offset the change from 10-bit to 16-bit?
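To make the question concrete, my understanding is that a 10-bit value can sit in a 16-bit container in one of two common ways: stored unchanged (so the file's maximum is 1023), or left-shifted by 6 bits (multiplied by 64, so the maximum is 65472). Here is a small Python sketch of those two conventions; the sample values and both conventions are illustrative assumptions on my part, not the Sequoia's documented behavior:

```python
# Hypothetical 10-bit digital numbers (DNs) a 10-bit sensor could produce.
dn10 = [0, 1, 512, 1023]

# Convention 1 (assumed): the 10-bit value is stored unchanged in the
# 16-bit container, so the file's maximum is still 1023.
stored_plain = list(dn10)

# Convention 2 (assumed): the value is left-shifted by 6 bits (x 64) so
# the data fills the 16-bit range; the maximum becomes 1023 * 64 = 65472.
stored_shifted = [v << 6 for v in dn10]

# Undoing the shift is a pure rescaling -- it recovers the original DNs
# exactly and does not change the relative radiometry.
recovered = [v >> 6 for v in stored_shifted]
print(recovered == dn10)  # True

# Normalizing to [0, 1] gives the same values under either convention:
norm_plain = [v / 1023 for v in stored_plain]
norm_shifted = [v / (1023 * 64) for v in stored_shifted]
print(all(abs(a - b) < 1e-12 for a, b in zip(norm_plain, norm_shifted)))  # True
```

If that picture is right, then the calibration formula would only differ by a constant scale factor depending on which convention the files use; is that correct, and which convention does the Sequoia follow?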