RemoteControl devices do not provide access to any video stream.
Instead, you need to obtain the Drone device that is connected to your SkyController3.
Then, you need to query the StreamServer peripheral from that Drone device. Using the StreamServer API, you have access to the CameraLive instance that represents the live video stream from the connected drone.
You can then plug this CameraLive object into a GsdkStreamView (see the GsdkStreamView#setStream method), which is an Android widget able to render the stream on screen.
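Put together, the flow looks roughly like this (a sketch based on the steps above; the function name is illustrative, and in real code you should keep the Ref objects returned by getPeripheral() and live() alive, e.g. as fields, otherwise the observers are released):

```kotlin
import com.parrot.drone.groundsdk.device.Drone
import com.parrot.drone.groundsdk.device.peripheral.StreamServer
import com.parrot.drone.groundsdk.device.peripheral.stream.CameraLive
import com.parrot.drone.groundsdk.stream.GsdkStreamView

// Query the StreamServer peripheral, obtain the CameraLive stream,
// start playback and hand it to the view for rendering.
fun startLiveStream(drone: Drone, streamView: GsdkStreamView) {
    drone.getPeripheral(StreamServer::class.java) { streamServer ->
        streamServer?.let { server ->
            server.enableStreaming(true)
            server.live { live: CameraLive? ->
                live?.play()
                streamView.setStream(live)
            }
        }
    }
}
```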
According to what I read in the API, RemoteControl has a DroneFinder peripheral. This interface has a method getDiscoveredDrones() that returns a list of DroneFinder.DiscoveredDrone.
DiscoveredDrone doesn’t have a method to get a reference to a Drone instance. Instead, it has a method to obtain the uid, along with other attributes.
The GroundSdk class has a method, ‘getDrone(String uid)’, that gets a drone by its uid.
Is it valid to get the drone instance this way? For example:
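Something along these lines (a rough sketch; the remoteControl and groundSdk references are assumed to come from the current session, and the call shapes follow my reading of the published API):

```kotlin
import com.parrot.drone.groundsdk.GroundSdk
import com.parrot.drone.groundsdk.device.Drone
import com.parrot.drone.groundsdk.device.RemoteControl
import com.parrot.drone.groundsdk.device.peripheral.DroneFinder

// Resolve a Drone instance from the SkyController's DroneFinder:
// take a DiscoveredDrone's uid and look it up through GroundSdk.
fun resolveDrone(groundSdk: GroundSdk, remoteControl: RemoteControl) {
    remoteControl.getPeripheral(DroneFinder::class.java) { droneFinder ->
        droneFinder?.discoveredDrones?.firstOrNull()?.let { discovered ->
            val drone: Drone? = groundSdk.getDrone(discovered.uid)
            // ... use the drone instance here
        }
    }
}
```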
Is there a way to actually access the raw H.264 byte stream from the CameraLive instance? I’d like to 1) send the stream over the network to a local PC and 2) render the stream inside one of the fragments of my Android app.
However, I’m facing two major problems:
1. As far as I can see, the CameraLive instance is more or less hardwired to be used in conjunction with a GsdkStreamView, and so far I haven’t found a way to access the raw stream.
2. The GroundSdk session is bound to an Activity. So even if I manage to access the raw stream, I don’t want the stream displayed on my local PC to be interrupted whenever I change activities. I have already worked with the DJI Mobile SDK, where I handled the stream in a background service; that is not possible with the ANAFI because the session is bound to an Activity.
I’ll answer #2 and it may open the door for #1 because I think what you’re after is the underlying stream.
Use an unmanaged session. It is not activity-bound and, in my experience, works really well across activities and services.
Another thought is you may want to bypass Gsdk entirely and interact with the ANAFI’s RTSP server directly. Something to look into here would be the TcpProxy class that Gsdk exposes when connected via a SkyController.
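For example, when the phone is connected directly to the drone’s WiFi, something like this might work (a rough sketch using Android’s MediaPlayer; rtsp://192.168.42.1/live is the commonly reported ANAFI address, not an officially documented endpoint, and via a SkyController you would have to go through the proxy instead):

```kotlin
import android.media.MediaPlayer
import android.view.Surface

// Sketch: read the ANAFI's RTSP stream directly, bypassing Gsdk.
// The URL is an assumption (commonly reported for a direct WiFi connection).
fun openRtspStream(surface: Surface): MediaPlayer =
    MediaPlayer().apply {
        setDataSource("rtsp://192.168.42.1/live")
        setSurface(surface)                   // render target for the video
        setOnPreparedListener { it.start() }  // start once the stream is ready
        prepareAsync()
    }
```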
Regarding #2, as @synman suggested, you may use an unmanaged session (see GroundSdk.newSession). These sessions accept any Context, so you may use one in a background Service, for instance (in that case the savedInstanceState parameter is irrelevant; you may pass null).
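For instance, a minimal sketch (the service name and structure are illustrative, not part of GroundSdk, and the lifecycle calls reflect my understanding of the unmanaged-session API):

```kotlin
import android.app.Service
import android.content.Intent
import android.os.IBinder
import com.parrot.drone.groundsdk.GroundSdk

// Hypothetical background service owning an unmanaged GroundSdk session.
class StreamingService : Service() {

    private lateinit var groundSdk: GroundSdk

    override fun onCreate() {
        super.onCreate()
        // Unmanaged session: any Context is accepted; savedInstanceState
        // is irrelevant outside an Activity, so null is passed.
        groundSdk = GroundSdk.newSession(applicationContext, null)
    }

    override fun onDestroy() {
        // Unlike managed sessions, unmanaged ones must be closed explicitly.
        groundSdk.close()
        super.onDestroy()
    }

    override fun onBind(intent: Intent?): IBinder? = null
}
```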
Regarding #1, you are right. For the time being, the only publicly usable endpoint for a stream is a GsdkStreamView, which renders the stream in an Android view. There is no public API (yet) to access the raw H.264 byte stream. @synman’s advice to bypass Gsdk and use TcpProxy may be a solution, but be aware that you would then be relying on internal APIs and behaviors that may break in some future release of Gsdk.
And of course you could always modify the StreamServerCore / CameraLiveCore classes and/or build your own implementation classes within Gsdk to get at the raw stream. It’s actually very easy to work with once you understand the pattern(s) Parrot implemented.
But yeah, like @mvinel said, you’re on your own if you go down that path.
Thanks to your advice, I’m now able to access all the necessary telemetry information in my app using an unmanaged session.
Regarding the live stream, I haven’t found a solution yet, as I’m not a fan of altering the foundations of the SDK to access the raw stream and potentially making other parts of the SDK unstable and/or unusable.
I also tried to access the RTSP stream directly, with no success; I assume it’s not accessible while simultaneously using the Android GroundSdk.
@mvinel You said there is no public API “yet”. Do you have any information on when this feature is expected to become available? It’s the critical key feature of my app, and its absence hinders development at the moment.