It's just that I assume a single invocation of setCameraOrientation sets the absolute tilt and pan values for the camera. So if I call setCameraOrientation(-60, 0) while the camera is at (0, 0), it should move to (-60, 0), correct?
What happens instead is that the camera starts moving in the right direction (downwards in this example) but stops after ~0.5 s, before reaching the desired position. The result is that it ends up somewhere between (0, 0) and (-60, 0), at approximately (-5, 0), because it only moved downwards for that short time.
I found a workaround: I keep sending the same command (setCameraOrientation(-60, 0)) repeatedly, and then the camera's movement doesn't stop until it reaches the desired angle. I still think it would be more practical if a single invocation were enough to make the camera move all the way to the target angle.
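To make the behavior and the workaround concrete, here is a minimal sketch. The simulated camera, the angular speed, and the method names are my assumptions for illustration only (the real setCameraOrientation API may differ); the only things taken from my observations are that each command moves the camera for roughly 0.5 s and that resending the command until the target is reached works around it.

```python
import math

# Assumed figures for illustration: each command moves the camera for
# ~0.5 s (observed) at a guessed speed of 10 deg/s, i.e. ~5 deg per call.
MOVE_DURATION = 0.5   # seconds of actual motion per command (observed)
SPEED = 10.0          # assumed angular speed in degrees per second

class SimCamera:
    """Hypothetical stand-in for the real camera controller."""
    def __init__(self):
        self.tilt = 0.0
        self.pan = 0.0

    def set_camera_orientation(self, tilt, pan):
        """Move toward (tilt, pan), but stop after MOVE_DURATION seconds."""
        max_step = SPEED * MOVE_DURATION  # degrees covered before it stops
        for attr, target in (("tilt", tilt), ("pan", pan)):
            current = getattr(self, attr)
            delta = target - current
            step = math.copysign(min(abs(delta), max_step), delta) if delta else 0.0
            setattr(self, attr, current + step)

def move_with_workaround(cam, tilt, pan, tol=0.1, max_tries=100):
    """The workaround: resend the same command until the target is reached."""
    for _ in range(max_tries):
        if abs(cam.tilt - tilt) <= tol and abs(cam.pan - pan) <= tol:
            return True
        cam.set_camera_orientation(tilt, pan)
    return False

cam = SimCamera()
cam.set_camera_orientation(-60, 0)
print(cam.tilt)   # single call stops early at -5.0, like the real camera

move_with_workaround(cam, -60, 0)
print(cam.tilt)   # repeated calls reach -60.0
```

With the assumed speed, a single call leaves the camera at (-5, 0), matching what I see; the resend loop is what I currently do to reach (-60, 0).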
I hope that makes it clearer,