DJI Mobile SDK real-time RTSP video streaming

See original GitHub issue

I have been tackling this issue for a week now. I have read numerous issues, a couple from this very library, but most of the answers and code snippets are no longer available.

I have taken this repo as the base for the project: https://github.com/DJI-Mobile-SDK-Tutorials/Android-VideoStreamDecodingSample and tried to integrate this library into it.

I managed to stream a very short video feed to my local machine, but then it breaks for reasons I cannot determine.

Here is what I have at this point:

// Imports assume pedroSG94's rtsp-rtmp library and the DJI Mobile SDK.
import android.media.MediaCodec;
import android.util.Pair;

import com.pedro.rtsp.rtsp.Protocol;
import com.pedro.rtsp.rtsp.RtspClient;
import com.pedro.rtsp.utils.ConnectCheckerRtsp;

import java.nio.ByteBuffer;

import dji.sdk.camera.VideoFeeder;

public class StreamVideo extends MyClassNotNecessaryToPointOut {

    private long presentTimeUs = 0L;
    private RtspClient rtspClient = new RtspClient(new ConnectCheckerRtsp() {
        @Override
        public void onConnectionSuccessRtsp() {

        }

        @Override
        public void onConnectionFailedRtsp(String reason) {

        }

        @Override
        public void onNewBitrateRtsp(long bitrate) {

        }

        @Override
        public void onDisconnectRtsp() {

        }

        @Override
        public void onAuthErrorRtsp() {

        }

        @Override
        public void onAuthSuccessRtsp() {

        }
    });

    private MediaCodec.BufferInfo videoInfo = new MediaCodec.BufferInfo();
    private boolean started;
    protected VideoFeeder.VideoDataListener mReceivedVideoDataListener = (videoBuffer, size) -> {
        videoInfo.size = size;
        videoInfo.offset = 0;
        videoInfo.flags = MediaCodec.BUFFER_FLAG_PARTIAL_FRAME;
        videoInfo.presentationTimeUs = System.nanoTime() / 1000 - presentTimeUs;
        // NOTE: this read assumes the buffer begins directly with the NAL header byte;
        // if it starts with an Annex-B start code (00 00 00 01), videoBuffer[0] is 0
        // and the type must be read from the byte after the start code.
        int naluType = videoBuffer[0] & 0x1f;
        // First keyframe received and you start the stream.
        // Change the conditional as you want, but the stream must start with a keyframe.
        if (naluType == 5 && !rtspClient.isStreaming() && started) {
            videoInfo.flags = MediaCodec.BUFFER_FLAG_KEY_FRAME;
            Pair<ByteBuffer, ByteBuffer> videoData = decodeSpsPpsFromBuffer(videoBuffer, size);
            if (videoData != null) {
                rtspClient.setIsStereo(true);
                rtspClient.setSampleRate(44100);
                presentTimeUs = System.nanoTime() / 1000;
                ByteBuffer newSps = videoData.first;
                ByteBuffer newPps = videoData.second;
                rtspClient.setSPSandPPS(newSps, newPps, null);
                rtspClient.setProtocol(Protocol.TCP);
                rtspClient.connect(this.endpoint);
            }
        }
        if (rtspClient.isStreaming()) { // only send frames once the stream is up
            ByteBuffer h264Buffer = ByteBuffer.wrap(videoBuffer);
            rtspClient.sendVideo(h264Buffer, videoInfo);
        }

    };

    private VideoFeeder.VideoFeed standardVideoFeeder;
    private String endpoint = "rtsp://192.168.1.100:8554/dji/demo";

    public StreamVideo(String endpoint) {
        this.endpoint = endpoint;
        standardVideoFeeder = VideoFeeder.getInstance().provideTranscodedVideoFeed();
        standardVideoFeeder.addVideoDataListener(mReceivedVideoDataListener);
    }

    @Override
    public void start() {
        started = true;
    }

    @Override
    public void end() {
        started = false;
        rtspClient.disconnect();
    }

    private Pair<ByteBuffer, ByteBuffer> decodeSpsPpsFromBuffer(byte[] csd, int length) {
        byte[] mSPS = null, mPPS = null;
        int i = 0;
        int spsIndex = -1;
        int ppsIndex = -1;
        // Find the first two Annex-B start codes: the SPS, then the PPS
        while (i < length - 4) {
            if (csd[i] == 0 && csd[i + 1] == 0 && csd[i + 2] == 0 && csd[i + 3] == 1) {
                if (spsIndex == -1) {
                    spsIndex = i;
                } else {
                    ppsIndex = i;
                    break;
                }
            }
            i++;
        }
        if (spsIndex != -1 && ppsIndex != -1) {
            mSPS = new byte[ppsIndex - spsIndex];
            System.arraycopy(csd, spsIndex, mSPS, 0, ppsIndex - spsIndex);
            mPPS = new byte[length - ppsIndex];
            System.arraycopy(csd, ppsIndex, mPPS, 0, length - ppsIndex);
        }
        if (mSPS != null && mPPS != null) {
            return new Pair<>(ByteBuffer.wrap(mSPS), ByteBuffer.wrap(mPPS));
        }
        return null;
    }
}

I am hosting https://github.com/pedroSG94/vlc-example-streamplayer as the server on my local machine and trying to view the stream with the VLC player.

The DJI VideoFeeder is emitting an encoded h264 buffer. My question is: what could be the reason for this bad video feed?
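For context on what the listener above is doing: decodeSpsPpsFromBuffer scans for Annex-B start codes to carve out the SPS and PPS. The same scan can be sanity-checked outside Android in plain Java; the AnnexBScanner class and the synthetic bytes below are hypothetical test scaffolding, not part of the DJI SDK or the rtsp library:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical helper: splits an Annex-B H.264 buffer at its start codes and
// reports each NAL unit's type, the same scan decodeSpsPpsFromBuffer performs
// when it looks for the SPS and PPS.
public class AnnexBScanner {

    // Returns the NAL unit types found in the buffer, in order.
    public static List<Integer> naluTypes(byte[] buf) {
        List<Integer> types = new ArrayList<>();
        int i = 0;
        while (i < buf.length - 3) {
            // Accept both 4-byte (00 00 00 01) and 3-byte (00 00 01) start codes
            if (buf[i] == 0 && buf[i + 1] == 0 && buf[i + 2] == 0
                    && i + 4 < buf.length && buf[i + 3] == 1) {
                types.add(buf[i + 4] & 0x1f); // low 5 bits of the NAL header byte
                i += 5;
            } else if (buf[i] == 0 && buf[i + 1] == 0 && buf[i + 2] == 1) {
                types.add(buf[i + 3] & 0x1f);
                i += 4;
            } else {
                i++;
            }
        }
        return types;
    }

    public static void main(String[] args) {
        // Synthetic access unit: SPS (type 7), PPS (type 8), IDR slice (type 5)
        byte[] unit = {
            0, 0, 0, 1, 0x67, 0x42, 0x00, 0x1e,   // SPS: 0x67 & 0x1f == 7
            0, 0, 0, 1, 0x68, (byte) 0xce,        // PPS: 0x68 & 0x1f == 8
            0, 0, 0, 1, 0x65, (byte) 0x88         // IDR: 0x65 & 0x1f == 5
        };
        System.out.println(naluTypes(unit)); // [7, 8, 5]
    }
}
```

A buffer whose scan never yields types 7 and 8 cannot produce the SPS/PPS pair the RTSP handshake needs, which would explain a stream that starts and then breaks.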

References:

  • https://developer.dji.com/api-reference/android-api/BaseClasses/DJIVideoFeeder.html?search=videofeed&i=2&
  • https://github.com/DJI-Mobile-SDK-Tutorials/Android-VideoStreamDecodingSample/issues/33

Issue Analytics

  • State: open
  • Created: 2 years ago
  • Comments: 17 (8 by maintainers)

Top GitHub Comments

1 reaction

hogantan commented, Apr 19, 2022

Hello all, I am also trying to stream the raw drone data to an RTSP server.

However, I am having difficulty with the keyframes of the video data. As per the above, the condition if (naluType == 5 && !rtspClient.isStreaming()) is never true for me; videoBuffer[0] is always 0.

Does anyone know how to get the naluType, or any other way of detecting a keyframe? I appreciate any advice! My code is the same as above and I am using a Mavic 2 Pro.
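One possible explanation for videoBuffer[0] always being 0: if the feed delivers Annex-B NAL units, the buffer begins with a 00 00 00 01 start code, so byte 0 is part of the start code rather than the NAL header. A start-code-aware read would look like this (plain Java with synthetic bytes; the NaluType class is illustrative, not SDK code):

```java
// Hypothetical illustration: reading the NAL unit type when the buffer may
// begin with an Annex-B start code (00 00 00 01 or 00 00 01).
public class NaluType {

    public static int naluType(byte[] buf) {
        int offset = 0;
        if (buf.length > 4 && buf[0] == 0 && buf[1] == 0 && buf[2] == 0 && buf[3] == 1) {
            offset = 4; // 4-byte start code precedes the NAL header
        } else if (buf.length > 3 && buf[0] == 0 && buf[1] == 0 && buf[2] == 1) {
            offset = 3; // 3-byte start code
        }
        return buf[offset] & 0x1f; // low 5 bits of the NAL header byte
    }

    public static void main(String[] args) {
        byte[] idr = {0, 0, 0, 1, 0x65}; // IDR keyframe, header byte 0x65
        System.out.println(idr[0]);       // 0 -- the start code, not the type
        System.out.println(naluType(idr)); // 5
    }
}
```

With this, the keyframe condition in the listener above would become naluType(videoBuffer) == 5 instead of testing videoBuffer[0] directly.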

0 reactions

supr3me commented, Nov 2, 2022

Hello,

Let me explain the reason for the naluType problem.

H.264 carries its video configuration in the SPS and PPS (NAL unit types 7 and 8), which are frequently also included alongside keyframes (NAL unit type 5). This information (sps and pps) is necessary to start a stream over RTMP or RTSP, so you need to provide it; this is the cause of the naluType problem. Normally you should get keyframes (naluType 5) at a consistent interval of X seconds (the X value is configured on the encoder side; in my case it is the iFrameInterval value of the prepareVideo method).

I looked a bit into the DJI SDK code and wrote an alternative approach that re-encodes the h264 buffer, since you are not getting the video configuration. This is the example code. It may be buggy (I can't test it). Test it and let me know if you have any problems (read the header comments for more info about it):

import android.content.Context;
import android.os.Build;

import androidx.annotation.RequiresApi;

import com.pedro.encoder.video.FormatVideoEncoder;
import com.pedro.encoder.video.GetVideoData;
import com.pedro.encoder.video.VideoEncoder;
import com.pedro.rtplibrary.view.GlInterface;
import com.pedro.rtplibrary.view.OffScreenGlThread;

import dji.sdk.camera.VideoFeeder;
import dji.sdk.codec.DJICodecManager;

/**
 * NOTE: This is not tested.
 *
 * Example code for DJI to generate valid h264 to send to the RtspClient or RtmpClient classes.
 * We are doing the following:
 * - Get frames from the DJI device in h264.
 * - Decode these frames into the GlInterface class using DJICodecManager.
 * - Copy frames from the GlInterface to the VideoEncoder surface.
 * - VideoEncoder detects that copy automatically and re-encodes it to h264 that should be valid to stream.
 *
 * After that, using the GetVideoData provided in the start method, you should get the sps and pps in the
 * onSpsPpsVps callback, where you can start your stream, and send data to the stream using the getVideoData callback.
 *
 * Extra info:
 * GlInterface is an off-screen thread that provides a surfaceTexture of OpenGl that you can render to. It also
 * allows you to copy data of that surfaceTexture to other surfaces, add filters, take a photo, etc.
 * Keep in mind that onSpsPpsVps could be called multiple times, so using a conditional is recommended.
 */
@RequiresApi(api = Build.VERSION_CODES.JELLY_BEAN_MR2)
public class DJIExample implements VideoFeeder.VideoDataListener {

  private DJICodecManager codecManager;
  private VideoEncoder videoEncoder;
  private GlInterface glInterface;
  //encoder and decoder configuration
  private final int width = 640;
  private final int height = 480;
  private final int fps = 30;
  private final int bitrate = 1200 * 1000;
  private final int iFrameInterval = 2;
  private boolean init = false;


  public void start(Context context, GetVideoData getVideoData) {
    if (!init) { //register the video data listener only once
      try {
        VideoFeeder.getInstance().getPrimaryVideoFeed().addVideoDataListener(this);
        init = true;
      } catch (Exception ignored) { }
    }
    videoEncoder = new VideoEncoder(getVideoData);
    videoEncoder.prepareVideoEncoder(width, height, fps, bitrate, 0, iFrameInterval, FormatVideoEncoder.SURFACE);
    glInterface = new OffScreenGlThread(context);
    glInterface.init();
    glInterface.setEncoderSize(width, height);
    glInterface.start();
    videoEncoder.start();
    glInterface.addMediaCodecSurface(videoEncoder.getInputSurface());
    codecManager = new DJICodecManager(context, glInterface.getSurfaceTexture(), width, height);
  }

  public void stop() {
    glInterface.removeMediaCodecSurface();
    videoEncoder.stop();
    glInterface.stop();
    codecManager.cleanSurface();
    videoEncoder = null;
    glInterface = null;
    codecManager = null;
  }

  @Override
  public void onReceive(byte[] bytes, int i) {
    if (codecManager != null) {
      codecManager.sendDataToDecoder(bytes, i);
    }
  }
}

Hi @pedroSG94, I have a problem: in the SDK I use getVideoStartCodeSize() to process the h264 data and get the naluType, but I only ever get naluType 1/7/9, never 8 or 5.

How can I push this data over RTMP using your library? Thank you very much!
