DingRTC allows you to input external audio and video streams. This topic describes how to push external audio and video data into the SDK.

Input an external video stream

Note

The SDK allows you to start publishing first and enable external video input later. In that case, the video sent at the beginning comes from the local capture source (camera or screen capture), and your external frames are sent only after external input is enabled.

  1. Call setExternalVideoSource to enable external video input.

    // Create the DingRtcEngine instance
    DingRtcEngine mRtcEngine = DingRtcEngine.create(getApplicationContext(), "");
    // Enable external video input (enable = true; useTexture = false to push raw byte[] frames)
    mRtcEngine.setExternalVideoSource(
        true,
        false,
        DingRtcEngine.DingRtcVideoTrack.DingRtcVideoTrackCamera,
        DingRtcEngine.DingRtcRenderMode.DingRtcRenderModeAuto);
  2. Push the video data.

    The method signature is as follows.

    public abstract int pushExternalVideoFrame(DingRtcRawDataFrame rawDataFrame, DingRtcVideoTrack streameType);

    The parameters of pushExternalVideoFrame are described in the following table.

    Parameter | Type | Description
    --- | --- | ---
    rawDataFrame | DingRtcRawDataFrame | The frame data.
    streameType | DingRtcVideoTrack | The video stream type.
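    For an I420 frame, the buffer layout determines the lineSize and videoFrameLength values used in the example below. The following short calculation is a sketch for the sample 480x640 resolution used later in this topic; the variable names are illustrative only.

    // Plane sizes and strides for a tightly packed 480x640 I420 frame
    int width = 480, height = 640;
    int ySize = width * height;               // 307200 bytes, stride = width
    int uSize = ySize / 4;                    // 76800 bytes,  stride = width / 2
    int vSize = ySize / 4;                    // 76800 bytes,  stride = width / 2
    int frameLength = ySize + uSize + vSize;  // width * height * 3 / 2 = 460800 bytes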

    A complete example that reads I420 data from a file and pushes it frame by frame is shown below.

    private void pushVideoFrame() {
        String yuvPath = "<path to your yuv file>";
        if (TextUtils.isEmpty(yuvPath)) {
            // Select a YUV file first
            return;
        }
        File yuvDataFile = new File(yuvPath);
        if (!yuvDataFile.exists()) {
            // The file does not exist
            return;
        }

        // Sample value only; use the actual rotation of your stream
        int rotation = 90;
        // Sample values only; use the actual width and height of your stream
        int width = 480;
        int height = 640;
        int fps = 25;
        // Push YUV frames on a worker thread
        new Thread() {
            @Override
            public void run() {
                RandomAccessFile raf = null;
                try {
                    raf = new RandomAccessFile(yuvDataFile, "r");
                } catch (FileNotFoundException e) {
                    e.printStackTrace();
                    return;
                }
                try {
                    byte[] buffer = new byte[width * height * 3 / 2];
                    while (!mIsStopPushVideo) { // mIsStopPushVideo starts or stops video pushing
                        long start = System.currentTimeMillis();
                        int len = raf.read(buffer);
                        if (len == -1) {
                            // End of file reached; loop back to the beginning
                            raf.seek(0);
                            continue;
                        }
                        DingRtcEngine.DingRtcRawDataFrame rawDataFrame = new DingRtcEngine.DingRtcRawDataFrame();
                        rawDataFrame.format = DingRtcEngine.DingRtcVideoFormat.DingRtcVideoFormatI420; // I420 is supported
                        rawDataFrame.frame = buffer;
                        rawDataFrame.width = width;
                        rawDataFrame.height = height;
                        rawDataFrame.lineSize[0] = width;     // Y plane stride
                        rawDataFrame.lineSize[1] = width / 2; // U plane stride
                        rawDataFrame.lineSize[2] = width / 2; // V plane stride
                        rawDataFrame.lineSize[3] = 0;
                        rawDataFrame.rotation = rotation;
                        rawDataFrame.videoFrameLength = buffer.length;
                        rawDataFrame.timestamp = System.nanoTime() / 1000;
                        if (mRtcEngine != null) {
                            mRtcEngine.pushExternalVideoFrame(rawDataFrame, DingRtcEngine.DingRtcVideoTrack.DingRtcVideoTrackCamera);
                        }
                        // Control the push frame rate by sleeping for the remainder of the frame interval
                        long use = System.currentTimeMillis() - start;
                        long delay = 1000 / fps - use;
                        if (delay < 0) {
                            delay = 0;
                        }
                        Thread.sleep(delay);
                    }
                } catch (IOException | InterruptedException ex) {
                    ex.printStackTrace();
                } finally {
                    try {
                        raf.close();
                    } catch (IOException e) {
                        e.printStackTrace();
                    }
                }
            }
        }.start();
    }
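    If your source delivers NV21 frames, as Android camera preview callbacks commonly do, convert them to I420 before pushing. The following is a minimal sketch that assumes a tightly packed NV21 buffer; the helper name nv21ToI420 is illustrative and not part of the SDK.

    // Illustrative helper: convert a tightly packed NV21 frame (Y plane followed by
    // interleaved V/U samples) into a tightly packed I420 frame (Y plane, then U plane,
    // then V plane) that can be assigned to rawDataFrame.frame.
    private static byte[] nv21ToI420(byte[] nv21, int width, int height) {
        int ySize = width * height;
        byte[] i420 = new byte[ySize * 3 / 2];
        // The Y plane is identical in both layouts
        System.arraycopy(nv21, 0, i420, 0, ySize);
        int uIndex = ySize;             // start of the U plane in I420
        int vIndex = ySize + ySize / 4; // start of the V plane in I420
        for (int i = 0; i < ySize / 2; i += 2) {
            i420[vIndex++] = nv21[ySize + i];     // V sample
            i420[uIndex++] = nv21[ySize + i + 1]; // U sample
        }
        return i420;
    }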
  3. Call setExternalVideoSource again to disable external video input.

    // Disable external video input on the existing DingRtcEngine instance (enable = false)
    mRtcEngine.setExternalVideoSource(
        false,
        false,
        DingRtcEngine.DingRtcVideoTrack.DingRtcVideoTrackCamera,
        DingRtcEngine.DingRtcRenderMode.DingRtcRenderModeAuto);

    The parameters of setExternalVideoSource are described in the following table.

    Parameter | Type | Description
    --- | --- | ---
    enable | boolean | Whether to enable the external video input source. Valid values: true: enable. false (default): disable.
    useTexture | boolean | Whether to use texture mode. Valid values: true: use texture mode. false (default): do not use texture mode.
    type | DingRtcVideoTrack | The video stream type.
    renderMode | DingRtcRenderMode | The render mode.

Input an external audio stream

  1. Call setExternalAudioSource to enable external audio input.

    // Create the DingRtcEngine instance
    DingRtcEngine mRtcEngine = DingRtcEngine.create(getApplicationContext(), "");
    // Enable the external audio input source: 44100 Hz sample rate, 1 channel
    mRtcEngine.setExternalAudioSource(true, 44100, 1);
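    The sample rate and channel count passed here determine how much data each pushExternalAudioFrame call carries. The following short calculation is a sketch for the values used above, assuming 16-bit PCM and 10 ms of audio per push, matching the example in step 2.

    // Bytes per 10 ms push for 44100 Hz, mono, 16-bit PCM
    int sampleRate = 44100;
    int numChannels = 1;
    int bytesPerSample = 2;                                            // 16-bit PCM
    int samplesPer10ms = sampleRate * 10 / 1000;                       // 441 samples
    int bytesPer10ms = samplesPer10ms * numChannels * bytesPerSample;  // 882 bytes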
  2. Call pushExternalAudioFrame to push the audio data.

    private void decodePCMRawData() {
        String pcmPath = "/sdcard/123.pcm";
        if (TextUtils.isEmpty(pcmPath)) {
            ToastUtils.LongToast("Select a PCM file first");
            return;
        }
        File pcmDataFile = new File(pcmPath);
        if (!pcmDataFile.exists()) {
            ToastUtils.LongToast(pcmPath + " does not exist");
            return;
        }
        ToastUtils.LongToast("Pushing PCM data");

        new Thread() {
            @Override
            public void run() {
                RandomAccessFile raf = null;
                try {
                    raf = new RandomAccessFile(pcmDataFile, "r");
                } catch (FileNotFoundException e) {
                    e.printStackTrace();
                    return;
                }
                try {
                    int sampleRate = 44100;
                    int length_ms = 10;                  // push 10 ms of audio per frame
                    int numChannels = 1;
                    int bytePerSample = 2 * numChannels; // 16-bit PCM: 2 bytes per channel per sample
                    ByteBuffer byteBuffer = ByteBuffer.allocateDirect(sampleRate * length_ms * bytePerSample / 1000);
                    final int sizeInBytes = byteBuffer.capacity();
                    // Direct buffers have no accessible backing array, so read into a byte[] first
                    byte[] readBuffer = new byte[sizeInBytes];
                    while (true) {
                        long start = System.currentTimeMillis();
                        int len = raf.read(readBuffer);
                        if (len == -1) {
                            // End of file reached; loop back to the beginning
                            raf.seek(0);
                            continue;
                        }
                        byteBuffer.clear();
                        byteBuffer.put(readBuffer, 0, sizeInBytes);
                        byteBuffer.rewind();
                        DingRtcEngine.DingRtcAudioFrame audioFrame = new DingRtcEngine.DingRtcAudioFrame();
                        audioFrame.data = byteBuffer;
                        audioFrame.numSamples = sampleRate * length_ms / 1000;
                        audioFrame.bytesPerSample = bytePerSample;
                        audioFrame.numChannels = numChannels;
                        audioFrame.samplesPerSec = sampleRate;
                        mRtcEngine.pushExternalAudioFrame(audioFrame);
                        // Pace the pushes so that roughly one frame is sent every length_ms milliseconds
                        long end = System.currentTimeMillis();
                        long sleep = length_ms - (end - start);
                        if (sleep > 0) {
                            SystemClock.sleep(sleep);
                        }
                    }
                } catch (IOException ex) {
                    ex.printStackTrace();
                } finally {
                    try {
                        raf.close();
                    } catch (IOException e) {
                        e.printStackTrace();
                    }
                }
            }
        }.start();
    }
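    If the audio comes from a live source rather than a file, the same push path applies. The sketch below captures 16-bit mono PCM from the microphone with Android's AudioRecord and pushes it in 10 ms chunks; the method name pushMicAudio and the stop flag mIsStopPushAudio are illustrative, and it assumes setExternalAudioSource(true, 44100, 1) has already been called as in step 1. The RECORD_AUDIO permission is required.

    // Illustrative sketch: push live microphone PCM as external audio input.
    private void pushMicAudio() {
        int sampleRate = 44100;
        int numChannels = 1;
        int chunkBytes = sampleRate * 10 / 1000 * numChannels * 2; // 10 ms of 16-bit mono PCM
        int minBuf = AudioRecord.getMinBufferSize(sampleRate,
                AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
        AudioRecord recorder = new AudioRecord(MediaRecorder.AudioSource.MIC, sampleRate,
                AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT,
                Math.max(minBuf, chunkBytes * 2));
        recorder.startRecording();
        new Thread(() -> {
            byte[] chunk = new byte[chunkBytes];
            ByteBuffer byteBuffer = ByteBuffer.allocateDirect(chunkBytes);
            while (!mIsStopPushAudio) { // mIsStopPushAudio is an assumed stop flag
                // read() blocks until 10 ms of audio is available, so no extra pacing is needed
                int read = recorder.read(chunk, 0, chunkBytes);
                if (read <= 0) {
                    continue;
                }
                byteBuffer.clear();
                byteBuffer.put(chunk, 0, read);
                byteBuffer.rewind();
                DingRtcEngine.DingRtcAudioFrame audioFrame = new DingRtcEngine.DingRtcAudioFrame();
                audioFrame.data = byteBuffer;
                audioFrame.numSamples = read / (2 * numChannels);
                audioFrame.bytesPerSample = 2 * numChannels;
                audioFrame.numChannels = numChannels;
                audioFrame.samplesPerSec = sampleRate;
                if (mRtcEngine != null) {
                    mRtcEngine.pushExternalAudioFrame(audioFrame);
                }
            }
            recorder.stop();
            recorder.release();
        }).start();
    }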