Playing Video in Unity / Using TextureView + SurfaceTexture + OpenGL ES to Play Video

The steps to play a video in Unity are as follows:

Unity 5.6 introduced the VideoPlayer component, which makes video playback fairly straightforward. I did some quick research into it for a project and ran into quite a few pitfalls; a Google/Baidu search shows these problems really are common. Some of them are:

Before getting into the code, let me explain what TextureView, SurfaceTexture, and OpenGL ES actually are, and how I use the three of them to display a video.

Note: I recently did a feasibility study for a live-streaming project, so I'm writing down how the streaming was implemented and how some of the problems along the way were solved, using the Android implementation to illustrate.

一.将要播放的录像拖入projec。(注意:unity①般扶助的录像格式有mov, .mpg,
.mpeg, .mp3,.avi, .asf格式  )

1) Playback has no sound

TextureView is, as its name suggests, just a view control that extends View. The official site explains it like this:
A TextureView can be used to display a content stream. Such a content
stream can for instance be a video or an OpenGL scene. The content
stream can come from the application’s process as well as a remote
process.

So it can display a content stream, such as a video stream or an OpenGL-rendered scene. The stream can come from the application's own process or from a remote process. A bit convoluted; my understanding is that it can show, say, either a local video stream or a network video stream.
Note that TextureView renders with hardware acceleration. It's like hardware versus software video decoding: one relies on the GPU, the other on the CPU.
So how do we use this TextureView? OK, this is where SurfaceTexture comes on stage. From the names alone you can tell that TextureView is mainly a View, while SurfaceTexture is mainly a texture. Its official description:
Captures frames from an image stream as an OpenGL ES texture. The image
stream may come from either camera preview or video decode.

In other words, it can capture frames from an image stream to use as an OpenGL texture. That image stream mainly comes from a camera preview or a decoded video. (I suspect this feature could be used for a lot of other things too.)
At this point we have a texture, so OpenGL ES can get to work: it binds the texture and draws it frame by frame onto the TextureView, which produces the video images we see. (For details on SurfaceTexture and TextureView, see here.)
Enough talk; time to look at some code. Good code should read as pleasantly as good literature. Not that the code I write is that beautiful; it's just something to aim for...

Project structure

2. Add a RawImage to the scene. (Image renders with a Sprite, while RawImage renders with a Texture.)

2) Controlling playback progress with a Slider

Code

Let's start with the MainActivity class:

import java.io.IOException;

import android.graphics.SurfaceTexture;
import android.media.MediaPlayer;
import android.os.Bundle;
import android.os.Environment;
import android.support.v7.app.AppCompatActivity;
import android.view.Surface;
import android.view.TextureView;

public class MainActivity extends AppCompatActivity implements TextureView.SurfaceTextureListener,
        MediaPlayer.OnPreparedListener {
    /** path to the local video file */
    public String videoPath = Environment.getExternalStorageDirectory().getPath()+"/aoa.mkv";
    private TextureView textureView;
    private MediaPlayer mediaPlayer;
    /**
    * The pre-draw configuration happens in the class this object belongs to;
    * the actual drawing work is done in its subclass, VideoTextureSurfaceRenderer.
    */
    private TextureSurfaceRenderer videoRenderer;
    private int surfaceWidth;
    private int surfaceHeight;
    private Surface surface;


    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        textureView = (TextureView) findViewById(R.id.id_textureview);
        //register a SurfaceTextureListener to be notified when the SurfaceTexture becomes available
        textureView.setSurfaceTextureListener(this);

    }
    /**
    * Entry point for video playback; called once the SurfaceTexture is available.
    */
    private void playVideo() {
        if (mediaPlayer == null) {
            videoRenderer = new VideoTextureSurfaceRenderer(this, textureView.getSurfaceTexture(), surfaceWidth, surfaceHeight);
            surface = new Surface(videoRenderer.getSurfaceTexture());
            initMediaPlayer();
        }
    }

    private void initMediaPlayer() {
        this.mediaPlayer = new MediaPlayer();
        try {
            mediaPlayer.setDataSource(videoPath);
            mediaPlayer.setSurface(surface);
            mediaPlayer.setLooping(true);
            //set the listener before preparing so the prepared callback can't be missed
            mediaPlayer.setOnPreparedListener(this);
            mediaPlayer.prepareAsync();
        } catch (IOException | IllegalArgumentException | SecurityException | IllegalStateException e) {
            e.printStackTrace();
        }
    }
    @Override
    public void onPrepared(MediaPlayer mp) {
        try {
            if (mp != null) {
                mp.start(); //the video actually starts playing here
            }
        } catch (IllegalStateException e) {
            e.printStackTrace();
        }
    }


    @Override
    protected void onResume() {
        super.onResume();
        if (textureView.isAvailable()) {
            playVideo();
        }
    }

    @Override
    protected void onPause() {
        super.onPause();
        if (videoRenderer != null) {
            videoRenderer.onPause();  //remember to stop the video drawing thread
        }
        if (mediaPlayer != null) {
            mediaPlayer.release();
            mediaPlayer =null;
        }
    }

    @Override
    public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int height) {
        surfaceWidth = width;
        surfaceHeight = height;
        playVideo();
    }

    @Override
    public void onSurfaceTextureSizeChanged(SurfaceTexture surface, int width, int height) {

    }

    @Override
    public boolean onSurfaceTextureDestroyed(SurfaceTexture surface) {
        return false;
    }

    @Override
    public void onSurfaceTextureUpdated(SurfaceTexture surface) {

    }

}

That's the entry class of the program. I won't go into how MediaPlayer actually plays its video source here; there's quite a lot inside that you can explore on your own. One thing I do want to point out: the thing people usually hand MediaPlayer is a SurfaceView's SurfaceHolder (via setDisplay()), whereas here I use a Surface directly with setSurface() (see here for background on Surface). That's where this video player differs from the usual ones. I'll stop this article here for now; the core drawing work will follow in a later post when I have time. If there are problems with the above, I hope you'll all point them out. Many thanks!
The next article is already written: TextureView + SurfaceTexture + OpenGL ES video playback (2).

  • Unity texture plugin and video capture (video source)
    VideoSourceCamera
  • Microphone capture (audio source)
    AudioSourceMIC
  • Video encoding
    VideoEncoder
  • Audio encoding
    AudioEncoder
  • FLV muxing
    MuxerFLV
  • HTTP stream publishing (upload)
    PublisherHttp
  • Streamed video playback (replay)
    play
  • OpenGL image processing

3. Add a VideoPlayer component under the RawImage, then assign the video to the VideoPlayer by dragging it onto the Video Clip slot.

3) Taking a video screenshot (Texture → Texture2D)

Starting from this article, I'll describe the implementation details of these components and how the dependencies between them are handled.

4. Create a script PlayVideoOnUGUI. The core line is rawImage.texture = videoPlayer.texture: assign the VideoPlayer's texture to the RawImage and you will see the video play. A minimal sketch of such a script follows.
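A minimal sketch of that script, assuming the VideoPlayer sits on the same GameObject (the class and field names here are mine, not from the original project):

    using UnityEngine;
    using UnityEngine.UI;
    using UnityEngine.Video;

    // Sketch: mirror the VideoPlayer's output texture onto a RawImage every frame.
    public class PlayVideoOnUGUI : MonoBehaviour {
        public RawImage rawImage;          // assigned in the Inspector
        private VideoPlayer videoPlayer;

        void Start() {
            videoPlayer = GetComponent<VideoPlayer>();
            videoPlayer.Play();
        }

        void Update() {
            // videoPlayer.texture is only non-null once playback has produced a frame
            if (videoPlayer.texture != null)
                rawImage.texture = videoPlayer.texture;
        }
    }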

4) Getting an event when the video finishes

(1) —— Unity texture plugin

咱俩的直播项目服务于unity,而unity是一个跨平台的游艺引擎,底层依照差别平台,采纳了directx,
opengl, opengles, 由此须要贯彻区别平台的图纸插件。
(unity的图样插件文档)
https://docs.unity3d.com/Manual/NativePluginInterface.html
在anroid平台下的直播,unity图形插件效能重若是渲染线程布告,
因为不论是录像采访,创造苹果平板,
图像处理(shader),依旧编码录制纹理传入,都亟需工作在unity的渲染线程下,
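On the C# side, the usual pattern for getting onto the render thread is GL.IssuePluginEvent. A hedged sketch, assuming a native library named VideoSourceCamera that exports GetRenderEventFunc and SetTargetTextureID (both export names are my assumptions, modeled on Unity's native rendering plugin sample):

    using System;
    using System.Collections;
    using System.Runtime.InteropServices;
    using UnityEngine;

    public class RenderThreadNotifier : MonoBehaviour {
        // assumed native exports; see the NativePluginInterface docs linked above
        [DllImport("VideoSourceCamera")]
        private static extern IntPtr GetRenderEventFunc();
        [DllImport("VideoSourceCamera")]
        private static extern void SetTargetTextureID(int textureId);

        public Texture2D targetTexture;    // texture created on the Unity side

        IEnumerator Start() {
            // pass the Unity texture ID to the plugin
            SetTargetTextureID((int)targetTexture.GetNativeTexturePtr());
            while (true) {
                yield return new WaitForEndOfFrame();
                // asks Unity to invoke the plugin callback on its render thread
                GL.IssuePluginEvent(GetRenderEventFunc(), 1);
            }
        }
    }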

  • Unity creates the texture and passes the texture ID to the streaming plugin.

  • Open the camera device and get the capture texture ready:
    mCameraGLTexture =
    new GLTexture(width, height, GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
    GLES20.GL_RGBA);
    note: the camera texture is a special kind of texture, created with the
    GLES11Ext.GL_TEXTURE_EXTERNAL_OES target

  • A callback notifies us as each frame of data becomes ready:

        public void onFrameAvailable(final SurfaceTexture surfaceTexture)
        {
            //push the image from the capture thread over to the render thread
            getProcessor().append(new Task() {
                @Override
                public void run() {
                    surfaceTexture.updateTexImage();
                }
            });
        }

    The camera texture also needs a special declaration in the fragment shader:

      #extension GL_OES_EGL_image_external : require
      precision mediump float;
      uniform samplerExternalOES uTexture0;
      varying vec2 texCoordinate;
      void main(){
          gl_FragColor = texture2D(uTexture0, texCoordinate);
      }
    
  • Write the camera texture into Unity's texture.
    There are two ways to copy one texture into another:

    • Via glReadPixels, but that causes huge memory copies and CPU pressure.

    • Render to texture:
      mTextureCanvas = new
      GLRenderTexture(mGLTexture); //declare the render texture

        void renderCamera2Texture()
        {
            mTextureCanvas.begin();
            cameraDrawObject.draw();
            mTextureCanvas.end();
        }
      

      The implementation of GLRenderTexture looks like this:

        GLRenderTexture(GLTexture tex)
        {
            mTex = tex;
            int fboTex = tex.getTextureID();
            GLES20.glGenFramebuffers(1, bufferObjects, 0);
            GLHelper.checkGlError("glGenFramebuffers");
            fobID = bufferObjects[0];

            //create the render buffer
            GLES20.glGenRenderbuffers(1, bufferObjects, 0);
            renderBufferId = bufferObjects[0];
            //bind the frame buffer
            GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, fobID);
            GLHelper.checkGlError("glBindFramebuffer");
            //Bind render buffer and define buffer dimension
            GLES20.glBindRenderbuffer(GLES20.GL_RENDERBUFFER, renderBufferId);
            GLHelper.checkGlError("glBindRenderbuffer");
            GLES20.glRenderbufferStorage(GLES20.GL_RENDERBUFFER, GLES20.GL_DEPTH_COMPONENT16, tex.getWidth(), tex.getHeight());
            GLHelper.checkGlError("glRenderbufferStorage");
            //attach the texture as the framebuffer's color attachment
            GLES20.glFramebufferTexture2D(GLES20.GL_FRAMEBUFFER, GLES20.GL_COLOR_ATTACHMENT0, GLES20.GL_TEXTURE_2D, fboTex, 0);
            GLHelper.checkGlError("glFramebufferTexture2D");
            //attach the depth buffer
            GLES20.glFramebufferRenderbuffer(GLES20.GL_FRAMEBUFFER, GLES20.GL_DEPTH_ATTACHMENT, GLES20.GL_RENDERBUFFER, renderBufferId);
            GLHelper.checkGlError("glFramebufferRenderbuffer");
            //we are done, reset
            GLES20.glBindRenderbuffer(GLES20.GL_RENDERBUFFER, 0);
            GLHelper.checkGlError("glBindRenderbuffer");
            GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);
            GLHelper.checkGlError("glBindFramebuffer");
        }
      
        void begin()
        {
            GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, fobID);
            GLHelper.checkGlError("glBindFramebuffer");
            GLES20.glViewport(0, 0, mTex.getWidth(), mTex.getHeight());
            GLHelper.checkGlError("glViewport");
        }
      
        void end()
        {
            GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);
        }
      
  • Beauty filter
    Real-time beautification (skin whitening, skin smoothing) is implemented with shaders.
    (For the principles behind the beauty filter, see)
    http://meituplus.com/?p=101
    (for more real-time shader processing, see)
    https://github.com/wuhaoyu1990/MagicCamera

None of this is especially problematic. Solutions to the four problems listed above are given further below; first an introduction to using the VideoPlayer, and then each problem is dealt with in turn.

 

(1) Creating a Video Player: you can add the Video Player component to a UI object, or just right-click in the Hierarchy and choose Video → Video Player. Once added you'll see the component shown below.

(Figure 1: the Video Player component in the Inspector)

This article focuses on a few of the parameters. Source has two modes, Clip and URL: in Clip mode you play a VideoClip directly, in URL mode you play from a URL. Render Mode sets how the video is rendered and can be Camera, Material Override, and so on; for UI playback use Render Texture, which is the mode chosen in this article. Audio Output Mode has three options: None, Direct (untested), and Audio Source. This article uses Audio Source mode: you only need to drag an AudioSource component into the Audio Source slot of the VideoPlayer shown above, with no further setup. Occasionally, though, the Audio Source slot loses its reference after dragging and no sound plays, so it's usually safer to add the components in code, as shown below:

 

      //add the components in code
        videoPlayer = gameObject.AddComponent<VideoPlayer>();
        //videoPlayer = gameObject.GetComponent<VideoPlayer>();
        audioSource = gameObject.AddComponent<AudioSource>();
        //audioSource = gameObject.GetComponent<AudioSource>();
        videoPlayer.playOnAwake = false;
        audioSource.playOnAwake = false;
        audioSource.Pause();

        //route the video's audio through the AudioSource (this is the fix for silent playback)
        videoPlayer.audioOutputMode = VideoAudioOutputMode.AudioSource;
        videoPlayer.SetTargetAudioSource(0, audioSource);

 

(2) Controlling video playback is much like controlling audio or animation: VideoPlayer has Play/Pause methods and so on; see the full code at the end for the details.

A note on the video-finished event, loopPointReached (the name of this section borrows someone else's phrasing; the event is not really a "playback finished" event). As the name suggests, it fires when playback reaches the video's loop point: when the VideoPlayer's isLooping is true (i.e. the video loops), it is invoked as the video ends. When the video does not loop, in my tests the event could not be counted on at the end of playback. So to use it, set the video to loop and stop playback inside the handler registered on loopPointReached, as sketched below.
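A minimal sketch of that workaround (class and handler names are mine):

    using UnityEngine;
    using UnityEngine.Video;

    // Sketch: treat loopPointReached as an end-of-video event by looping and stopping in the handler.
    public class VideoEndHandler : MonoBehaviour {
        void Start() {
            var vp = GetComponent<VideoPlayer>();
            vp.isLooping = true;                   // loop so the event fires reliably at the end
            vp.loopPointReached += OnVideoFinished;
            vp.Play();
        }

        private void OnVideoFinished(VideoPlayer player) {
            player.Stop();                         // stop instead of letting playback wrap around
            Debug.Log("video finished");
        }
    }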

(3)关于录制播放的ui选取题材,选取render texture时须要钦命target
texture。

      
1) In the Project panel, Create → Render Texture, then drag the new RenderTexture onto the VideoPlayer's Target Texture slot.

      
2) In the Hierarchy panel, create UI → Raw Image and drag the RenderTexture from the previous step onto the RawImage's Texture slot. The same wiring can also be done in code, as sketched right after this.
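A sketch of doing that wiring in code (the 1280×720 size is an assumption; match it to your video):

    using UnityEngine;
    using UnityEngine.UI;
    using UnityEngine.Video;

    // Sketch: create a RenderTexture and point both the VideoPlayer and the RawImage at it.
    public class RenderTextureWiring : MonoBehaviour {
        public VideoPlayer videoPlayer;
        public RawImage rawImage;

        void Start() {
            var rt = new RenderTexture(1280, 720, 0);      // size assumed; no depth buffer needed
            videoPlayer.renderMode = VideoRenderMode.RenderTexture;
            videoPlayer.targetTexture = rt;
            rawImage.texture = rt;
        }
    }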

      
Actually you don't have to do any of that: VideoPlayer has a texture property, and you can simply assign it to the RawImage's texture in Update. The code is just:

rawImage.texture = videoPlayer.texture;

      For screenshots you can grab the image through videoPlayer.texture, but you need to convert the Texture into a Texture2D. Although the latter inherits from the former, you can't just cast back down, so the conversion and image-saving code is as follows:

   private void SaveRenderTextureToPNG(Texture inputTex, string file)
    {
        //blit the source texture into a temporary RenderTexture so its pixels can be read back
        RenderTexture temp = RenderTexture.GetTemporary(inputTex.width, inputTex.height, 0, RenderTextureFormat.ARGB32);
        Graphics.Blit(inputTex, temp);
        Texture2D tex2D = GetRTPixels(temp);
        RenderTexture.ReleaseTemporary(temp);
        File.WriteAllBytes(file, tex2D.EncodeToPNG());
    }

    private Texture2D GetRTPixels(RenderTexture rt)
    {
        //make the RenderTexture active, read its pixels into a Texture2D, then restore the old one
        RenderTexture currentActiveRT = RenderTexture.active;
        RenderTexture.active = rt;
        Texture2D tex = new Texture2D(rt.width, rt.height);
        tex.ReadPixels(new Rect(0, 0, tex.width, tex.height), 0, 0);
        RenderTexture.active = currentActiveRT;
        return tex;
    }
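A hedged usage example, inside the VideoController script shown in the full code below: save the current video frame when a capture button is clicked (imageDir is the author's path from that script; OnCaptureButton is my name):

    public void OnCaptureButton()
    {
        string file = Path.Combine(imageDir, DateTime.Now.ToString("mm_ss") + ".png");
        SaveRenderTextureToPNG(videoPlayer.texture, file);
    }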

 

Finally, a word on controlling playback progress with a Slider.

Driving playback from a slider involves two sides that conflict: on one hand, Update keeps writing videoPlayer.time into the slider; on the other, the slider's value has to be fed back into time. If you do the feedback in the slider's OnValueChanged(float value) callback, the two writes fight each other and cause glitches. So use the UI BeginDrag and EndDrag events instead: on BeginDrag, stop writing to the slider's value; on EndDrag, apply the new value and start again. As shown below.

(Figure 2: the Begin Drag / End Drag events wired up on the Slider)
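If you'd rather wire those drag events up in code than in the Inspector, here's a sketch using an EventTrigger on the slider (OnPointerDown/OnPointerUp are the handlers from the full code below; the class name is mine):

    using UnityEngine;
    using UnityEngine.EventSystems;

    // Sketch: register BeginDrag/EndDrag on the slider object through an EventTrigger.
    public class SliderDragEvents : MonoBehaviour {
        public VideoController controller;   // the script from the full code below

        void Start() {
            var trigger = gameObject.AddComponent<EventTrigger>();

            var begin = new EventTrigger.Entry { eventID = EventTriggerType.BeginDrag };
            begin.callback.AddListener(_ => controller.OnPointerDown());
            trigger.triggers.Add(begin);

            var end = new EventTrigger.Entry { eventID = EventTriggerType.EndDrag };
            end.callback.AddListener(_ => controller.OnPointerUp());
            trigger.triggers.Add(end);
        }
    }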

Full code

using System;
using System.Collections;
using System.Collections.Generic;
using System.IO;
using UnityEngine;
using UnityEngine.UI;
using UnityEngine.Video;

public class VideoController : MonoBehaviour {
    public GameObject screen;
    public Text videoLength;
    public Text currentLength;
    public Slider volumeSlider;
    public Slider videoSlider;

    private string video1Url;
    private string video2Url;
    private VideoPlayer videoPlayer;
    private AudioSource audioSource;
    private RawImage videoScreen;
    private float lastCountTime = 0;
    private float totalPlayTime = 0;
    private float totalVideoLength = 0;

    private bool b_firstVideo = true;
    private bool b_adjustVideo = false;
    private bool b_skip = false;
    private bool b_capture = false;

    private string imageDir =@"D:\test\Test\bwadmRe";

    // Use this for initialization
    void Start () {
        videoScreen = screen.GetComponent<RawImage>();
        string dir = Path.Combine(Application.streamingAssetsPath,"Test");
        video1Url = Path.Combine(dir, "01.mp4");
        video2Url = Path.Combine(dir, "02.mp4");

        //add the components in code
        videoPlayer = gameObject.AddComponent<VideoPlayer>();
        //videoPlayer = gameObject.GetComponent<VideoPlayer>();
        audioSource = gameObject.AddComponent<AudioSource>();
        //audioSource = gameObject.GetComponent<AudioSource>();
        videoPlayer.playOnAwake = false;
        audioSource.playOnAwake = false;
        audioSource.Pause();

        videoPlayer.audioOutputMode = VideoAudioOutputMode.AudioSource;
        videoPlayer.SetTargetAudioSource(0, audioSource);

        VideoInfoInit(video1Url);
        videoPlayer.loopPointReached += OnFinish;
    }

    #region private method
    private void VideoInfoInit(string url)
    {
        videoPlayer.source = VideoSource.Url;
        videoPlayer.url = url;        

        videoPlayer.prepareCompleted += OnPrepared;
        videoPlayer.isLooping = true;

        videoPlayer.Prepare();
    }

    private void OnPrepared(VideoPlayer player)
    {
        player.Play();
        totalVideoLength = videoPlayer.frameCount / videoPlayer.frameRate;
        videoSlider.maxValue = totalVideoLength;
        videoLength.text = FloatToTime(totalVideoLength);

        lastCountTime = 0;
        totalPlayTime = 0;
    }

    private string FloatToTime(float time)
    {
        int hour = (int)time / 3600;
        int min = (int)(time - hour * 3600) / 60;
        int sec = (int)(time - hour * 3600) % 60;
        string text = string.Format("{0:D2}:{1:D2}:{2:D2}", hour, min, sec);
        return text;
    }

    private IEnumerator PlayTime(int count)
    {
        for(int i=0;i<count;i++)
        {
            yield return null;
        }
        videoSlider.value = (float)videoPlayer.time;
        //videoSlider.value = videoSlider.maxValue * (time / totalVideoLength);
    }

    private void OnFinish(VideoPlayer player)
    {
        Debug.Log("finished");        
    }

    private void SaveRenderTextureToPNG(Texture inputTex, string file)
    {
        RenderTexture temp = RenderTexture.GetTemporary(inputTex.width, inputTex.height, 0, RenderTextureFormat.ARGB32);
        Graphics.Blit(inputTex, temp);
        Texture2D tex2D = GetRTPixels(temp);
        RenderTexture.ReleaseTemporary(temp);
        File.WriteAllBytes(file, tex2D.EncodeToPNG());
    }

    private Texture2D GetRTPixels(RenderTexture rt)
    {
        RenderTexture currentActiveRT = RenderTexture.active;
        RenderTexture.active = rt;
        Texture2D tex = new Texture2D(rt.width, rt.height);
        tex.ReadPixels(new Rect(0, 0, tex.width, tex.height), 0, 0);
        RenderTexture.active = currentActiveRT;
        return tex;
    }
    #endregion

    #region public method
    //play
    public void OnStart()
    {
        videoPlayer.Play();
    }
    //pause
    public void OnPause()
    {
        videoPlayer.Pause();
    }
    //next video
    public void OnNext()
    {
        string nextUrl = b_firstVideo ? video2Url : video1Url;
        b_firstVideo = !b_firstVideo;

        videoSlider.value = 0;
        VideoInfoInit(nextUrl);
    }
    //volume control
    public void OnVolumeChanged(float value)
    {
        audioSource.volume = value;
    }
    //video progress: writing videoPlayer.time here conflicts with Update, so it's handled in the drag handlers below
    public void OnVideoChanged(float value)
    {
        //videoPlayer.time = value;
        //print(value);
    }
    public void OnPointerDown()
    {
        b_adjustVideo = true;
        b_skip = true;
        videoPlayer.Pause();
        //OnVideoChanged();
        //print("down");
    }
    public void OnPointerUp()
    {
        videoPlayer.time = videoSlider.value;

        videoPlayer.Play();
        b_adjustVideo = false;  
        //print("up");
    }
    public void OnCapture()
    {
        b_capture = true;
    }
    #endregion

    // Update is called once per frame
    void Update () {
        if (videoPlayer.isPlaying)
        {            
            videoScreen.texture = videoPlayer.texture;
            float time = (float)videoPlayer.time;
            currentLength.text = FloatToTime(time);

            if(b_capture)
            {
                string name = DateTime.Now.Minute.ToString() + "_" + DateTime.Now.Second.ToString() + ".png";
                SaveRenderTextureToPNG(videoPlayer.texture,Path.Combine(imageDir,name));                
                b_capture = false;
            }

            if(!b_adjustVideo)
            {
                totalPlayTime += Time.deltaTime;
                if (!b_skip)
                {
                    videoSlider.value = (float)videoPlayer.time;
                    lastCountTime = totalPlayTime;
                }                
                if (totalPlayTime - lastCountTime >= 0.8f)
                {
                    b_skip = false;
                }
            }
            //StartCoroutine(PlayTime(15));   

        }
    }
}

If you use the AVPro Video plugin, none of these problems really comes up.
