Using the Qiniu Cloud Streaming (Push) SDK

Published 2019-07-13 04:08

http://www.jianshu.com/p/e4c43c6551d1

Qiniu Cloud Live Streaming: Quick Start for the Android Publisher Side


Preface

In my view, anything labeled a quick-start guide should be friction-free: you type along and it just runs. Qiniu iterates so quickly, however, that the documentation has fallen behind the code. The quick-start section has not been updated, plenty of deprecated classes and methods still appear in it, and as a result I ran into error after error when I first picked up the SDK. To be fair, if you cross-reference the release notes and the demo and adjust accordingly, it can still be called a quick start, just not the kind I had in mind. So this article walks through a quick start for the Qiniu Cloud live-streaming Android publisher based on SDK version 2.1.0. (The structure follows Qiniu's own docs.)

Quick Start

Development Environment Setup

  • Complete everything in BOOK - I first: set up an app server with the Pili server SDK. The publish information the streaming SDK consumes comes from the StreamJson returned by that server.
  • Android Studio as the IDE. Official download page
  • Download the official Android SDK. Official download page. PLDroidMediaStreaming requires Min API 15 for software encoding and Min API 18 for hardware encoding.
  • Download the latest PLDroidMediaStreaming JAR and SO files. Download page
  • Debug on a real device; streaming cannot be tested on the emulator.

Create a New Project

  • Create a Project with Android Studio

  • Configure the new project
    • Fill in the Application id
    • Fill in the Company Domain
    • Fill in the Package id
    • Choose the Project location
    • The default values are fine

  • Choose Target Android Devices
    This example uses Minimum SDK API 18 (software encoding requires Minimum SDK API 15; hardware encoding requires Minimum SDK API 18)

  • Select Empty Activity

  • Fill in the Main Activity information; this Activity acts as android.intent.action.MAIN

  • Finish creating the project


Import the SDK

  • Switch the project pane on the left to the Project view

  • Create a jniLibs directory under app/src/main, put the downloaded .so files there, and put the JAR into the app/libs directory.

  • Select pldroid-media-streaming-2.1.0.jar in the libs directory, right-click it and choose Add as Library. (I forgot to take a screenshot of this step; when I tried to redo it, the right-click option no longer appeared even after deleting the compile line from dependencies, so I just used Qiniu's screenshot.)

  • Once the import is done, double-click build.gradle and check its contents; the files in the libs directory have been imported automatically. The files involved are:

    // jar
    pldroid-media-streaming-2.1.0.jar

    // so
    libpldroid_mmprocessing.so
    libpldroid_streaming_aac_encoder.so
    libpldroid_streaming_core.so
    libpldroid_streaming_h264_encoder.so
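
For reference, after the import the relevant pieces of app/build.gradle should look roughly like the following. This is only a sketch for the Gradle tooling of that era (compile, the JAR under app/libs, the .so files under src/main/jniLibs, which Gradle picks up by default); newer Gradle versions would use implementation instead of compile:

android {
    sourceSets {
        main {
            // .so files placed under src/main/jniLibs are packaged automatically;
            // this line only makes the location explicit.
            jniLibs.srcDirs = ['src/main/jniLibs']
        }
    }
}

dependencies {
    // Added by the "Add as Library" step above
    compile files('libs/pldroid-media-streaming-2.1.0.jar')
}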

Create a Basic Streaming Example

Add the Required Permissions

  • Add the following uses-permission and uses-feature declarations to AndroidManifest.xml under app/src/main:

<manifest>
    ......
    <uses-permission android:name="android.permission.INTERNET" />
    <uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
    <uses-permission android:name="android.permission.RECORD_AUDIO" />
    <uses-permission android:name="android.permission.CAMERA" />
    <uses-permission android:name="android.permission.WAKE_LOCK" />
    <uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
    <uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS" />
    <uses-feature android:name="android.hardware.camera.autofocus" />
    <uses-feature android:glEsVersion="0x00020000" android:required="true" />
    ......
</manifest>

Add the happy-dns Dependency

  • Open build.gradle under the app directory and add the following two lines to dependencies:

dependencies {
    ......
    compile 'com.qiniu:happy-dns:0.2.+'
    compile 'com.qiniu.pili:pili-android-qos:0.8.+'
}

Implement Your Own Application

public class SimplePlayerApplication extends Application {
    @Override
    public void onCreate() {
        super.onCreate();
        StreamingEnv.init(getApplicationContext());
    }
}

Then register this Application in the application tag of AndroidManifest.xml, e.g. android:name=".SimplePlayerApplication".
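
For reference, the application element then looks roughly like this; only android:name matters here, the other attributes are whatever your project already has:

<application
    android:name=".SimplePlayerApplication"
    android:icon="@mipmap/ic_launcher"
    android:label="@string/app_name">
    ......
</application>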

Create the Main Screen

Since this is a quick-start demo, the main screen is deliberately simple: just two buttons, one to start streaming and one to watch, and only the streaming side is covered here. MainActivity does nothing more than wire up the button click handlers. I have also left out the code that checks and requests permissions (a minimal sketch of it follows the code below) and the code that asynchronously requests the stream from the app server (sketched later, after the streaming Activity). The minimal version looks like this:

public class MainActivity extends AppCompatActivity {

    private static final String TAG = "MainActivity";
    private android.widget.Button btnpili;
    private android.widget.Button btnplay;
    private boolean mPermissionEnabled = false;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        this.btnplay = (Button) findViewById(R.id.btn_play);
        this.btnpili = (Button) findViewById(R.id.btn_pili);

        // Start streaming
        btnpili.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View v) {
                startActivity(new Intent(MainActivity.this, HWCameraStreamingActivity.class));
            }
        });

        // Watch the stream
        btnplay.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View v) {
            }
        });
    }
}
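
The runtime-permission code omitted above matters on Android 6.0 and later, where CAMERA, RECORD_AUDIO and WRITE_EXTERNAL_STORAGE must also be granted at runtime. What follows is only a minimal sketch of what I left out; the request code and the mPermissionEnabled flag are this demo's own, not part of the Qiniu SDK, and it uses ContextCompat / ActivityCompat from the support library of that era:

// Inside MainActivity. Imports needed: android.Manifest, android.content.pm.PackageManager,
// android.support.v4.app.ActivityCompat, android.support.v4.content.ContextCompat,
// java.util.ArrayList, java.util.List.
private static final int REQUEST_PERMISSIONS = 1; // arbitrary request code
private static final String[] REQUIRED_PERMISSIONS = {
        Manifest.permission.CAMERA,
        Manifest.permission.RECORD_AUDIO,
        Manifest.permission.WRITE_EXTERNAL_STORAGE
};

// Check the dangerous permissions and request whatever is still missing.
private void checkPermissions() {
    List<String> missing = new ArrayList<>();
    for (String p : REQUIRED_PERMISSIONS) {
        if (ContextCompat.checkSelfPermission(this, p) != PackageManager.PERMISSION_GRANTED) {
            missing.add(p);
        }
    }
    if (missing.isEmpty()) {
        mPermissionEnabled = true;
    } else {
        ActivityCompat.requestPermissions(this, missing.toArray(new String[0]), REQUEST_PERMISSIONS);
    }
}

@Override
public void onRequestPermissionsResult(int requestCode, String[] permissions, int[] grantResults) {
    super.onRequestPermissionsResult(requestCode, permissions, grantResults);
    if (requestCode != REQUEST_PERMISSIONS) {
        return;
    }
    boolean allGranted = grantResults.length > 0;
    for (int result : grantResults) {
        allGranted &= (result == PackageManager.PERMISSION_GRANTED);
    }
    mPermissionEnabled = allGranted;
}

Call checkPermissions() in onCreate() and only launch HWCameraStreamingActivity once mPermissionEnabled is true.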

Create the Main Screen Layout

activity_main.xml:

<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:tools="http://schemas.android.com/tools"
    android:id="@+id/activity_main"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:paddingBottom="@dimen/activity_vertical_margin"
    android:paddingLeft="@dimen/activity_horizontal_margin"
    android:paddingRight="@dimen/activity_horizontal_margin"
    android:paddingTop="@dimen/activity_vertical_margin"
    tools:context="com.jcmels.liba.simpleplayerdemo.MainActivity">

    <Button
        android:id="@+id/btn_pili"
        android:layout_width="match_parent"
        android:layout_height="wrap_content"
        android:text="开始直播"/>

    <Button
        android:id="@+id/btn_play"
        android:layout_below="@id/btn_pili"
        android:layout_width="match_parent"
        android:layout_height="wrap_content"
        android:text="观看直播"/>

</RelativeLayout>

Create the Streaming Screen (this example uses HW encoding; Qiniu's demo uses SW)

  • Create an Empty Activity named HWCameraStreamingActivity. Its main responsibilities are:
    • Configure the publish URL
    • Initialize MediaStreamingManager, the core class of the streaming SDK
    • Call streamingManager.resume() in onResume()
    • After receiving the READY state, start streaming with streamingManager.startStreaming(); startStreaming() must be called off the UI thread
  • The code for HWCameraStreamingActivity is as follows:
public class HWCameraStreamingActivity extends Activity implements StreamingStateChangedListener, CameraPreviewFrameView.Listener {

    private MediaStreamingManager streamingManager;
    private StreamingProfile streamingProfile;
    private MicrophoneStreamingSetting mMicrophoneStreamingSetting;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        getWindow().addFlags(WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON);
        setContentView(R.layout.activity_hwcamera_streaming);

        AspectFrameLayout afl = (AspectFrameLayout) findViewById(R.id.cameraPreview_afl);
        afl.setShowMode(AspectFrameLayout.SHOW_MODE.REAL);

        CameraPreviewFrameView cameraPreviewFrameView =
                (CameraPreviewFrameView) findViewById(R.id.cameraPreview_surfaceView);
        cameraPreviewFrameView.setListener(this);

        String publishurl = "<replace this with your publish URL>";

        streamingProfile = new StreamingProfile();
        try {
            streamingProfile.setVideoQuality(StreamingProfile.VIDEO_QUALITY_MEDIUM2)
                    .setAudioQuality(StreamingProfile.AUDIO_QUALITY_MEDIUM2)
//                    .setPreferredVideoEncodingSize(960, 544)
                    .setEncodingSizeLevel(StreamingProfile.VIDEO_ENCODING_HEIGHT_480)
                    .setEncoderRCMode(StreamingProfile.EncoderRCModes.BITRATE_PRIORITY)
//                    .setAVProfile(avProfile)
                    .setDnsManager(getMyDnsManager())
                    .setAdaptiveBitrateEnable(true)
                    .setFpsControllerEnable(true)
                    .setStreamStatusConfig(new StreamingProfile.StreamStatusConfig(3))
                    .setPublishUrl(publishurl)
//                    .setEncodingOrientation(StreamingProfile.ENCODING_ORIENTATION.PORT)
                    .setSendingBufferProfile(new StreamingProfile.SendingBufferProfile(0.2f, 0.8f, 3.0f, 20 * 1000));

            CameraStreamingSetting setting = new CameraStreamingSetting();
            setting.setCameraId(Camera.CameraInfo.CAMERA_FACING_BACK)
                    .setContinuousFocusModeEnabled(true)
                    .setCameraPrvSizeLevel(CameraStreamingSetting.PREVIEW_SIZE_LEVEL.MEDIUM)
                    .setCameraPrvSizeRatio(CameraStreamingSetting.PREVIEW_SIZE_RATIO.RATIO_16_9);

            streamingManager = new MediaStreamingManager(this, afl, cameraPreviewFrameView,
                    AVCodecType.HW_VIDEO_WITH_HW_AUDIO_CODEC); // hw codec // soft codec

            mMicrophoneStreamingSetting = new MicrophoneStreamingSetting();
            mMicrophoneStreamingSetting.setBluetoothSCOEnabled(false);

            streamingManager.prepare(setting, mMicrophoneStreamingSetting, null, streamingProfile);
            streamingManager.setStreamingStateListener(this);
        } catch (URISyntaxException e) {
            e.printStackTrace();
        }
    }

    @Override
    protected void onResume() {
        super.onResume();
        streamingManager.resume();
    }

    @Override
    protected void onPause() {
        super.onPause();
        // You must invoke pause here.
        streamingManager.pause();
    }

    @Override
    public void onStateChanged(StreamingState streamingState, Object o) {
        switch (streamingState) {
            case PREPARING:
                break;
            case READY:
                // start streaming when READY
                new Thread(new Runnable() {
                    @Override
                    public void run() {
                        if (streamingManager != null) {
                            streamingManager.startStreaming();
                        }
                    }
                }).start();
                break;
            case CONNECTING:
                break;
            case STREAMING:
                // The av packet had been sent.
                break;
            case SHUTDOWN:
                // The streaming had been finished.
                break;
            case IOERROR:
                // Network connect error.
                break;
            case SENDING_BUFFER_EMPTY:
                break;
            case SENDING_BUFFER_FULL:
                break;
            case AUDIO_RECORDING_FAIL:
                // Failed to record audio.
                break;
            case OPEN_CAMERA_FAIL:
                // Failed to open camera.
                break;
            case DISCONNECTED:
                // The socket is broken while streaming
                break;
        }
    }

    private static DnsManager getMyDnsManager() {
        IResolver r0 = new DnspodFree();
        IResolver r1 = AndroidDnsServer.defaultResolver();
        IResolver r2 = null;
        try {
            r2 = new Resolver(InetAddress.getByName("119.29.29.29"));
        } catch (IOException ex) {
            ex.printStackTrace();
        }
        return new DnsManager(NetworkInfo.normal, new IResolver[]{r0, r1, r2});
    }

    @Override
    public boolean onSingleTapUp(MotionEvent e) {
        return false;
    }

    @Override
    public boolean onZoomValueChanged(float factor) {
        return false;
    }
}
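
The publish URL in the code above is hard-coded. In a real app it would come from your own app server (the one built with the Pili server SDK mentioned at the beginning), which returns the stream information as JSON. The sketch below only illustrates the idea; the endpoint URL and the publishUrl field name are hypothetical and depend entirely on what your server returns:

// Hedged sketch: fetch the publish URL from your own app server.
// Run it on a background thread (network access on the UI thread throws
// NetworkOnMainThreadException), then pass the result to StreamingProfile.setPublishUrl().
// Imports needed: java.io.BufferedReader, java.io.IOException, java.io.InputStreamReader,
// java.net.HttpURLConnection, java.net.URL, org.json.JSONException, org.json.JSONObject.
private static String fetchPublishUrl(String serverApi) throws IOException, JSONException {
    // serverApi is hypothetical, e.g. "http://your-app-server/api/stream"
    HttpURLConnection conn = (HttpURLConnection) new URL(serverApi).openConnection();
    conn.setConnectTimeout(5000);
    try {
        BufferedReader reader = new BufferedReader(
                new InputStreamReader(conn.getInputStream(), "UTF-8"));
        StringBuilder body = new StringBuilder();
        String line;
        while ((line = reader.readLine()) != null) {
            body.append(line);
        }
        // "publishUrl" is a made-up field name; use whatever your server actually returns.
        return new JSONObject(body.toString()).getString("publishUrl");
    } finally {
        conn.disconnect();
    }
}

To end the push, the SDK also provides streamingManager.stopStreaming(), which, like startStreaming(), is best called off the UI thread.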

Create CameraPreviewFrameView

public class CameraPreviewFrameView extends GLSurfaceView {

    private static final String TAG = "CameraPreviewFrameView";

    public interface Listener {
        boolean onSingleTapUp(MotionEvent e);
        boolean onZoomValueChanged(float factor);
    }

    private Listener mListener;
    private ScaleGestureDetector mScaleDetector;
    private GestureDetector mGestureDetector;

    public CameraPreviewFrameView(Context context) {
        super(context);
        initialize(context);
    }

    public CameraPreviewFrameView(Context context, AttributeSet attrs) {
        super(context, attrs);
        initialize(context);
    }

    public void setListener(Listener listener) {
        mListener = listener;
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        if (!mGestureDetector.onTouchEvent(event)) {
            return mScaleDetector.onTouchEvent(event);
        }
        return false;
    }

    private GestureDetector.SimpleOnGestureListener mGestureListener = new GestureDetector.SimpleOnGestureListener() {
        @Override
        public boolean onSingleTapUp(MotionEvent e) {
            if (mListener != null) {
                mListener.onSingleTapUp(e);
            }
            return false;
        }
    };

    private ScaleGestureDetector.SimpleOnScaleGestureListener mScaleListener = new ScaleGestureDetector.SimpleOnScaleGestureListener() {
        private float mScaleFactor = 1.0f;

        @Override
        public boolean onScaleBegin(ScaleGestureDetector detector) {
            return true;
        }

        @Override
        public boolean onScale(ScaleGestureDetector detector) {
            // factor > 1, zoom
            // factor < 1, pinch
            mScaleFactor *= detector.getScaleFactor();
            // Don't let the object get too small or too large.
            mScaleFactor = Math.max(0.01f, Math.min(mScaleFactor, 1.0f));
            return mListener != null && mListener.onZoomValueChanged(mScaleFactor);
        }
    };

    private void initialize(Context context) {
        Log.i(TAG, "initialize");
        mScaleDetector = new ScaleGestureDetector(context, mScaleListener);
        mGestureDetector = new GestureDetector(context, mGestureListener);
    }
}

Create the Streaming Screen Layout

activity_hwcamera_streaming.xml:

<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:tools="http://schemas.android.com/tools"
    android:id="@+id/content"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:background="@color/background_floating_material_dark"
    tools:context=".HWCameraStreamingActivity">

    <com.qiniu.pili.droid.streaming.widget.AspectFrameLayout
        android:id="@+id/cameraPreview_afl"
        android:layout_width="match_parent"
        android:layout_height="match_parent"
        android:layout_centerHorizontal="true"
        android:layout_alignParentTop="true">

        <com.jcmels.liba.simpleplayerdemo.CameraPreviewFrameView
            android:id="@+id/cameraPreview_surfaceView"
            android:layout_width="match_parent"
            android:layout_height="match_parent"
            android:layout_gravity="center" />

    </com.qiniu.pili.droid.streaming.widget.AspectFrameLayout>

</RelativeLayout>

Launch the app, tap 开始直播 (start streaming), and the push begins.
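
One more thing to double-check: HWCameraStreamingActivity must be declared in AndroidManifest.xml. Android Studio adds the declaration automatically when the Activity is created through the wizard; if you created the class by hand, add something like:

<activity android:name=".HWCameraStreamingActivity" />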

Test Playback

  • How to test: get the playback URL that corresponds to the stream from your app server and open it in a player.

Afterword

This quick start only gets us the most basic streaming functionality; more advanced features require further work.