This is one of my end applications of the Overo. I want to mount this device onto a rover/UAV and get real-time streaming of good-quality video. I also didn't want the operation to dominate the CPU; ideally my other applications will be able to run alongside the video stream. Therefore, I had to use the DSP. So once I got it running it was time to stream.
Scott's discussion and example pipelines were great, but I had previously tested some GStreamer code on Linux machines that I wanted to try. I got the code from here; there's a whole bunch of sample files. What I initially did was change the server-v4l2-H264-alsasrc-PCMA.sh code to suit the TI codecs.
Scott's discussion and example pipelines are excellent; here they are:
GStreamer Pipelines
On the Ubuntu client with an IP of 192.168.10.4, run a GStreamer command like the following:
$ gst-launch-0.10 -v udpsrc port=4000
caps='application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264'
! rtph264depay ! ffdec_h264 ! xvimagesink sync=false
Then on the Gumstix side run this, substituting your workstation's IP address:
root@caspa:~# gst-launch -v videotestsrc ! video/x-raw-yuv,width=640,height=480
! TIVidenc1 codecName=h264enc engineName=codecServer ! rtph264pay pt=96
! udpsink host=192.168.10.4 port=4000
The example above is for the TI chips; for the Freescale i.MX6, the hardware-accelerated parts of the pipeline need to be changed. The highlighted parts are what I modified.
On the i.MX6 side, streaming to the PC:
root@freescale ~$ gst-launch -v videotestsrc ! video/x-raw-yuv,width=640,height=480 ! vpuenc codec=avc ! rtph264pay pt=96 ! udpsink host=172.21.78.36 port=4000
On the Ubuntu PC side, receiving:
cbh@cbh:Desktop$ gst-launch -vvv udpsrc port=4000 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264" ! rtph264depay ! ffdec_h264 ! xvimagesink
Let's unpack this (all pipeline properties can be inspected with gst-inspect):
On the i.MX6 side we stream a videotestsrc test source over UDP/RTP, setting the source's properties, or caps: video/x-raw-yuv,width=640,height=480. These caps apply to the videotestsrc stage of the pipeline.
The raw video then goes through the i.MX6 hardware encoder vpuenc (from gst-fsl-plugins); this element's sink and src pads only accept a fixed range of formats:
Pad Templates:
SINK template: 'sink'
Availability: Always
Capabilities:
video/x-raw-yuv
format: TNVP
video/x-raw-yuv
format: NV12
video/x-raw-yuv
format: I420
SRC template: 'src'
Availability: Always
Capabilities:
video/mpeg
mpegversion: 4
video/x-h263
video/x-h264
image/jpeg
Clearly video/x-raw-yuv falls within the allowed sink range, and the SRC side can be any of the listed encodings. Here we choose codec=avc (i.e. video/x-h264).
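For reference, pad-template dumps like the one above come straight from gst-inspect; to reproduce the vpuenc listing on the board:
root@freescale ~$ gst-inspect vpuenc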
OK, after encoding to H.264, the rtph264pay element packs the stream into RTP packets and sends them on; the payload type of the packets is set to 96. rtph264pay likewise has constraints: its sink must be video/x-h264. The outgoing caps describe an RTP video stream with a 90000 Hz clock and H.264 encoding, as follows:
Pad Templates:
SRC template: 'src'
Availability: Always
Capabilities:
application/x-rtp
media: video
payload: [ 96, 127 ]
clock-rate: 90000
encoding-name: H264
SINK template: 'sink'
Availability: Always
Capabilities:
video/x-h264
Finally, the src output feeds the udpsink element, i.e. the assembled RTP packets are transmitted over UDP, with the relevant properties such as host and port set on that element.
OK ===> In the test example above, the receiving command must be started first, and only then the sender on the i.MX6. How to remove this restriction: sprop-parameter-sets.
The commands above require the client to be receiving before the server starts sending; you cannot start the sender first and then the receiver. To drop this restriction, add the sprop-parameter-sets parameter to the receiver's caps. Its value is taken from the output of the sending command.
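To find the value, watch the sender's -v output for the caps negotiated on rtph264pay's src pad; you should see a line roughly like this (the base64 values are the ones that end up in the receiver command below, yours will differ):
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0.GstPad:src: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, sprop-parameter-sets=(string)"Z0JAHqaAoD2QAA\=\=\,aM4wpIAA", payload=(int)96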
cbh@cbh:Desktop$ gst-launch -v udpsrc port=4000 caps='application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, sprop-parameter-sets=(string)"Z0JAHqaAoD2QAA\=\=\,aM4wpIAA"' ! rtph264depay ! ffdec_h264 ! xvimagesink
With that change you can start and stop the client or server in any order.
Parsing the receiver side:
cbh@cbh:Desktop$ gst-launch -vvv udpsrc port=4000 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264" ! rtph264depay ! ffdec_h264 ! xvimagesink sync=false
The above is equivalent to:
cbh@cbh:Desktop$ gst-launch udpsrc uri=udp://172.21.78.36:4000 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264" ! rtph264depay ! ffdec_h264 ! xvimagesink sync=false
The receiver listens on udpsrc; what this source outputs arrived over UDP and is RTP-wrapped H.264, hence the caps specified on it. rtph264depay then strips the RTP packaging, leaving H.264 data, which ffdec_h264 decodes; its src outputs raw RGB or YUV:
Pad Templates:
SRC template: 'src'
Availability: Always
Capabilities:
video/x-raw-rgb
video/x-raw-yuv
SINK template: 'sink'
Availability: Always
Capabilities:
video/x-h264
width: [ 16, 4096 ]
height: [ 16, 4096 ]
framerate: [ 0/1, 2147483647/1 ]
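If the client machine lacks Xv support, a hedged variant is to convert the decoded frames and let GStreamer pick a sink, mirroring the Windows pipeline later in this post:
cbh@cbh:Desktop$ gst-launch -v udpsrc port=4000 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264" ! rtph264depay ! ffdec_h264 ! ffmpegcolorspace ! autovideosink sync=false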
OK, let's look at another example: streaming the Ubuntu machine's camera to the i.MX6 for display. The latency is very low, but it is quite CPU-heavy on the Ubuntu side.
OK===>
ubuntu:
cbh@cbh:Desktop$ gst-launch v4l2src device=/dev/video0 ! 'video/x-raw-yuv,width=640,height=480' ! x264enc pass=qual quantizer=20 tune=zerolatency ! rtph264pay ! udpsink host=172.21.78.63 port=4000 -v
You can omit device=/dev/video0, since the camera defaults to video0:
device : Device location
flags: readable, writable
String. Default: "/dev/video0"
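That default comes from the element's property listing, which you can dump yourself:
cbh@cbh:Desktop$ gst-inspect v4l2src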
imx6:
gst-launch -v udpsrc port=4000 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000,encoding-name=(string)H264" ! rtph264depay ! vpudec ! mfw_v4lsink sync=false
This can be simplified to the following:
imx6:
gst-launch -v udpsrc port=4000 caps="application/x-rtp" ! rtph264depay ! vpudec ! mfw_v4lsink sync=false
vpudec can also work out the H.264 stream adaptively, so there is no need to specify the exact codec.
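To confirm which compressed formats vpudec's sink pad accepts (it negotiates the codec from the upstream caps), inspect it on the board:
root@freescale ~$ gst-inspect vpudec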
Note: compared with the first example (i.MX6 streaming to the PC), this second example (PC camera streaming to the i.MX6 for display) needs no sprop-parameter-sets; even without that change you can start and stop the client or server in any order, with no restriction. I don't know why.
========================================
Now let's go back to the original author's method. It draws on the examples in gst-plugins-good/tests/examples/rtp, adapted for the HW codec:
gst-launch -v gstrtpbin name=rtpbin \
    v4l2src ! queue ! videorate ! ffmpegcolorspace ! video/x-raw-yuv,width=640,height=480,framerate=15/1 \
    ! TIVidenc1 codecName=h264enc engineName=codecServer ! rtph264pay ! rtpbin.send_rtp_sink_0 \
    rtpbin.send_rtp_src_0 ! udpsink port=5000 host=$DEST ts-offset=0 name=vrtpsink \
    rtpbin.send_rtcp_src_0 ! udpsink port=5001 host=$DEST sync=false async=false name=vrtcpsink \
    udpsrc port=5005 name=vrtpsrc ! rtpbin.recv_rtcp_sink_0
Let's check gstrtpbin's pad layout:
Pad Templates:
SINK template: 'recv_rtp_sink_%d'
Availability: On request
Has request_new_pad() function: gst_rtp_bin_request_new_pad
Capabilities:
application/x-rtp
SINK template: 'recv_rtcp_sink_%d'
Availability: On request
Has request_new_pad() function: gst_rtp_bin_request_new_pad
Capabilities:
application/x-rtcp
SINK template: 'send_rtp_sink_%d'
Availability: On request
Has request_new_pad() function: gst_rtp_bin_request_new_pad
Capabilities:
application/x-rtp
SRC template: 'recv_rtp_src_%d_%d_%d'
Availability: Sometimes
Capabilities:
application/x-rtp
SRC template: 'send_rtcp_src_%d'
Availability: On request
Has request_new_pad() function: gst_rtp_bin_request_new_pad
Capabilities:
application/x-rtcp
SRC template: 'send_rtp_src_%d'
Availability: Sometimes
Capabilities:
application/x-rtp
Now let's work through the examples in gst-plugins-good one at a time. First, the script server-alsasrc-PCMA.sh:
#!/bin/sh
#
# A simple RTP server
# sends the output of autoaudiosrc as alaw encoded RTP on port 5002, RTCP is sent on
# port 5003. The destination is 127.0.0.1.
# the receiver RTCP reports are received on port 5007
#
# .--------. .-------. .-------. .----------. .-------.
# |audiosrc| |alawenc| |pcmapay| | rtpbin | |udpsink| RTP
# | src->sink src->sink src->send_rtp send_rtp->sink | port=5002
# '--------' '-------' '-------' | | '-------'
# | |
# | | .-------.
# | | |udpsink| RTCP
# | send_rtcp->sink | port=5003
# .-------. | | '-------' sync=false
# RTCP |udpsrc | | | async=false
# port=5007 | src->recv_rtcp |
# '-------' '----------'
# change this to send the RTP data and RTCP to another host
DEST=127.0.0.1
#AELEM=autoaudiosrc
AELEM=audiotestsrc
# PCMA encode from an audio test source
ASOURCE="$AELEM ! audioconvert"
AENC="alawenc ! rtppcmapay"
gst-launch -v gstrtpbin name=rtpbin \
    $ASOURCE ! $AENC ! rtpbin.send_rtp_sink_0 \
    rtpbin.send_rtp_src_0 ! udpsink port=5002 host=$DEST \
    rtpbin.send_rtcp_src_0 ! udpsink port=5003 host=$DEST sync=false async=false \
    udpsrc port=5007 ! rtpbin.recv_rtcp_sink_0
This uses the gstrtpbin element, which exposes three sink pads and three src pads:
1. The encoded data goes into the element's send_rtp_sink pad and comes back out of send_rtp_src, feeding a udpsink that transmits the RTP data.
2. At the same time, send_rtcp_src feeds RTCP data to another udpsink.
3. recv_rtcp_sink takes the RTCP reports arriving on a udpsrc.
So this server uses four of the pads: two to send RTP data (sink->src), one to send RTCP data (src), and one to receive RTCP data (sink). The remaining two pads, for receiving RTP data (sink->src), belong on the client side ======> six in total! Their use is covered in the client script.
Also note: the udpsink ports for RTP and RTCP are separate, and the udpsrc port for receiving RTCP reports is separate as well.
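A minimal sketch of the matching client, modeled on client-PCMA.sh from the same examples directory, shows those remaining two pads in action (recv_rtp_sink takes the RTP stream; the decoded audio comes back out of the sometimes pad recv_rtp_src, linked here via "rtpbin."). The caps string is the standard PCMA one and the ports mirror the server above:
gst-launch -v gstrtpbin name=rtpbin \
    udpsrc caps="application/x-rtp, media=(string)audio, clock-rate=(int)8000, encoding-name=(string)PCMA" port=5002 ! rtpbin.recv_rtp_sink_0 \
    rtpbin. ! rtppcmadepay ! alawdec ! audioconvert ! audioresample ! autoaudiosink \
    udpsrc port=5003 ! rtpbin.recv_rtcp_sink_0 \
    rtpbin.send_rtcp_src_0 ! udpsink port=5007 host=127.0.0.1 sync=false async=false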
To be continued ~~~~~~~~~~
You'll notice I removed all the audio chunks; at the moment I don't have a mic, so what's the point? I did, however, leave in all the control flow (RTCP). This is nice for syncing up the streams. I don't fully understand what's going on, but I have played with the timing and seen the video change with the RTCP settings; basically, performance seemed to improve. In the above code, I set DEST to the IP address or DNS name of my client.
The first client I tested was a Windows machine with VLC and this client file, client-PCMA.sdp. Things worked OK (as long as you open the client first and fire up the server before the connection times out). However, there was a pretty good lag (roughly 1 sec) in the video. So I tried out the Linux sh file client-H264.sh and things looked much better. I also don't know if it had anything to do with it, but I set up the ntpdate package on the Overo so that it would correctly set the date (I think RTCP uses time stamping to improve performance, but I could be wrong). I figured having the correct date/time would be good. To get it to display the correct time zone I used export TZ=PST8PDT. However, I can't get it to be persistent; I tried putting it in a file at /etc/TZ but that didn't work. I'll play around with it later.
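One hedged idea for making it stick, assuming login shells on the Overo read /etc/profile (I haven't verified this on my image): append the export there.
root@caspa:~# echo 'export TZ=PST8PDT' >> /etc/profile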
Now I wanted to stream stuff to a Windows machine. I did a quick search and found this website. The OSSBuild comes with some precompiled binaries, one of which is gst-launch. With that binary I used the following command:
gst-launch -v gstrtpbin name=rtpbin latency=50 udpsrc caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, sprop-parameter-sets=(string)"Z0KAHukBQHpCAAAH0AAB1MAIAA\=\=\,aM48gAA\="" port=5000 ! rtpbin.recv_rtp_sink_0 rtpbin. ! rtph264depay ! ffdec_h264 ! ffmpegcolorspace ! autovideosink udpsrc port=5001 ! rtpbin.recv_rtcp_sink_0 rtpbin.send_rtcp_src_0 ! udpsink port=5005 host=$OVERO sync=false async=false
I set OVERO to the IP/DNS name of the Overo wifi adapter. You'll notice I added the sprop-parameter-sets as per Scott's instructions so that either the client or server can be started first. I also messed around with the latency value. It's definitely dependent on the speed of the client: some I could set low, others needed to be higher. I believe it adds a bit of lag to the video. Anyway, what I got was really good video with very little lag. Also, one of the reasons I was doing this test was to see the CPU load. So ... with 640x480 at 15 frames per second over wifi I got about 20% CPU load.
top - 20:17:26 up 37 min, 2 users, load average: 0.08, 0.06, 0.01
Tasks: 67 total, 1 running, 66 sleeping, 0 stopped, 0 zombie
Cpu(s): 2.0%us, 0.8%sy, 0.0%ni, 95.9%id, 0.8%wa, 0.6%hi, 0.0%si, 0.0%st
Mem: 469008k total, 71380k used, 397628k free, 3200k buffers
Swap: 0k total, 0k used, 0k free, 46712k cached
PID USER PR NI VIRT RES SHR S %CPU %MEM TIME+ COMMAND
1013 root 20 0 87720 7660 5132 S 19.2 1.6 0:34.72 gst-launch-0.10
This is pretty sweet; ready to test other stuff while streaming video. I still need to play with the latency. It's not very large but definitely noticeable. I haven't determined if it's the client or the server. Now that I have an LCD screen (thanks Don) I'm going to try looping back the video and see if the latency is apparent, stay tuned.
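A rough loopback sketch for that test, keeping everything on the Overo so the network is out of the picture; TIViddec2 and TIDmaiVideoSink are the gstreamer-ti decode and display elements, but treat the element names and properties as assumptions and check gst-inspect on your own image:
root@caspa:~# gst-launch videotestsrc ! video/x-raw-yuv,width=640,height=480 ! TIVidenc1 codecName=h264enc engineName=codecServer ! TIViddec2 codecName=h264dec engineName=codecServer ! TIDmaiVideoSink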
Last modified Sun, 4 Sep, 2011 at 16:41