
Android live video streaming issue

I have a question about live video streaming from Android devices, and I hope someone can give me some useful advice or suggestions. So here is the deal:

I have a project for video streaming from an Android device. The idea is to connect two devices to a server: the first one captures live video and uploads it to the server, and the second one connects to the server and watches the stream from the first device. So the connection looks like this:

First Device ----live streaming----> Web Server ----live streaming----> Second Device, where the second device connects to the web server.

Any suggestions or advice on how I can do that and what I should use? I would be really glad to hear your ideas.

Thanks in advance!


I was playing with something similar. Basically:

android video capture --> upstream to server --> transcoding --> streaming to players
  • the video is captured by a mobile phone (currently only a proof of concept for Android) and delivered to a server
  • the server performs transcoding from the original format (H.263 and AMR-NB within a .3gp container) to Flash video, so that it can be played in the vast majority of browsers
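The server-side transcoding step could be driven roughly like this. This is only a minimal sketch: the exact ffmpeg flags and the slice file names are my assumptions, not something specified in the original post.

```python
import shlex

def build_transcode_cmd(src_3gp, dst_flv):
    """Build an ffmpeg command line that converts an uploaded .3gp
    slice (H.263 video + AMR-NB audio) into an FLV file that
    Flash-based web players can handle."""
    return [
        "ffmpeg",
        "-i", src_3gp,            # input slice uploaded by the phone
        "-vcodec", "flv",         # Flash video codec
        "-acodec", "libmp3lame",  # MP3 audio, playable inside FLV
        "-ar", "22050",           # sample rate accepted by FLV/MP3
        dst_flv,
    ]

# Example: transcode one uploaded slice (hypothetical file names).
cmd = build_transcode_cmd("slice_0001.3gp", "slice_0001.flv")
print(shlex.join(cmd))
```

On the server you would run this command with something like `subprocess.run(cmd)` once each slice has finished uploading.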

My biggest issue is that with H.263 I'm not able to upstream the video to the server live. The video file has a header which indicates, among other things, its length. For the video to be transcoded (I use ffmpeg), that header must be present, but Android writes it only AFTER video capture has finished. So as a workaround, I capture the video in e.g. 5-second slices.
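The slicing workaround described above boils down to a loop: record a fixed-length clip, let the platform finalize the file (so the header gets written), upload the finished file, then start the next clip. Here is a minimal sketch of that loop; `record_slice` and `upload` are hypothetical placeholders standing in for the real Android MediaRecorder and HTTP upload code, not actual APIs.

```python
SLICE_SECONDS = 5  # short clips keep end-to-end latency low

def record_slice(index, seconds):
    # Placeholder: on Android this would drive MediaRecorder for
    # `seconds` and return the path of the finished .3gp file,
    # whose header is now complete and safe to transcode.
    return f"slice_{index:04d}.3gp"

def upload(path):
    # Placeholder: POST the finished slice to the web server.
    print(f"uploading {path}")

def stream_in_slices(n_slices):
    # Record and upload fixed-length clips back to back; each file
    # is a complete, self-contained video the server can transcode.
    uploaded = []
    for i in range(n_slices):
        path = record_slice(i, SLICE_SECONDS)
        upload(path)
        uploaded.append(path)
    return uploaded

stream_in_slices(3)
```

The trade-off of this design is a few seconds of extra latency per slice, in exchange for every uploaded file being independently playable and transcodable.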

Take a look at http://code.google.com/p/moteve and feel free to contribute :-)
