Android: how to record, upload, transcode, download, and play video

I'm researching the development of an Android (2.2) app/service that will let users record short (I do emphasize short, < 30 seconds) video on their phones and then upload that video over HTTP to a server, which will transcode the video to other formats. The same user can download videos from other Android users and play them.
Now, I get a bit lost with everyone's recommended approaches to all the issues involved, because I haven't seen anyone ask about this in a cohesive context. Ideally I would like a non-commercial solution (as in, no vendor/service being needed for the video hosting/transcoding), but feel free to include those as recommendations (I've marked this as a wiki), as I know many like to use YouTube and Vimeo for the middle layer in all this.
The questions are:
What server technologies do you recommend for hosting and transcoding?
What technology do you recommend for streaming the video? (It would be nice to offer a high- and a low-quality encoding depending on the user's network connection.)
What video format and software do you recommend for converting the uploaded video on the server so it is viewable later by other Android owners?
I'm assuming it's bad to do any transcoding on the phone prior to upload (battery/processor issues), but if I'm wrong in that assumption, what do you recommend?
Some things that may help you:
The video will only need to render on an Android device, and in the future in a WebKit HTML5 browser.
Bandwidth isn't cheap (even with numerous 30-second videos), so a good mix of video quality and video file size is important (streaming if needed to ensure quality vs. download).
This is for Android 2.2 devices with a video camera, of course, and a medium- to high-density screen of 800x400 minimum.
Open-source solutions (a server to receive the uploads, code to do the transcoding, a server to do the streaming) are preferred, but not required.
CDNs are an option, but I don't think that really figures into the picture right now.

Check out this page to see all the video formats that Android supports for encoding and decoding.
http://developer.android.com/guide/appendix/media-formats.html
For encoding, use FFmpeg or a service like encoding.com.
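As a concrete sketch of the server-side transcode, here is how producing two FFmpeg renditions might look. The file names, bitrates, and resolutions are assumptions to adapt to your pipeline; H.264 Baseline profile with AAC audio in MP4 is the safest target for Android 2.2 per the media-formats page above, and `-movflags +faststart` moves the MP4 index to the front so playback can start before the download finishes (older FFmpeg builds may need `-strict experimental` for the built-in AAC encoder):

```shell
# Low-quality rendition for slow connections (assumed names and bitrates).
ffmpeg -y -i upload.3gp \
  -c:v libx264 -profile:v baseline -level 3.0 -b:v 300k -vf "scale=-2:240" \
  -c:a aac -b:a 64k -ac 2 \
  -movflags +faststart low.mp4

# High-quality rendition for Wi-Fi / fast connections.
ffmpeg -y -i upload.3gp \
  -c:v libx264 -profile:v baseline -level 3.0 -b:v 800k -vf "scale=-2:480" \
  -c:a aac -b:a 96k -ac 2 \
  -movflags +faststart high.mp4
```

The app can then pick `low.mp4` or `high.mp4` based on the user's detected connection type.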

Related

What audio format is natively supported on all platforms, both for recording and playback?

We're creating a range of apps that record the user's voice for a wide range of applications. Users can register their ideas, describe a scene, or give educational tips and notes to someone else.
We need to choose a file format that satisfies these conditions:
Better to be playable natively on Android, iOS, and the web
Better to reduce the cost of encoding-decoding
Better to reduce the cost of development (we're not sound experts)
Storage is not a big deal, so compression is not important, but network traffic IS a big deal, so for that reason better to be as compact as possible
The most obvious choice that comes to mind is MP3, but to our surprise, MP3 encoding is not supported by the Android SDK out of the box.
We searched and tried to find best practices for this, and again to our surprise, there is not much written, in spite of the huge use of sound and voice everywhere.
For example, in this post it's written that MP3 is the most used file format, followed by AAC. But we're totally unfamiliar with AAC.
So, what audio format is natively supported on all platforms, both for recording and playback?
The file format can be .AAC: it is compressed; it is compatible with iOS, the web, and Android (3.1 or higher); and it was developed jointly by companies including Fraunhofer IIS, Sony, and Nokia (that last point is just extra information).
You can see all of its compatible OSes on Wikipedia: AAC Wikipedia
It is also compatible with webOS.
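If you standardize on AAC end to end, server-side normalization is also cheap. A minimal FFmpeg sketch (the file names and bitrate are assumptions) that converts whatever the device recorded into a plain AAC `.m4a` for the web:

```shell
# Drop any video track (-vn) and re-encode the audio to AAC at 64 kbps,
# which is plenty for voice recordings.
ffmpeg -i recording.3gp -vn -c:a aac -b:a 64k recording.m4a
```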

Most instant way to stream live video to iOS and Android

I'm making an app that needs to send a video feed from a single source to a server where it can be accessed by desktop browsers and mobile apps.
So far, I've been using Adobe Media Server 5 with a live RTMP stream. This gives me about a 2.5-second delay in desktop browsers. RTMP has no native support on iOS, but that leaves me the option of using Air to export the app for iOS, which produces a minimum 5-6 second delay.
The iOS docs strongly recommend HTTP Live Streaming, which segments the stream into chunks and serves them using a dynamic playlist in an .m3u8 file. Doing this produces a 15+ second delay in desktop browsers and on mobile devices. A Google search seems to confirm that this is to be expected from HLS.
I need a maximum of 2-4 seconds of delay across all devices, if possible. I've gotten poor results with Wowza, but am open to revisiting it. FFmpeg seems inefficient, but I'm open to that as well, if someone has had good results with it. Anybody have any suggestions? Thanks in advance.
I haven't even begun to find the most efficient way to stream to Android, so any help in that department would be much appreciated.
EDIT: Just to be clear, my plan is to make an iOS app, whether it's written natively or in Air. Same goes for Android, but I've yet to start on that.
In the iOS browser, HLS is the only way to serve live video. The absolute lowest latency would be to use 2-second segments with a two-segment window in the manifest. This gives you 4 seconds of latency on the client, plus another 2 to 4 on the server. There is no way to do better without writing an app.
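For reference, the segment length and playlist window described above map directly onto FFmpeg's HLS muxer options. The input URL, output path, and encoder settings here are assumptions, and older FFmpeg builds may lack some of these flags:

```shell
# 2-second segments, with only 2 segments kept in the playlist window.
# The keyframe interval (-g 48, i.e. 2 s at ~24 fps) should line up with
# segment boundaries so every segment starts on a keyframe.
ffmpeg -i rtmp://localhost/live/stream \
  -c:v libx264 -preset veryfast -tune zerolatency -g 48 \
  -c:a aac -b:a 96k \
  -f hls -hls_time 2 -hls_list_size 2 -hls_flags delete_segments \
  /var/www/stream/index.m3u8
```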
A 15-second delay for HLS streams is pretty good; to get lower latency you need to use a different streaming protocol.
RTP/RTSP will give you the lowest latency and is typically used for VoIP and video conferencing, but you will find it very difficult to use across multiple mobile and Wi-Fi networks (some of them unintentionally block RTP).
If you can write an iOS app that supports RTMP, then that is the easiest way to go, and it should work on Android too (though only older Android versions support Flash/RTMP natively). Decoding in software will result in poor battery life. There are other iOS apps that don't use HLS for streaming, but I think you need to limit them to your own service (not a generic video player).
Also, please remember that higher latency means higher video quality, less buffering, and a better user experience, so don't reduce latency unnecessarily.

transcoding movie files between android and ios devices

I have an iOS and Android app that allows users to capture videos and post on our server, then allows them to email the videos to other members.
The problem is that videos taken on iOS devices do not play on Android devices; I think the reverse is not a problem.
Does anyone know of any server-side video transcoding tools that are pretty easy to set up, so that I can convert all videos into a common format that will play on any device?
Take a look at FFmpeg.
You can use it in your back-end system. This task should be asynchronous, as it can be time-consuming. You also need to limit the size of the media about to be transcoded, and how many jobs run at once, as these tasks can rapidly cause serious performance issues.
On the other hand, an Android user may be able to play media shared by an iOS user with one of the many free FFmpeg-based players on the Play Store, such as MX Player. So building a whole server-side transcoding system might be significant overhead.
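To make the transcoding asynchronous, as suggested, the simplest possible shape is a watch-folder worker. This is only a sketch (all paths are assumptions); a real deployment would use a proper job queue and concurrency limits:

```shell
#!/bin/sh
# Naive watch-folder worker: transcode each upload in INBOX into a
# broadly compatible H.264/AAC MP4 in OUTBOX, then delete the original.
INBOX=/srv/uploads/incoming
OUTBOX=/srv/uploads/ready

while true; do
  for f in "$INBOX"/*; do
    [ -f "$f" ] || continue
    out="$OUTBOX/$(basename "${f%.*}").mp4"
    ffmpeg -y -i "$f" -c:v libx264 -profile:v baseline -c:a aac "$out" \
      && rm -- "$f"
  done
  sleep 5   # crude poll interval; inotify would be better on Linux
done
```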

Format of video url

I have a video URL (either progressive download or streaming) and I want to determine its format to check whether it complies with Android's supported media formats. How can I check this? Is it enough if it works on the emulator?
(As far as I understand, testing it on a device is not a guarantee, since some Android devices might support formats that others do not.)
Testing it on a device is not a guarantee, as you said; I agree. The documentation itself suggests testing on as many different devices as possible, and this is something we can only learn through experience.
As of now, I haven't come across a method in MediaPlayer/VideoView that can extract meta information from videos.
Can you have a server-side script which tells you the video's details?
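A server-side script along those lines can use ffprobe (FFmpeg's companion inspection tool) to report the container and codecs, which you can then compare against Android's supported-formats table. The file name is an assumption; a URL can also be passed directly:

```shell
# Print the container format plus each stream's type, codec, and dimensions.
ffprobe -v error \
  -show_entries format=format_name:stream=codec_type,codec_name,width,height \
  -of default=noprint_wrappers=1 \
  input_video.mp4
```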

Server for broadcasting RTSP video to Android

I am new to video streaming and am working on a project to broadcast video to Android phones over the internet, where the number of users viewing the video at the same time may reach 100.
After looking around for a while, I think RTSP streaming may be convenient for the phone client (am I right?), so I have to choose a server. My current candidates are:
VLC
Darwin Streaming Server
Are they suitable? Or is there a better choice?
How is the performance of these two servers with 100 users accessing them at the same time?
Thanks in advance
Regards
Bolton
RTSP streaming in H.264/AAC would be the most convenient way to reach Android devices. End users will not need to install an app or open the stream in one; the native media player will open the stream seamlessly.
If you intend to use VLC for the encoding portion, you may want to reconsider, as I'm not sure it supports H.264/AAC compression, which is required to reach Android devices. You may want to consider commercial software like Wirecast, or the free Flash Media Encoder with the AAC plugin.
Darwin Streaming Server is stable enough to handle that load (100 concurrent viewers); however, the amount of throughput you have available and the bitrate you will be broadcasting at are more important factors when delivering video. In other words, your upload speed has to be sufficient. If it's not intended strictly as a DIY project, I would suggest tapping into a commercial CDN's network (I would recommend NetroMedia).
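The throughput requirement is easy to sanity-check up front. A back-of-envelope sketch, where the per-viewer bitrate is an assumed number you should replace with your actual encoding settings:

```shell
# 100 concurrent viewers, assuming ~700 kbps combined audio+video per stream.
VIEWERS=100
BITRATE_KBPS=700
REQUIRED_MBPS=$(( VIEWERS * BITRATE_KBPS / 1000 ))
echo "Sustained upload needed: ~${REQUIRED_MBPS} Mbps"
```

At roughly 70 Mbps a single well-connected server can cope, but this arithmetic shows why a CDN becomes attractive as the audience or bitrate grows.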
