Transcoding and streaming a video file for Android

I'm trying to encode a local/static input file (an MP4, for example) into a smaller video file (by resizing, lowering the quality, etc.) and stream it in parallel (i.e. I can't wait for the encoding process to finish before streaming it back), so it can be played by an Android client (the standard Android video player).
So I've tried using ffmpeg as follows:
ffmpeg -re -i input.mp4 -g 52 -acodec libvo_aacenc -ab 64k -vcodec libx264 -vb 448k -f mp4 -movflags frag_keyframe+empty_moov -
Notice I'm using stdout as the output, so I can run ffmpeg and stream its output on the fly.
However, such methods (and other similar ones) don't seem to work on Android - it simply can't play "non-standard" files such as fragmented MP4s once it receives them; the empty moov atom seems to be what trips it up.
I also tried other container formats, such as 3GPP and WebM.
I'd love to hear any kind of input on this issue...
Thanks

You can specify multiple outputs in ffmpeg; see http://trac.ffmpeg.org/wiki/Creating%20multiple%20outputs
For Android newer than 3.0, try HLS as an output.
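For example, a minimal HLS sketch (assuming a reasonably recent ffmpeg with the hls muxer; filenames, bitrates and segment length are placeholders to adjust, and older builds may need -strict -2 for the aac encoder):
ffmpeg -i input.mp4 -c:v libx264 -profile:v baseline -c:a aac -b:a 64k -f hls -hls_time 10 -hls_list_size 0 -hls_segment_filename seg_%03d.ts playlist.m3u8
The Android client then plays playlist.m3u8 over plain HTTP instead of reading a fragmented MP4 from the pipe.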

What parameters are necessary in FFMPEG to stream (HLS) to modern Android devices?

I want to convert a regular video file (MP4, x264) to HLS to be streamed on Android devices using ffmpeg. I have an html5 player set up to play the HLS stream (m3u8) and have tested it on Chrome and Firefox desktop browsers, and everything works fine. The issue is that the exact same player and m3u8 playlist do not work on Android Chrome and Android Firefox (both give an unsupported file type error).
At first I thought it might be an issue with the html5 player and HLS support, so I took a random m3u8 playlist stream from a quick Google search, and of course that stream worked fine in my html5 player on Android. So I know it is not the html5 player or a lack of HLS support. I think I am using an incorrect codec or other parameter in ffmpeg, but after dozens of attempts I decided to ask here because I'm all out of ideas.
My original ffmpeg command is:
ffmpeg -i "test.mp4" -preset ultrafast -c:v h264 -c:a aac -f ssegment -segment_list playlist.m3u8 -segment_list_type hls -segment_list_size 0 out_%6d.ts
I've tried:
ffmpeg -i "test.mp4" -c:v libx264 -c:a aac -ac 1 -strict -2 -crf 18 -profile:v baseline -maxrate 400k -bufsize 1835k -pix_fmt yuv420p -flags -global_header -hls_time 10 -hls_list_size 6 -hls_wrap 10 -start_number 1 playlist.m3u8
ffmpeg -i "test.mp4" -c:v copy -c:a aac -ac 2 -f hls -hls_time 60 -hls_playlist_type event playlist.m3u8
ffmpeg -i "test.mp4" -c:a aac -ar 48000 -c:v h264 -profile:v main -crf 20 -sc_threshold 0 -hls_playlist_type vod -vf scale=w=640:h=360:force_original_aspect_ratio=decrease -b:v 800k -maxrate 856k -bufsize 1200k -b:a 96k -hls_segment_filename 360p_%03d.ts playlist.m3u8
ffmpeg -i "test.mp4" -c:v libx264 -c:a aac -ac 1 -strict -2 -crf 18 -profile:v baseline -maxrate 400k -bufsize 1835k -pix_fmt yuv420p -flags -global_header -hls_time 10 -hls_list_size 6 -hls_wrap 10 -start_number 1 playlist.m3u8
And I have tried dozens of other combinations of parameters, e.g. changing segment sizes, setting the aspect ratio, and just about any example I could find that seemed relevant. I know I probably just got one parameter wrong, or perhaps it is a header issue?
As I said before, I took a sample HLS stream online and it didn't give any problems. It is only when I'm running it through ffmpeg that it doesn't work on mobile devices. I also downloaded a test mp4 file in case there was something wrong with my file, but it made no difference.
I have been tracking down where the issue lies. I've been able to rule out a codec problem, because I took a sample video file and re-encoded it to MP4 (h264/aac), and in the html5 player on Android it streamed correctly as long as it was in MP4 format.
I've been able to narrow it down to something related to the m3u8 playlist file itself and/or the location of the files. Perhaps the m3u8 playlist file is missing a parameter. I've tried comparing it with others found on the internet and haven't noticed a difference so far.
Well, I finally found the answer, and of all things it was about the last thing I would have suspected: .htaccess. In my effort to debug where the issue was, I decided to take a sample HLS playlist found on the internet and put it on my server. I included the m3u8 and ts files accordingly and tested things out. It worked across desktop browsers (Chrome and Firefox), but once again didn't work on mobile browsers (tested on the Android versions of Chrome, Firefox, and Opera). The interesting part was that when I used the m3u8 playlist from the online source it worked on mobile, but when hosted on my server (the exact same files) it didn't work.
After deleting the .htaccess file, it worked again from my server on mobile! For the life of me I can't figure out why this is the issue. All there is in my .htaccess file is:
AuthType Basic
AuthName "Restrictions"
AuthUserFile /path_to_passwd/.htpasswd
<RequireAny>
Require user UserthatisValid
</RequireAny>
On the mobile device, the user is authenticated properly because they couldn't access the html5 player without providing credentials. My guess is that something in my webserver (Apache) is misconfigured or some module needs to be enabled. But why the desktop versions worked and mobile did not still doesn't make sense. Perhaps something to do with headers. I don't think it is a CORS issue either, because Chrome's Device Remote Logger would have picked that kind of issue up. Or perhaps there is something different in the way the mobile device requests a file vs. accessing a page. Either way it doesn't make sense to me why desktop versions work and mobile do not.
I will do more testing to figure out what exactly is causing the problem, but the immediate workaround is to allow direct access to the m3u8 playlist file without authentication in .htaccess, or just to remove the file altogether. Not good for production use, but it does work.
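A rough sketch of that workaround, assuming Apache 2.4 (which the <RequireAny> syntax above suggests) and that exposing the stream files without credentials is acceptable: keep the Basic auth block, but exempt the playlist and segments in the same .htaccess:
<FilesMatch "\.(m3u8|ts)$">
    Require all granted
</FilesMatch>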

How to speed up ffmpeg streaming on android using libvlc?

I'm streaming live from my GoPro inside my Android app. I use ffmpeg to receive the streaming data from the GoPro and vlc to play it in a SurfaceView. I used the code provided by KonradIT here. The main ffmpeg command is:
-fflags nobuffer -f mpegts -i udp://:8554 -f mpegts udp://127.0.0.1:8555/gopro?pkt_size=64
and the options for vlclib are:
options.add("--aout=opensles");
options.add("--audio-time-stretch");
options.add("-vvv");
The output is poor: it's laggy and runs at about 17 FPS. Another annoying thing is that the streamed picture is very small, and as far as I've tried there is no way to make it larger or stretched.
I want to know if there is any command to speed up the streaming (in any way, even by reducing the quality), either on the ffmpeg side or the VLC side.
If it is only relaying packets, try this:
ffmpeg -fflags nobuffer -f mpegts -i udp://:8554 -c:v copy -c:a copy -f mpegts udp://127.0.0.1:8555/gopro?pkt_size=1316
You can play with different packet sizes based on the MTU size of your network (<1500). Check the delay.
With this command we are not transcoding or resizing the incoming packets, only relaying them.
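If the lag is partly on the player side, it may also help to shrink libVLC's buffering along the lines of the options already passed above. A rough sketch (the millisecond value is a guess to tune; set it too low and playback will stutter):
options.add("--network-caching=150"); // input/network cache in milliseconds
options.add("--drop-late-frames");    // drop frames that arrive too late instead of displaying them
options.add("--skip-frames");         // allow the decoder to skip frames when it falls behind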

How to compress video in android faster?

I have recently used the ffmpeg library for Android to compress a video about 10 seconds long and nearly 25 MB in size. These are the commands I tried:
ffmpeg -i /video.mp4 -vcodec h264 -b:v 1000k -acodec mp2 /output.mp4
OR
ffmpeg -i input.mp4 -vcodec h264 -crf 20 output.mp4
Both of the commands were too slow. I cancelled the task before it completed because it was taking too much time; it took more than 8 minutes to process just 20% of the video. Time is really critical for me, so I can't opt for ffmpeg as it stands. I have the following questions:
Is there something wrong with the commands, or is ffmpeg just slow?
If it's slow, is there any other well-documented and reliable way/library for video compression that I can use on Android?
Your file is in an MP4 container and its streams are already compressed with some codec.
The size of any container (not specifically MP4) depends on what kind of compression (loosely, which codec) is used for the data. This is why you will see different sizes for the same content in different formats.
There are other parameters which affect the size of the file, e.g. frame rate, resolution and audio bitrate. If you reduce them, the file gets smaller; on YouTube, for example, you can choose to play a video at a lower resolution when bandwidth is an issue.
However, if you choose to do this you will have to re-process the entire file, and that takes a lot of time: you are demuxing the container, decoding the streams, applying filters (reducing the frame rate etc.), re-encoding, and then remuxing. This whole process is not worth a few extra MB of savings unless you have a compelling use case.
One solution is to use a more powerful machine, but that is limited by the architecture/constraints of the application. To answer specifically for ffmpeg: it won't make much difference.
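If you do decide to re-encode and speed matters more than compression efficiency, the parameters above translate into something like this (a rough sketch, not a tuned recommendation; the preset, CRF value and target height are guesses to adjust, and older builds may need -strict -2 for the aac encoder):
ffmpeg -i input.mp4 -vf scale=-2:720 -c:v libx264 -preset ultrafast -crf 28 -c:a aac -b:a 96k output.mp4
A faster preset trades file size for encoding speed, and scaling down to 720p reduces the number of pixels to encode per frame.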

Android FFmpeg convert on the fly

I am using the ffmpeg library in my project with a custom VideoPlayer. I can convert video from .avi to .mp4, but the conversion takes a long time. Can I convert video on the fly using ffmpeg?
EDIT
So I want to play videos in formats that the standard MediaPlayer doesn't support. I've built ffmpeg for my project, so now I have an ffmpeg.so file and can convert video through ffmpeg, for example:
ffmpeg -i input.avi -vcodec mpeg4 -acodec aac -strict -2 output.mp4
But it can take a long time if the video is big. How can I implement support for different formats using ffmpeg?
Yes, you can convert video on the fly using ffmpeg (it would actually be called transcoding).
Here's an older guide written for Linux. All of it should still work on Android.
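As a rough illustration of the idea (not taken from that guide; ffplay here just stands in for whatever will consume the stream, e.g. a small HTTP server feeding your player), ffmpeg can write a streamable container such as MPEG-TS to stdout while the consumer reads it as it is produced:
ffmpeg -i input.avi -c:v libx264 -preset ultrafast -c:a aac -f mpegts - | ffplay -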

Truly live streaming to Android/iPhone

I have spent quite a while (the past week) trying this, to little avail. However, what I want seems completely unheard of. So far I have reviewed the recommendations available through Google, which include encoding a static file into multiple static files in different formats, and creating a playlist that hosts static files in an m3u8 file (files which get added to the playlist as streaming continues).
I have also seen ideas involving RTMP, RTSP, etc., which are completely out of the question because of their incompatibility.
Ideally, I would have one webpage that would link to the stream (http://server/video.mp4) and/or show it in a webpage (via the video tag). With that in mind, the most likely format would be h264+aac in mp4 container.
Unfortunately, (and probably because the file has no duration metadata) it does not work. I can use a desktop player (such as VLC) to open the stream and play it, but my iPhone and Android both give their respective "Can't be played" messages.
I don't think the problem is caused by the devices' ability to stream, for I have made a streaming shoutcast server work just fine (mp3 only).
Currently, the closest I have become is using the following setup on my win32 machine:
FFmpeg command: ffmpeg -f dshow -i video="Logitech Webcam 200":audio="Microphone (Webcam 200)" -b:v 180k -bt 240k -vcodec libx264 -tune zerolatency -profile:v baseline -preset ultrafast -r 10 -strict -2 -acodec aac -ac 2 -ar 48000 -ab 32k -f flv "udp://127.0.0.1:1234"
VLC: stream from udp://127.0.0.1:1234 to http:// :8080/video.mp4 (no transcoding), basically just to convert the UDP stream into an HTTP-accessible stream.
Any hints or suggestions would be warmly welcomed!
Sorry, I'm trying to understand your question.
It seems you're trying to play MP4 on both Android and iPhone from your server via HTTP, right?
Do you have a streaming server, or are you simply trying to have the phone pull the file from your server?
If you don't have one, I suggest checking out Darwin Streaming Server (http://justdevelopment.blogspot.com/2009/10/video-streaming-with-android-phone.html).
It will allow you to set up your video to stream with the right encodings needed for each device.
Let me know if that helps
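If a dedicated streaming server is not an option, another route already mentioned earlier on this page is HLS, which iOS and Android 3.0+ can play natively. A rough sketch reusing the dshow capture from the question (device names, bitrates and segment length are placeholders, and -hls_flags delete_segments needs a reasonably recent ffmpeg):
ffmpeg -f dshow -i video="Logitech Webcam 200":audio="Microphone (Webcam 200)" -c:v libx264 -preset ultrafast -tune zerolatency -profile:v baseline -c:a aac -b:a 64k -f hls -hls_time 4 -hls_list_size 6 -hls_flags delete_segments live.m3u8
The web page then points the player at live.m3u8 instead of the raw MP4 URL.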
