How to speed up ffmpeg streaming on Android using libVLC?

I'm streaming live from my GoPro inside my Android app. I use ffmpeg to receive the stream from the GoPro and libVLC to play it in a SurfaceView. I used the code provided by KonradIT here. The main command used for ffmpeg is:
-fflags nobuffer -f mpegts -i udp://:8554 -f mpegts udp://127.0.0.1:8555/gopro?pkt_size=64
and the options for vlclib are:
options.add("--aout=opensles");
options.add("--audio-time-stretch");
options.add("-vvv");
The output is quite poor: it's laggy and runs at only about 17 FPS. Another annoying thing is that the streamed picture is very small, and as far as I tried there was no way to make it larger or stretch it.
Is there any way to speed up the streaming (even by reducing the quality), either on the ffmpeg side or the VLC side?

If it is only relaying packets, try this:
ffmpeg -fflags nobuffer -f mpegts -i udp://:8554 -c:v copy -c:a copy -f mpegts udp://127.0.0.1:8555/gopro?pkt_size=1316
You can experiment with different packet sizes based on the MTU of your network (< 1500 bytes), then check the delay.
With this command we are not transcoding the incoming packets; we just repacketize and relay them.
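As a sketch of the relay setup (the assumption that the GoPro pushes MPEG-TS to port 8554 comes from the question; the packet-size arithmetic is the reasoning behind the 1316 value):

```shell
# MPEG-TS packets are 188 bytes; 7 of them (1316 bytes) is the largest
# whole number of TS packets that fits under a typical 1500-byte MTU.
TS_PACKET=188
PACKETS_PER_DATAGRAM=7
PKT_SIZE=$((TS_PACKET * PACKETS_PER_DATAGRAM))

# Relay without re-encoding (requires ffmpeg on PATH):
RELAY_CMD="ffmpeg -fflags nobuffer -f mpegts -i udp://:8554 \
  -c:v copy -c:a copy -f mpegts udp://127.0.0.1:8555/gopro?pkt_size=${PKT_SIZE}"
# eval "$RELAY_CMD"
echo "$PKT_SIZE"
```

On the player side, shrinking libVLC's network cache (e.g. `options.add("--network-caching=150");`, a value in milliseconds) reportedly reduces the visible lag; the default cache is much larger.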

Related

What parameters are necessary in FFMPEG to stream (HLS) to modern Android devices?

I want to convert a regular video file (MP4 x264) to HLS to be streamed on Android devices using ffmpeg. I have an html5 player setup to play the HLS stream (m3u8) and have tested it on Chrome and Firefox desktop browsers and everything works fine. The issue is that the same exact player and m3u8 playlist does not work on Android Chrome and Android Firefox (both give an unsupported file type error).
At first I thought it may be an issue with the html5 player and HLS support, so I took a random m3u8 playlist stream from a quick google search, and of course that stream worked fine in my html5 player on Android. So I know it is not the html5 player and lack of HLS support. I think that I am using an incorrect codec or other parameter in ffmpeg, but after dozens of attempts I decided to ask here because I'm all out of ideas.
My original ffmpeg command is:
ffmpeg -i "test.mp4" -preset ultrafast -c:v h264 -c:a aac -f ssegment -segment_list playlist.m3u8 -segment_list_type hls -segment_list_size 0 out_%6d.ts
I've tried:
ffmpeg -i "test.mp4" -c:v libx264 -c:a aac -ac 1 -strict -2 -crf 18 -profile:v baseline -maxrate 400k -bufsize 1835k -pix_fmt yuv420p -flags -global_header -hls_time 10 -hls_list_size 6 -hls_wrap 10 -start_number 1 playlist.m3u8
ffmpeg -i "test.mp4" -c:v copy -c:a aac -ac 2 -f hls -hls_time 60 -hls_playlist_type event playlist.m3u8
ffmpeg -i "test.mp4" -c:a aac -ar 48000 -c:v h264 -profile:v main -crf 20 -sc_threshold 0 -hls_playlist_type vod -vf scale=w=640:h=360:force_original_aspect_ratio=decrease -b:v 800k -maxrate 856k -bufsize 1200k -b:a 96k -hls_segment_filename 360p_%03d.ts playlist.m3u8
ffmpeg -i "test.mp4" -c:v libx264 -c:a aac -ac 1 -strict -2 -crf 18 -profile:v baseline -maxrate 400k -bufsize 1835k -pix_fmt yuv420p -flags -global_header -hls_time 10 -hls_list_size 6 -hls_wrap 10 -start_number 1 playlist.m3u8
And I have tried dozens of other combinations, e.g. changing segment file sizes, setting the aspect ratio, and just about any example I could find that seemed relevant. I probably just got one parameter wrong, or perhaps it is a header issue?
As I said before, I took a sample HLS stream online and it didn't give any problems. It is only when I'm running it through ffmpeg that it doesn't work on mobile devices. I also downloaded a test mp4 file in case there was something wrong with my file, but it made no difference.
I have been tracking down where the issue lies. I've been able to narrow out it being a codec problem, because I took a sample video file and re-encoded it to mp4 (h264 aac). In the html5 player on Android it streamed correctly so long as it was in mp4 format.
I've been able to narrow it down to something related to the m3u8 playlist file itself and/or location of files. Perhaps the m3u8 playlist file is missing a parameter. Trying to compare it with others found on the internet and haven't really noticed a difference so far.
Well, I finally found the answer and of all things it was about the last thing I would have suspected: htaccess. In my effort to debug where the issue was I decided to take a sample HLS playlist found on the internet and put it on my server. I included the m3u8 and ts files accordingly and tested things out. It worked across desktop browsers (Chrome and Firefox), but once again didn't work on mobile browsers (tested on Android versions of Chrome, Firefox, and Opera). The interesting part was that when I used the m3u8 playlist from the online source it worked on mobile, but when hosted on my server (same exact files) it didn't work.
After deleting the .htaccess file, it worked again from my server on mobile! For the life of me I can't figure out why this is the issue. All there is in my .htaccess file is:
AuthType Basic
AuthName "Restrictions"
AuthUserFile /path_to_passwd/.htpasswd
<RequireAny>
Require user UserthatisValid
</RequireAny>
On the mobile device, the user is authenticated properly because they couldn't access the html5 player without providing credentials. My guess is that something in my webserver (Apache) is misconfigured or some module needs to be enabled. But why the desktop versions worked and mobile did not still doesn't make sense. Perhaps something to do with headers. I don't think it is a CORS issue either, because Chrome's Device Remote Logger would have picked that kind of issue up. Or perhaps there is something different in the way the mobile device requests a file vs. accessing a page. Either way it doesn't make sense to me why desktop versions work and mobile do not.
Will do more testing to figure out what exactly is causing the problem, but the immediate workaround is to allow direct access to the m3u8 playlist file without authentication in .htaccess or just removing it altogether. Not good for production use, but does work successfully.
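One way to keep the password on the pages while exempting only the HLS assets is a FilesMatch override in the same .htaccess. This is an untested sketch using the Apache 2.4 `Require` syntax already present in the file above:

```apache
AuthType Basic
AuthName "Restrictions"
AuthUserFile /path_to_passwd/.htpasswd
<RequireAny>
    Require user UserthatisValid
</RequireAny>

# Let the player fetch playlists and segments without credentials.
<FilesMatch "\.(m3u8|ts)$">
    Require all granted
</FilesMatch>
```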

FFmpeg - Add frames to MP4 in multiple calls

I'm trying to use FFmpeg to create an MP4 file from user-created content in an Android app.
The user-created content is rendered with OpenGL. I was thinking I can render it to a FrameBuffer and save it as a temporary image file. Then, take this image file and add it to an MP4 file with FFmpeg. I'd do this frame by frame, creating and deleting these temporary image files as I go while I build the MP4 file.
The issue is I will never have all of these image files at one time, so I can't just use the typical call:
ffmpeg -start_number n -i test_%d.jpg -vcodec mpeg4 test.mp4
Is this possible with FFmpeg? I can't find any information about adding frames to an MP4 file one-by-one and keeping the correct framerate, etc...
Use stdin to pipe the raw frames to FFmpeg. Note that this doesn't mean exporting entire images; all you need is the pixel data. Something like this:
ffmpeg -f rawvideo -vcodec rawvideo -s 1920x1080 -pix_fmt rgb24 -r 30 -i - -vcodec mpeg4 test.mp4
Using -i - means FFmpeg will read from the pipe.
I think from there you would just send in the raw pixel values via the pipe, one byte per color per pixel. FFmpeg will know when you're done with each frame since you've passed the frame size to it.
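A minimal command-line sketch of that pipe (the 1920x1080 size and 30 fps are placeholders; `/dev/zero` just stands in for the app writing its rendered pixel buffer to the pipe):

```shell
WIDTH=1920
HEIGHT=1080
# rgb24 = one byte per colour channel per pixel.
FRAME_BYTES=$((WIDTH * HEIGHT * 3))

# Stand-in producer: 30 all-black frames from /dev/zero.
# In the real app, write each rendered frame's pixel buffer instead.
# (requires ffmpeg on PATH)
# head -c $((FRAME_BYTES * 30)) /dev/zero | \
#   ffmpeg -f rawvideo -pix_fmt rgb24 -s "${WIDTH}x${HEIGHT}" -r 30 \
#          -i - -c:v mpeg4 test.mp4
echo "$FRAME_BYTES"
```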

Transcoding and Streaming a video file for Android

I'm trying to encode a local/static input file (an MP4, for example) into a smaller video file (by resizing, lowering the video quality, etc.) and stream it in parallel (i.e. I can't wait for the encoding process to finish before streaming it back), so it can be played by an Android client (the standard Android video player).
So I've tried using ffmpeg as follows:
ffmpeg -re -i input.mp4 -g 52 -acodec libvo_aacenc -ab 64k -vcodec libx264 -vb 448k -f mp4 -movflags frag_keyframe+empty_moov -
Notice I'm using stdout as the output so I can run ffmpeg and stream its output on the fly.
However, such methods (and other similar ones) don't seem to work on Android: it simply can't play "non-standard" files such as fragmented MP4s; the empty moov atom seems to trip it up.
I also tried other container formats, such as 3GPP and WebM.
I'd love to hear any kind of input on this issue...
Thanks
You can specify multiple outputs in ffmpeg; see http://trac.ffmpeg.org/wiki/Creating%20multiple%20outputs
For Android newer than 3.0, try HLS as the output format.
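A rough sketch of what an HLS output could look like for this case (the bitrates, segment length, and output path are illustrative, not from the answer):

```shell
# ffmpeg appends segments to the playlist as it encodes them,
# so playback can start before the transcode finishes.
HLS_CMD="ffmpeg -re -i input.mp4 \
  -c:v libx264 -preset veryfast -b:v 448k \
  -c:a aac -b:a 64k \
  -f hls -hls_time 4 -hls_list_size 0 \
  /var/www/stream/playlist.m3u8"
# eval "$HLS_CMD"   # requires ffmpeg on PATH
echo "$HLS_CMD"
```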

Truly live streaming to Android/iPhone

I have spent quite a while (the past week) trying this, to little avail. However, what I want seems completely unheard of. So far I have reviewed the recommendations found through Google, which include encoding a static file into multiple static files in different formats, and creating an m3u8 playlist that hosts static files (files which get added to the playlist as streaming continues).
I have also seen ideas involving rtmp, rtsp etc which are completely out of the question because of their incompatibility.
Ideally, I would have one webpage that would link to the stream (http://server/video.mp4) and/or show it in a webpage (via the video tag). With that in mind, the most likely format would be h264+aac in mp4 container.
Unfortunately, (and probably because the file has no duration metadata) it does not work. I can use a desktop player (such as VLC) to open the stream and play it, but my iPhone and Android both give their respective "Can't be played" messages.
I don't think the problem is caused by the devices' ability to stream, for I have made a streaming shoutcast server work just fine (mp3 only).
Currently, the closest I have become is using the following setup on my win32 machine:
FFMPEG Command:: ffmpeg -f dshow -i video="Logitech Webcam 200":audio="Microphone (Webcam 200)" -b:v 180k -bt 240k -vcodec libx264 -tune zerolatency -profile:v baseline -preset ultrafast -r 10 -strict -2 -acodec aac -ac 2 -ar 48000 -ab 32k -f flv "udp://127.0.0.1:1234"
VLC:: Stream from udp://127.0.0.1:1234 to http:// :8080/video.mp4 (No Transcoding), basically just to convert the UDP stream into an http-accessible stream.
Any hints or suggestions would be warmly welcomed!
Sorry, I'm trying to understand your question.
It seems you're trying to play MP4 on both Android and iPhone from your server via HTTP, right?
Do you have a streaming server, or are you simply trying to have the phone pull the file from your server?
If you don't have one, I suggest checking out darwin streaming server (http://justdevelopment.blogspot.com/2009/10/video-streaming-with-android-phone.html).
It will allow you to set up your video to stream with the right encodings needed for each device.
Let me know if that helps

Video playing in Android tablet

I want to play a large video using HTTP on an Android tablet.
I don't want to save that video on the device. That is, if the large video data received from the web service arrives in little chunks, I don't want to save that binary data and play the video later. I want to play the video as it downloads.
Is this possible?
Simply encode the video using FFmpeg and then use the qt-faststart tool to enable streaming. I use a command like this to encode videos for Android phones:
$ ffmpeg -i infile.mp4 -s 480x320 -threads 4 -vcodec libx264 -flags +loop+mv4 -cmp 256 -partitions +parti4x4+parti8x8+partp4x4+partp8x8+partb8x8 -subq 5 -trellis 1 -refs 5 -bf 0 -flags2 +mixed_refs -coder 0 -me_range 16 -g 250 -keyint_min 25 -sc_threshold 40 -i_qfactor 0.71 -qmin 15 -qmax 20 -qdiff 5 -b 700k -r 600 -acodec libfaac -ar 48000 -ab 48000 -pass 1 outfile.mp4
This produces fairly low quality for tablets, so experiment with the resolution, bitrate, etc.
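For reference, a sketch of the faststart step itself. Newer ffmpeg builds can relocate the moov atom in one pass with `-movflags +faststart`, making the separate qt-faststart step optional; the file names are placeholders:

```shell
# Two-step, as described above:
# ffmpeg -i infile.mp4 ... tmp.mp4 && qt-faststart tmp.mp4 outfile.mp4

# One-step alternative on newer ffmpeg builds:
FASTSTART_CMD="ffmpeg -i infile.mp4 -s 480x320 -c:v libx264 -c:a aac \
  -movflags +faststart outfile.mp4"
# eval "$FASTSTART_CMD"   # requires ffmpeg on PATH
echo "$FASTSTART_CMD"
```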
Yes, it is called buffering.
You can read the tutorial Media Playback. Basically you will need the MediaPlayer class, and then set the URL of your video as the data source, and MediaPlayer would do all the complicated stuff.
