I have spent quite a while (the past week) trying to get this working, to little avail; what I want seems to be completely unheard of. So far I have reviewed the recommendations Google turns up, which include encoding a static file into multiple static files in different formats, and creating an m3u8 playlist of static files that get appended to the playlist as streaming continues.
I have also seen ideas involving RTMP, RTSP, etc., which are completely out of the question because the target devices don't support them.
Ideally, I would have one webpage that links to the stream (http://server/video.mp4) and/or shows it in the page (via the video tag). With that in mind, the most likely format would be H.264+AAC in an MP4 container.
Unfortunately (probably because the file has no duration metadata), it does not work. I can open and play the stream in a desktop player such as VLC, but my iPhone and my Android phone both give their respective "Can't be played" messages.
I don't think the problem is the devices' ability to stream, because I have had a SHOUTcast streaming server work just fine (MP3 only).
Currently, the closest I have come is using the following setup on my Win32 machine:
FFmpeg command: ffmpeg -f dshow -i video="Logitech Webcam 200":audio="Microphone (Webcam 200)" -b:v 180k -bt 240k -vcodec libx264 -tune zerolatency -profile:v baseline -preset ultrafast -r 10 -strict -2 -acodec aac -ac 2 -ar 48000 -ab 32k -f flv "udp://127.0.0.1:1234"
VLC: stream from udp://127.0.0.1:1234 to http://:8080/video.mp4 (no transcoding), basically just to convert the UDP stream into an HTTP-accessible stream.
Any hints or suggestions would be warmly welcomed!
Sorry, I'm trying to understand your question.
It seems you're trying to play MP4 on both Android and iPhone from your server via HTTP, right?
Do you have a streaming server, or are you simply trying to have the phone pull the file from your server?
If you don't have one, I suggest checking out Darwin Streaming Server (http://justdevelopment.blogspot.com/2009/10/video-streaming-with-android-phone.html).
It will allow you to set up your video to stream with the right encodings needed for each device.
Let me know if that helps.
I want to convert a regular video file (MP4 x264) to HLS to be streamed on Android devices using ffmpeg. I have an html5 player setup to play the HLS stream (m3u8) and have tested it on Chrome and Firefox desktop browsers and everything works fine. The issue is that the same exact player and m3u8 playlist does not work on Android Chrome and Android Firefox (both give an unsupported file type error).
At first I thought it may be an issue with the html5 player and HLS support, so I took a random m3u8 playlist stream from a quick google search, and of course that stream worked fine in my html5 player on Android. So I know it is not the html5 player and lack of HLS support. I think that I am using an incorrect codec or other parameter in ffmpeg, but after dozens of attempts I decided to ask here because I'm all out of ideas.
My original ffmpeg command is:
ffmpeg -i "test.mp4" -preset ultrafast -c:v h264 -c:a aac -f ssegment -segment_list playlist.m3u8 -segment_list_type hls -segment_list_size 0 out_%6d.ts
I've tried:
ffmpeg -i "test.mp4" -c:v libx264 -c:a aac -ac 1 -strict -2 -crf 18 -profile:v baseline -maxrate 400k -bufsize 1835k -pix_fmt yuv420p -flags -global_header -hls_time 10 -hls_list_size 6 -hls_wrap 10 -start_number 1 playlist.m3u8
ffmpeg -i "test.mp4" -c:v copy -c:a aac -ac 2 -f hls -hls_time 60 -hls_playlist_type event playlist.m3u8
ffmpeg -i "test.mp4" -c:a aac -ar 48000 -c:v h264 -profile:v main -crf 20 -sc_threshold 0 -hls_playlist_type vod -vf scale=w=640:h=360:force_original_aspect_ratio=decrease -b:v 800k -maxrate 856k -bufsize 1200k -b:a 96k -hls_segment_filename 360p_%03d.ts playlist.m3u8
ffmpeg -i "test.mp4" -c:v libx264 -c:a aac -ac 1 -strict -2 -crf 18 -profile:v baseline -maxrate 400k -bufsize 1835k -pix_fmt yuv420p -flags -global_header -hls_time 10 -hls_list_size 6 -hls_wrap 10 -start_number 1 playlist.m3u8
And I have tried dozens of other combinations of parameters, e.g. changing segment sizes, setting the aspect ratio, and just about any example I could find that seemed relevant. I know I probably just got one parameter wrong, or perhaps it is a header issue?
As I said before, I took a sample HLS stream online and it didn't give any problems. It is only when I'm running it through ffmpeg that it doesn't work on mobile devices. I also downloaded a test mp4 file in case there was something wrong with my file, but it made no difference.
I have been tracking down where the issue lies. I've been able to rule out a codec problem, because I took a sample video file and re-encoded it to MP4 (H.264/AAC), and in the html5 player on Android it streamed correctly as long as it was in MP4 format.
I've narrowed it down to something related to the m3u8 playlist file itself and/or the location of the files. Perhaps the playlist is missing a parameter; I've been comparing it with others found on the internet and haven't noticed a difference so far.
Well, I finally found the answer and of all things it was about the last thing I would have suspected: htaccess. In my effort to debug where the issue was I decided to take a sample HLS playlist found on the internet and put it on my server. I included the m3u8 and ts files accordingly and tested things out. It worked across desktop browsers (Chrome and Firefox), but once again didn't work on mobile browsers (tested on Android versions of Chrome, Firefox, and Opera). The interesting part was that when I used the m3u8 playlist from the online source it worked on mobile, but when hosted on my server (same exact files) it didn't work.
After deleting the .htaccess file, it worked again from my server on mobile! For the life of me I can't figure out why this is the issue. All there is in my .htaccess file is:
AuthType Basic
AuthName "Restrictions"
AuthUserFile /path_to_passwd/.htpasswd
<RequireAny>
Require user UserthatisValid
</RequireAny>
On the mobile device, the user is authenticated properly, because they couldn't access the html5 player without providing credentials. My guess is that something in my web server (Apache) is misconfigured or some module needs to be enabled. But why the desktop browsers worked and mobile did not still doesn't make sense; perhaps it is something to do with headers. (One possibility: mobile browsers hand HLS URLs to the platform's native media stack, which may fetch the playlist and segments in separate requests without forwarding the browser's Basic Auth credentials.) I don't think it is a CORS issue either, because Chrome's remote device logger would have picked that kind of issue up. Or perhaps there is something different in the way the mobile device requests a file vs. accessing a page. Either way, it doesn't make sense to me why the desktop versions work and mobile does not.
Will do more testing to figure out what exactly is causing the problem, but the immediate workaround is to allow direct access to the m3u8 playlist file without authentication in .htaccess, or just remove the file altogether. Not good for production use, but it does work.
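One possible shape for that workaround (a sketch for Apache 2.4, reusing the auth block above; the FilesMatch exemption and extensions are my assumption, not tested by the poster):

```apacheconf
AuthType Basic
AuthName "Restrictions"
AuthUserFile /path_to_passwd/.htpasswd
Require user UserthatisValid

# Let the HLS playlist and segments through without authentication so
# native mobile players (which may not carry the browser's credentials)
# can fetch them. Everything else in the directory stays protected.
<FilesMatch "\.(m3u8|ts)$">
    Require all granted
</FilesMatch>
```

The more specific FilesMatch section overrides the directory-level Require for just those files, which keeps the html5 player page itself behind Basic Auth.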
I'm streaming live from my GoPro inside my android app. I use ffmpeg to receive the streaming data from the GoPro and vlc to play it in a surfaceview. I used the code which is provided by KonradIT here. The main command used for the ffmpeg is:
-fflags nobuffer -f mpegts -i udp://:8554 -f mpegts udp://127.0.0.1:8555/gopro?pkt_size=64
and the options for vlclib are:
options.add("--aout=opensles");
options.add("--audio-time-stretch");
options.add("-vvv");
The output is quite poor: it's laggy and runs at about 17 FPS. And one annoying thing is that the streamed picture is very small, and as far as I tried, there was no way to make it larger or stretch it.
I want to know if there is any command to speed up the streaming (in any way, even by reducing the quality), either on the ffmpeg side or the VLC side.
If it is only relaying packets, try this:
ffmpeg -fflags nobuffer -f mpegts -i udp://:8554 -c:v copy -c:a copy -f mpegts udp://127.0.0.1:8555/gopro?pkt_size=1316
You can play with different packet sizes based on the MTU size of your network (<1500 bytes); 1316 is 7 x 188-byte MPEG-TS packets, the largest multiple that fits a typical 1500-byte MTU. Check the delay.
With this command we are not transcoding or resizing the incoming packets, just relaying them.
I have been playing around with ffmpeg and video encoding, and even though my MP4s work great on desktop (smooth playback, etc.), they are terrible on mobile devices: they stutter and load very slowly, and I am trying to figure out the problem.
As an example I made a page using the media element plugin: http://mediaelementjs.com/ and on it I first placed the video that comes with mediaelementjs and it worked well, it scaled to desktop and mobile and loaded quickly and played without any stutter.
However, when I loaded my own video it was slow and full of stutter, but only on mobile. So I thought it might be S3 (where it is hosted), but I saved the file locally and got the same thing.
I am hoping someone who knows h.264 and/or ffmpeg can point me in the direction of why; here is the current command I am running on ffmpeg:
ffmpeg -i $input_file_name -vcodec libx264 -r 100 -bt 300k -ac 2 -ar 48000 -ab 192k -strict -2 -y $output_temp_file 2>&1
So what have I missed?
So what have I missed?
Mobile devices have very limited computing power. You are trying to play a 100 fps video file; there isn't a mobile device I know of that can handle such a framerate.
First, change the framerate to a reasonable value, then adjust the resolution, set an encoding profile (baseline, for example) and a video bitrate (quality/rate factor). After that you can try out your files.
I'm trying to encode a local/static input file (an MP4, for example) into a smaller video file (by resizing, lowering the video quality, etc.) and stream it in parallel (i.e. I can't wait for the encoding process to finish before streaming it back), so it can be played by an Android client (the standard Android video player).
So I've tried using ffmpeg as follows:
ffmpeg -re -i input.mp4 -g 52 -acodec libvo_aacenc -ab 64k -vcodec libx264 -vb 448k -f mp4 -movflags frag_keyframe+empty_moov -
Notice I'm using stdout as the output so I can run ffmpeg and stream its output on the fly.
However, such methods (and other similar ones) don't seem to work on Android: it simply can't play "non-standard" files (such as fragmented MP4s) once it receives them. It seems like the empty moov atom messes it up.
I also tried other container formats, such as 3GPP and WebM.
I'd love to hear any kind of input on this issue...
Thanks
You can specify multiple outputs in ffmpeg; see http://trac.ffmpeg.org/wiki/Creating%20multiple%20outputs
For Android newer than 3.0, try HLS as an output.
I am creating an app to send audio to an Icecast server. I have connected to the server and am sending the audio data. The stream shows up on the server, but it does not play in the browser. Why?
Do I need to encode that data to some format?
If you're having problems with a web-based player you need to be a lot more specific about what you're trying to do and provide some code for us to look at.
You should also confirm the stream is playing correctly in a compatible desktop app first.
Go to http://example.com:8000/admin and verify the mount point you're sending to is listed (which indicates it has a connected source) - if it is there will be an M3U (for streaming MP3) and XSPF (for Ogg) link on the right which you can use to test.
Playback of Icecast streams on Android requires installation of the Just Playlists app or an equivalent.
The problem was that I was sending raw PCM audio data to the server. I now encode that PCM data to Ogg before sending it, and it works perfectly.
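For reference, raw PCM can be encoded to Ogg/Vorbis with ffmpeg on the way to Icecast; in this sketch the sample format, sample rate, credentials, host, and mount point are all assumptions, not values from the question:

```shell
# Interpret the input as signed 16-bit little-endian stereo PCM at 44.1 kHz,
# encode it to Ogg/Vorbis, and push it to an Icecast mount point.
ffmpeg -f s16le -ar 44100 -ac 2 -i raw_audio.pcm \
       -c:a libvorbis -b:a 128k \
       -content_type application/ogg \
       -f ogg icecast://source:hackme@example.com:8000/stream.ogg
```

The raw-format flags (-f s16le -ar -ac) must match how the app captures audio, since raw PCM carries no header describing itself.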
The problem is with the audio encoding. Set up the encoding as follows.
Capture device inputs
ffmpeg -list_devices true -f dshow -i dummy
Set the input from a device or another protocol.
Example:
ffplay -f dshow -i audio="Microphone (FaceCam 1000X)"
Then stream:
ffmpeg -stats -report -f dshow -i audio="Microphone (FaceCam 1000X)" -c:a flac -compression_level 10 -ar 192000 -legacy_icecast 1 -content_type application/ogg -ice_name "Optional Name" -f ogg icecast://source:password_for_streaming@127.0.0.1:8000/live.ogg