I have the following stream: mms://77.238.11.5:8080. You can access it using Windows Media Player.
I can't find any way to view it on Android devices using MediaPlayer or VideoView, so my idea is to convert it using VLC or FFmpeg to a different format such as MP4.
You can use ffmpeg to convert the stream to a file:
ffmpeg -i "mmsh://77.238.11.5:8080/" output.mp4
The -t duration option limits the length of the output, if needed.
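For example, to capture just the first minute of the stream (the duration and output name here are only placeholders):
ffmpeg -i "mmsh://77.238.11.5:8080/" -t 00:01:00 output.mp4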
They say that VPlayer can play such streams.
I am using the ffmpeg library in my project with a custom VideoPlayer. I can convert video from .avi to .mp4, but the conversion takes a long time. Can I convert video on the fly using ffmpeg?
EDIT
I want to play videos in formats that the standard MediaPlayer doesn't support. I've built ffmpeg for my project, so I now have an ffmpeg.so file and can convert video through ffmpeg, for example:
ffmpeg -i input.avi -vcodec mpeg4 -acodec aac -strict -2 output.mp4
But this can take a long time if the video is large. How can I add support for different formats using ffmpeg?
Yes, you can convert video on the fly using ffmpeg (it would actually be called transcoding).
Here's an older guide written for Linux. All of it should still work on Android.
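As a rough sketch of the idea, you could write to a container that can be consumed progressively (MPEG-TS here) and send it to stdout with pipe:1, so your player can start reading while ffmpeg is still transcoding (the input name and codecs are only placeholders):
ffmpeg -i input.avi -vcodec mpeg4 -acodec aac -strict -2 -f mpegts pipe:1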
I'm trying to encode a local/static input file (an MP4, for example) into a smaller video file (either by resizing, lowering the video quality, etc.) and stream it in parallel (i.e. I can't wait for the encoding process to finish before streaming it back), so it can be played by an Android client (the standard Android video player).
So I've tried using ffmpeg as follows:
ffmpeg -re -i input.mp4 -g 52 -acodec libvo_aacenc -ab 64k -vcodec libx264 -vb 448k -f mp4 -movflags frag_keyframe+empty_moov -
Notice I'm using stdout as the output so I can run ffmpeg and stream its output on the fly.
However, such methods (and other similar ones) don't seem to work on Android - it simply can't play "non-standard" files (such as fragmented MP4s) once it receives them - it seems like the empty moov atom messes it up.
I also tried other container formats, such as 3GPP and WebM.
I'd love to hear any kind of input on this issue...
Thanks
You can specify multiple outputs in ffmpeg; see http://trac.ffmpeg.org/wiki/Creating%20multiple%20outputs
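For example, a sketch that produces two differently sized outputs in a single run (filenames, sizes and codecs are only placeholders; each output takes its own set of options):
ffmpeg -i input.mp4 -vcodec libx264 -acodec aac -strict -2 -s 1280x720 out_720.mp4 -vcodec libx264 -acodec aac -strict -2 -s 640x360 out_360.mp4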
For Android newer than 3.0, try HLS as an output.
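A minimal HLS sketch might look like this (segment length, list size and filenames are placeholders; the hls muxer requires a reasonably recent ffmpeg build):
ffmpeg -i input.mp4 -vcodec libx264 -acodec aac -strict -2 -f hls -hls_time 10 -hls_list_size 0 playlist.m3u8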
I have spent quite a while (the past week) trying this, to little avail. However, what I want seems completely unheard of. So far, I have reviewed the recommendations available through Google, which include encoding a static file into multiple static files in different formats, and creating a playlist that hosts static files in an m3u8 file (files which get added to the playlist as streaming continues).
I have also seen ideas involving RTMP, RTSP, etc., which are completely out of the question because of their incompatibility.
Ideally, I would have one webpage that would link to the stream (http://server/video.mp4) and/or show it in a webpage (via the video tag). With that in mind, the most likely format would be H.264+AAC in an MP4 container.
Unfortunately (probably because the file has no duration metadata), it does not work. I can use a desktop player (such as VLC) to open the stream and play it, but my iPhone and Android both give their respective "Can't be played" messages.
I don't think the problem is the devices' ability to stream, for I have made a streaming Shoutcast server work just fine (MP3 only).
Currently, the closest I have come is with the following setup on my Win32 machine:
FFmpeg command: ffmpeg -f dshow -i video="Logitech Webcam 200":audio="Microphone (Webcam 200)" -b:v 180k -bt 240k -vcodec libx264 -tune zerolatency -profile:v baseline -preset ultrafast -r 10 -strict -2 -acodec aac -ac 2 -ar 48000 -ab 32k -f flv "udp://127.0.0.1:1234"
VLC: stream from udp://127.0.0.1:1234 to http:// :8080/video.mp4 (no transcoding), basically just to convert the UDP stream into an HTTP-accessible stream.
Any hints or suggestions would be warmly welcomed!
Sorry, I'm trying to understand your question.
It seems you're trying to play MP4 on both Android and iPhone from your server via HTTP, right?
Do you have a streaming server? Or are you simply trying to have the phone pull the file from your server?
If you don't have one, I suggest checking out Darwin Streaming Server (http://justdevelopment.blogspot.com/2009/10/video-streaming-with-android-phone.html).
It will allow you to set up your video to stream with the right encodings needed for each device.
Let me know if that helps
I have saved the MJPEG stream to the SD card as xxx.mjpeg. However, the MJPEG video file is not supported on Android. So how can I encode the MJPEG video into 3GP or MP4 format, store it on the SD card, and then play the 3GP or MP4 video back on my Android phone? Thanks in advance.
I am not aware of the current state of MoboPlayer's ffmpeg source out there; I built it long ago.
I tried RockPlayer's ffmpeg port, which builds hassle-free.
I was able to build it successfully today with NDK r4b.
You can download the source from here: http://www.rockplayer.com/tech_en.html
Modify config.mk to change your tool paths and run build_andriod.sh (the spelling is wrong, but it works :) ).
Let me know how it goes.
There is no way you can achieve this with the current Android API.
You need to encode the frames using an encoder in C++ and pass your bitmaps to the encoder via JNI.
You can start with MoboPlayer's ffmpeg port. You may find the download link to their ffmpeg port at the bottom of this page.
If you have the image sequence as Bitmaps, you can access the Bitmap's buffer from JNI using the AndroidBitmap_* methods and pass it on to ffmpeg for encoding.
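A minimal sketch of the JNI side, assuming a hypothetical encode_frame() wrapper around your ffmpeg-based encoder (the Java class and method names are illustrative; only the AndroidBitmap_* calls are real NDK API, and you need to link with -ljnigraphics):

#include <jni.h>
#include <android/bitmap.h>

/* encode_frame() is a hypothetical wrapper around your ffmpeg encoding code. */
extern int encode_frame(const void *pixels, int width, int height, int stride);

JNIEXPORT jint JNICALL
Java_com_example_Encoder_encodeBitmap(JNIEnv *env, jobject thiz, jobject bitmap)
{
    AndroidBitmapInfo info;
    void *pixels;

    /* Query the Bitmap's width, height, stride and pixel format. */
    if (AndroidBitmap_getInfo(env, bitmap, &info) < 0)
        return -1;

    /* Lock the Bitmap's buffer so native code can read its pixels. */
    if (AndroidBitmap_lockPixels(env, bitmap, &pixels) < 0)
        return -1;

    /* Hand the raw pixel buffer to the (hypothetical) encoder. */
    int ret = encode_frame(pixels, info.width, info.height, info.stride);

    AndroidBitmap_unlockPixels(env, bitmap);
    return ret;
}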
I have successfully ported the ffmpeg library to Android using Bambuser's ffmpeg port. I'm currently investigating ffmpeg's source code, especially the ffplay.c and api-examples.c files.
I want to extract elementary streams from videos recorded on Android 2.2. For example, I can record an H.263-encoded video in an MPEG-4 container; let's say a test.mp4 file.
What I want to achieve is to extract the H.263 elementary video stream from the test.mp4 file into something like test.h263.
It can be extracted using the ffmpeg CLI, like so:
ffmpeg -i test.mp4 -vcodec copy test.h263
But unfortunately I don't know how to extract an elementary stream from a container using ffmpeg's API.
Thanks.