I am creating an app to send audio to an Icecast server. I have connected to the server and am sending it the audio data. The stream shows up on the server, but it does not play in the browser. Why?
Do I need to encode the data to a specific format?
If you're having problems with a web-based player you need to be a lot more specific about what you're trying to do and provide some code for us to look at.
You should also confirm the stream is playing correctly in a compatible desktop app first.
Go to http://example.com:8000/admin and verify that the mount point you're sending to is listed (which indicates it has a connected source). If it is, there will be an M3U (for streaming MP3) and an XSPF (for Ogg) link on the right which you can use to test.
Playback of Icecast streams on Android requires installation of the Just Playlists app or an equivalent.
The problem was that I was sending raw PCM audio data to the server. Once I encoded that PCM data to Ogg before sending it, it worked perfectly.
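A minimal sketch of that encoding step, assuming ffmpeg is available and the input is 16-bit little-endian PCM; the sample rate, channel count, mount point, and password below are placeholders, not values from the question:

```shell
# Placeholders: adjust to your capture format and Icecast credentials.
RATE=44100
CHANNELS=2
MOUNT="icecast://source:hackme@127.0.0.1:8000/stream.ogg"

# Read raw s16le PCM from stdin, encode it to Ogg/Vorbis, and push the
# result to the Icecast mount point.
CMD="ffmpeg -f s16le -ar $RATE -ac $CHANNELS -i - -c:a libvorbis -content_type application/ogg -f ogg $MOUNT"
echo "$CMD"
```

The key point is the `-c:a libvorbis ... -f ogg` pair: Icecast relays whatever bytes it receives, so the browser only plays the mount if the stream is already in a format it understands.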
The problem is with the audio encoding. Set up the encoding as follows.
List the available capture devices:
ffmpeg -list_devices true -f dshow -i dummy
Set the input from a device or another protocol.
Example:
ffplay -f dshow -i audio="Microphone (FaceCam 1000X)"
Then stream
ffmpeg -stats -report -f dshow -i audio="Microphone (FaceCam 1000X)" -c:a flac -compression_level 10 -ar 192000 -legacy_icecast 1 -content_type application/ogg -ice_name "Optional Name" -f ogg icecast://source:password_for_streaming@127.0.0.1:8000/live.ogg
Related
I'm trying to send the video stream from a GoPro to Android.
I connect to and manage the GoPro via HTTP thanks to KonradIT's guide: https://github.com/KonradIT/goprowifihack
We manage the streaming with the ffmpeg-android library:
http://writingminds.github.io/ffmpeg-android-java/
but I can only see information about the frames. I can send the stream to my computer (attached to the GoPro) and view it with VLC.
The ffmpeg command launched from the Android application is:
ffmpeg -an -f mpegts -i udp://:8554 -an -f mpegts udp://10.5.5.101:8555
But we want to see the stream directly in the Android app.
How can we do that?
I'm trying to encode a local/static input file (an MP4, for example) into a smaller video file (by resizing, lowering the video quality, etc.) and stream it in parallel (i.e. I can't wait for the encoding process to finish before streaming it back), so it can be played by an Android client (the standard Android video player).
So I've tried using ffmpeg as follows:
ffmpeg -re -i input.mp4 -g 52 -acodec libvo_aacenc -ab 64k -vcodec libx264 -vb 448k -f mp4 -movflags frag_keyframe+empty_moov -
Notice I'm using stdout as the output, so I can run ffmpeg and stream its output on the fly.
However, such methods (and other similar ones) don't seem to work on Android: it simply can't play "non-standard" files such as fragmented MP4s once it receives them. It seems like the empty moov atom messes it up.
I also tried other container formats, such as 3GPP and WebM.
I'd love to hear any kind of input on this issue...
Thanks
You can specify multiple outputs in ffmpeg; see http://trac.ffmpeg.org/wiki/Creating%20multiple%20outputs
For Android newer than 3.0, try HLS as an output.
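A hedged sketch of what that HLS output could look like, assuming an ffmpeg build with libx264 and AAC support; the input file name, segment length, and playlist name are placeholders:

```shell
# Placeholder input file.
INPUT="input.mp4"

# Transcode to baseline H.264 + AAC and cut the result into HLS
# segments; -hls_list_size 0 keeps every segment in the playlist,
# so the playlist grows while encoding is still in progress.
CMD="ffmpeg -re -i $INPUT -c:v libx264 -profile:v baseline -c:a aac -f hls -hls_time 10 -hls_list_size 0 stream.m3u8"
echo "$CMD"
```

The client then points at `stream.m3u8` over plain HTTP; no fragmented-MP4 support is required on the device.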
I want to develop an Android application which allows me to continuously record a video and upload parts of it to a server without stopping the recording.
It is crucial for the application that I can record up to 60 min without stopping the video.
Initial approach
The application consists of two parts:
A MediaRecorder, which records video continuously from the camera.
A cutter/copy part: while the video is being recorded, I have to take out certain segments and send them to a server.
This part was implemented using libffmpeg.so from http://ffmpeg4android.netcompss.com/. I used their VideoKit wrapper, which allows me to run ffmpeg directly with any parameters I need.
My Problem
I tried the ffmpeg command with the params
ffmpeg -ss 00:00:03 -i <in-file> -t 00:00:05 -vcodec copy -acodec copy <out-file>
which worked great for me, as long as Android's MediaRecorder had finished recording.
When I execute the same command while the MediaRecorder is still recording the file, ffmpeg exits with the error message "Operation not permitted".
I don't think the error message means that Android prevents access to the file. I think ffmpeg needs the moov atom to find the proper position in the video.
For that reason I thought of other approaches (which don't need the moov-atom):
Create an RTSP stream with Android and access the RTSP stream later. The problem is that, to my knowledge, the Android SDK doesn't support recording to an RTSP stream.
Maybe it is possible to access the camera directly with ffmpeg (/dev/video0 seems to be a video device?).
I read about WebM as an alternative for streaming; maybe Android can record WebM streams?
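One more alternative worth sketching, since it sidesteps the moov-atom problem entirely: if ffmpeg itself does the capture, its segment muxer can close each chunk as a standalone playable file, so finished segments can be uploaded while recording continues. The input name below is a placeholder for whatever source ffmpeg can actually open on the device:

```shell
# Placeholder input; on a real device this might be a camera device
# node or a pipe fed by the app.
INPUT="input_source"

# Split the recording into self-contained 10-second MP4 segments.
# Each finished segment gets its own moov atom and can be uploaded
# immediately; -reset_timestamps 1 makes every segment start at t=0.
CMD="ffmpeg -i $INPUT -c copy -f segment -segment_time 10 -reset_timestamps 1 out%03d.mp4"
echo "$CMD"
```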
TL;DR:
I want to access a video file with ffmpeg (libffmpeg.so) while it is still being recorded. ffmpeg exits with the error message "Operation not permitted".
Goal:
My goal is to record a video (and audio) and take parts of the video while it is still recording and upload them to the server.
Maybe you can help me solve the problem, or you have other ideas on how to approach it.
Thanks a lot in advance.
Your real-time requirement may lead you away from ffmpeg toward WebRTC and/or HTML5.
Some resources:
http://dev.w3.org/2011/webrtc/editor/getusermedia.html (section 5)
https://github.com/lukeweber/webrtc-jingle-client
Ondello: they have an API.
Rather than going native and trying to get at the video stream, or grabbing a copy of what is in the video buffer from the framebuffer and then duplicating the stream and managing a connection (socket or chunked HTTP) yourself, you may want to look at API-type alternatives.
I have spent quite a while (the past week) trying this, to little avail. What I want seems completely unheard of. So far, I have reviewed the recommendations available through Google, which include encoding a static file into multiple static files in different formats, and creating an m3u8 playlist that hosts static files (files which get added to the playlist as streaming continues).
I have also seen ideas involving RTMP, RTSP, etc., which are completely out of the question because of their incompatibility.
Ideally, I would have one webpage that would link to the stream (http://server/video.mp4) and/or show it in a webpage (via the video tag). With that in mind, the most likely format would be h264+aac in mp4 container.
Unfortunately (probably because the file has no duration metadata) it does not work. I can open and play the stream in a desktop player (such as VLC), but my iPhone and Android devices both give their respective "Can't be played" messages.
I don't think the problem is the devices' ability to stream, for I have made a streaming SHOUTcast server work just fine (MP3 only).
Currently, the closest I have come is with the following setup on my win32 machine:
FFmpeg command: ffmpeg -f dshow -i video="Logitech Webcam 200":audio="Microphone (Webcam 200)" -b:v 180k -bt 240k -vcodec libx264 -tune zerolatency -profile:v baseline -preset ultrafast -r 10 -strict -2 -acodec aac -ac 2 -ar 48000 -ab 32k -f flv "udp://127.0.0.1:1234"
VLC: stream from udp://127.0.0.1:1234 to http:// :8080/video.mp4 (no transcoding), basically just to convert the UDP stream into an HTTP-accessible stream.
Any hints or suggestions would be warmly welcomed!
Sorry, I'm trying to understand your question.
It seems you're trying to play MP4 on both Android and iPhone from your server via HTTP, right?
Do you have a streaming server, or are you simply trying to have the phone pull the file from your server?
If you don't have one, I suggest checking out Darwin Streaming Server (http://justdevelopment.blogspot.com/2009/10/video-streaming-with-android-phone.html).
It will allow you to set up your video to stream with the right encodings needed for each device.
Let me know if that helps.
I have the following stream: mms://77.238.11.5:8080. You can access it using Windows Media Player.
I can't find any way to view it on Android devices using MediaPlayer or VideoView, so my idea is to convert it using VLC or FFmpeg to a different format like MP4.
You can use ffmpeg to convert the stream to a file:
ffmpeg -i "mmsh://77.238.11.5:8080/" output.mp4
The -t duration option limits the length of the video, if needed.
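For example, a hedged one-liner that grabs only the first minute of the stream (the duration here is just an illustration):

```shell
# Convert the first 60 seconds of the MMS stream to an MP4 file.
CMD="ffmpeg -i mmsh://77.238.11.5:8080/ -t 00:01:00 output.mp4"
echo "$CMD"
```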
They say that VPlayer can play such streams.