I want to stream live webcam video and display it on an Android device with the lowest latency possible. I am streaming the video from a computer over the RTSP protocol, and I am able to watch the stream with ~150 ms latency on a second computer using the command ffplay -fflags nobuffer rtsp://xxx.xxx.xxx.xxx:port/test.sdp.
So far I have successfully compiled the ffmpeg4android library. Unfortunately this library uses the NDK, which I am not familiar with. All I want is to invoke the same command and display the video on an Android SurfaceView.
How can I invoke such a command?
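For reference, a minimal sketch of how a command is typically run through the ffmpeg4android wrapper, without touching the NDK yourself (LoadJNI and its run() signature are taken from the library's demo project; verify them against your version, and the output path below is just a hypothetical example). Keep in mind the wrapper runs ffmpeg, not ffplay, so it transcodes rather than displays; rendering to a SurfaceView still needs a separate decoder/player component:

    // Hypothetical sketch: invoke an ffmpeg command via ffmpeg4android's wrapper.
    import com.netcompss.loader.LoadJNI;

    LoadJNI vk = new LoadJNI();
    String[] cmd = {"ffmpeg", "-fflags", "nobuffer",
                    "-i", "rtsp://xxx.xxx.xxx.xxx:port/test.sdp",
                    "-vcodec", "copy", "/sdcard/videokit/out.mp4"}; // output path is a placeholder
    vk.run(cmd, "/sdcard/videokit/", getApplicationContext());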
I am developing a VoIP app for iOS/Android.
At first I used the H264 codec for video.
When I make a video call the video is black; in the log I see many dropped packets, so the first frame cannot be decoded.
So I built the iOS linphone library with x264, and it works well.
Video now appears quickly after the call is accepted.
But on Android I still see the black screen, because it is still using the original H264 codec.
I want to use the x264 video codec in Android linphone.
I am using the latest Android linphone source code.
Here are my build options:
./prepare.py -DENABLE_GPL_THIRD_PARTIES=NO -DENABLE_NON_FREE_CODECS=ON -DENABLE_VPX=NO -DENABLE_VCARD=NO -DENABLE_DOC=NO -DENABLE_OPENH264=ON -DENABLE_X264=ON
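On the app side, it may also help to make sure H264 is the only enabled video codec at runtime, so the x264-built encoder gets negotiated. A hedged sketch, assuming the older org.linphone.core Java wrapper (method names may differ in your version):

    // Hypothetical sketch using the old org.linphone.core Java wrapper.
    LinphoneCore lc = LinphoneManager.getLc();
    try {
        for (PayloadType pt : lc.getVideoCodecs()) {
            // enable H264 only, disable everything else
            lc.enablePayloadType(pt, "H264".equals(pt.getMime()));
        }
    } catch (LinphoneCoreException e) {
        e.printStackTrace();
    }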
For a school project, my partner and I are trying to push video from an Android tablet (Nexus 7) to a server (as an IP webcam), pull it from the server into OpenCV 2.4.6, process it, send it back to the server, and have the tablet display the feed in real or near-real time.
We aren't using OpenCV for Android because the goal is for a remote user to decide how to process the video (i.e. selecting a template to match, or something similar, from the stream).
Our question is this: we have managed to get the Android webcam stream onto a server as an H264 RTSP stream, but all the documentation on how to pull an RTSP stream is either outdated, really confusing, or altogether non-existent. We tried using a VideoCapture object and then tried cvCreateFileCapture, but neither seems to work. How do we do this?
There is an open source project that does precisely this combination:
Android generating the content from the front or rear camera
RTSP protocol for streaming
H264
https://code.google.com/p/spydroid-ipcamera/
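For the pulling half of the question: when OpenCV is built with ffmpeg support, VideoCapture can usually open an RTSP URL directly. A minimal sketch using the Java bindings (the String constructor lives in the 3.x videoio bindings; with 2.4.6 the equivalent C++ call is cv::VideoCapture cap(url), and the URL below is a placeholder):

    // Minimal sketch: pull an RTSP stream frame-by-frame with OpenCV.
    import org.opencv.core.Core;
    import org.opencv.core.Mat;
    import org.opencv.videoio.VideoCapture;

    public class RtspPull {
        public static void main(String[] args) {
            System.loadLibrary(Core.NATIVE_LIBRARY_NAME);
            VideoCapture cap = new VideoCapture("rtsp://your.server:8554/stream");
            if (!cap.isOpened()) {
                System.err.println("Failed to open stream; check ffmpeg support in your OpenCV build.");
                return;
            }
            Mat frame = new Mat();
            while (cap.read(frame)) {
                // process(frame): template matching etc. goes here
            }
            cap.release();
        }
    }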
I want to develop an Android application which lets me continuously record a video and upload parts of it to a server without stopping the recording.
It is crucial for the application that I can record up to 60 min without stopping the video.
Initial approach
The application consists of two parts:
A MediaRecorder which continuously records video from the camera.
A cutter/copy part: while the video is being recorded, I have to take out certain segments and send them to a server.
This part was implemented using libffmpeg.so from http://ffmpeg4android.netcompss.com/. I used their VideoKit wrapper, which lets me run ffmpeg directly with any params I need.
My Problem
I tried the ffmpeg command with the params
ffmpeg -ss 00:00:03 -i <in-file> -t 00:00:05 -vcodec copy -acodec copy <out-file>
which worked great for me once Android's MediaRecorder had finished recording.
When I execute the same command while the MediaRecorder is still recording the file, ffmpeg exits with the error message "Operation not permitted".
I don't think the error message means that Android is preventing access to the file. I think ffmpeg needs the moov atom to find the proper positions in the video, and the MP4 muxer only writes the moov atom when the recording is finalized.
For that reason I thought of other approaches (which don't need the moov atom):
Create an RTSP stream with Android and access the stream later. The problem is that, to my knowledge, the Android SDK doesn't support recording to an RTSP stream.
Maybe it is possible to access the camera directly with ffmpeg (/dev/video0 seems to be a video device?!).
I read about WebM as an alternative for streaming; maybe Android can record WebM streams?!
TL;DR: I want to access a video file with ffmpeg (libffmpeg.so) while it is still being recorded. ffmpeg exits with the error message "Operation not permitted".
Goal:
My goal is to record a video (and audio) and, while it is still recording, take parts of the video and upload them to the server.
Maybe you can help me solve the problem, or you have other ideas on how to approach it.
Thanks a lot in advance.
Your real-time requirement may lead you away from ffmpeg towards WebRTC and/or HTML5.
Some resources:
http://dev.w3.org/2011/webrtc/editor/getusermedia.html (section 5)
https://github.com/lukeweber/webrtc-jingle-client
Ondello: they have an API.
Rather than going native and trying to get at the video stream, or grabbing the framebuffer to acquire a copy of what is in the video buffer and then duplicating the stream and managing a connection (socket or chunked HTTP), you may want to look at API-type alternatives.
I'm trying to build an Android application that takes in an RTSP stream and displays the audio/video. I've been using VideoView and still get a delay of 3 to 10 seconds from real time. I need this delay to be under 3 seconds.
On a PC, I can watch the same RTSP stream in VLC with only a 1-2 second delay. How can I replicate this on Android? Even when I use other apps like MoboPlayer/RockPlayer, the delay is still 3 to 10 seconds. (If it matters, I'm connecting to the RTSP stream wirelessly on both the PC and Android.)
I've started looking into FFmpeg for Android as well as the GStreamer SDK for Android as alternatives to MediaPlayer, but they're both hard to work with for a novice like myself, and I'm running into multiple problems.
Any and all information would be greatly appreciated!
First, I think the delay comes from the initial buffer size, which is not controllable on Android's hardware MediaPlayer. It can be avoided by using a custom software media player such as FFmpeg (now avconv) or GStreamer, but that will cost you performance (depending on the video stream's compression, resolution and fps) and device battery life.
And of course native development using FFmpeg or a similar C/C++ framework is complex and requires a lot of time and experience.
Another option: you can try the new Android 4.1 Java API for accessing the video and audio decoders, and build your own RTSP player using that API plus some code to load the video stream from the RTSP server (a Java library for this probably already exists).
Android 4.1 Multimedia
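To give an idea of what that API looks like, here is a minimal decoder sketch (a sketch only: it assumes you already get depacketized H.264 access units from your own RTSP/RTP code, which is the hard part and is not shown; buffer handling uses the original API 16 style):

    // Minimal sketch: decode H.264 access units to a Surface with MediaCodec (API 16).
    import java.nio.ByteBuffer;
    import android.media.MediaCodec;
    import android.media.MediaFormat;
    import android.view.Surface;

    public class H264SurfaceDecoder {
        private MediaCodec decoder;

        public void start(Surface surface, int width, int height) throws Exception {
            decoder = MediaCodec.createDecoderByType("video/avc");
            decoder.configure(MediaFormat.createVideoFormat("video/avc", width, height),
                              surface, null, 0);
            decoder.start();
        }

        // Feed one access unit from your RTP depacketizer and render any ready output.
        public void decode(byte[] accessUnit, long ptsUs) {
            int in = decoder.dequeueInputBuffer(10000);
            if (in >= 0) {
                ByteBuffer buf = decoder.getInputBuffers()[in];
                buf.clear();
                buf.put(accessUnit);
                decoder.queueInputBuffer(in, 0, accessUnit.length, ptsUs, 0);
            }
            MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
            int out = decoder.dequeueOutputBuffer(info, 10000);
            if (out >= 0) {
                decoder.releaseOutputBuffer(out, true); // true = render to the Surface
            }
        }
    }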
I'm trying to install a Wowza server on my Linux machine to enable RTSP streaming for my Android application.
On the Android client side, what sort of changes do I need to make in my application? I'm using VideoView to simply play a video file stored locally.
Now I want the video content to be streamed through the server I've installed. If necessary I can move to another streaming server, as I'm currently researching streaming servers.
For RTSP streaming you can also try the following servers:
Darwin Streaming Server - a Linux package is available
Windows Media Services - can be installed on a Windows Server trial
VLC - standalone application
For testing purposes I would also recommend using existing mobile video services like:
m.youtube.tv
m.wp.tv
You can extract video links from those sites and use them to test your application.
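On the client side, if you stay with VideoView, pointing it at the stream URL instead of a local file is usually the only change needed. A minimal sketch (the URL and view id are placeholders for your own setup):

    // Minimal sketch: play an RTSP stream with the same VideoView used for local files.
    final VideoView videoView = (VideoView) findViewById(R.id.video_view);
    videoView.setVideoURI(Uri.parse("rtsp://your.server:1935/live/myStream"));
    videoView.setMediaController(new MediaController(this));
    videoView.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
        @Override
        public void onPrepared(MediaPlayer mp) {
            videoView.start(); // start as soon as the stream is buffered
        }
    });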
Try following the Android ApiDemos; you can find a video streaming player example at:
...android-sdk-windows\platforms\android-x\samples\ApiDemos\src\com\example\android\apis\media\MediaPlayerDemo_Video.java
VLC+Android Owns.
I used the following one-liner to stream video of our kittens to our cell phones.
We used the free launchRTSP app, which leverages Android's built-in RTSP viewing capabilities, to access the URL over the internet.
You may want to tweak the frame rate and such. As shown below, it's perfect for webcam streaming.
vlc -vvvvvvvvvvvvvvvvvvvvvvv -I dummy v4l2://:vdev=/dev/video:width=640:height=480:fps=2 --sout "#transcode{vcodec=mp4v,fps=5,vb=800,acodec=mpga,samplerate=8000,ab=64,deinterlace,channels=1,sfilter='mosaic:marq{marquee=%m-%d-%Y_%H:%M:%S,size=16,color=16711680,position=5,opacity=64}'}:rtp{sdp=rtsp://0.0.0.0:5858/kittens.sdp}"
The WCS4 server can deliver a WebRTC stream as RTSP.
So you can send a WebRTC live stream from Android or a desktop Chrome/Firefox browser and then connect to this stream via VLC or on Android over RTSP.