I can access and view RTSP streams from IP cameras on Android via the VideoView component without problems.
Now I need to play the RTSP stream with a delay: if I specify a 30-second delay, the playback on screen should be 30 seconds behind the source. The delay needs to be variable, though only at the point of connecting to the source, not during playback.
I originally thought this would not be a problem, as I could simply change the RTSP buffer duration before connecting to the camera, but unfortunately it seems the buffer size is baked into the firmware and cannot be changed in software. Now I have a horrible feeling that my way forward will be to compile a version of FFmpeg for Android, somehow get the stream data out of the library, buffer it, and then render it myself, and I have no experience with FFmpeg.
I am unsure how I would now go about solving this problem and any help or pointers in the right direction would be greatly appreciated.
Update:
Sorry, I forgot to mention: the RTSP stream is being accessed over Wi-Fi on a LAN, so there are no huge latency issues here from going over the Internet.
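If you do end up demuxing the stream yourself (with FFmpeg or otherwise), the delay itself is simple to add: interpose a queue between the reader and the renderer that timestamps packets on arrival and only releases them once the configured delay has elapsed. The sketch below is a minimal, hypothetical illustration in plain Java; `DelayBuffer` and the `byte[]` packet representation are my own stand-ins, not part of any Android or FFmpeg API, and time is passed in explicitly so the logic is easy to test.

```java
import java.util.ArrayDeque;

// Minimal sketch of a delay line for stream packets.
// byte[] stands in for whatever unit your demuxer produces.
public class DelayBuffer {
    private static class Entry {
        final long arrivalMs;
        final byte[] packet;
        Entry(long arrivalMs, byte[] packet) {
            this.arrivalMs = arrivalMs;
            this.packet = packet;
        }
    }

    private final long delayMs; // fixed at connect time, per the requirement
    private final ArrayDeque<Entry> queue = new ArrayDeque<>();

    public DelayBuffer(long delayMs) {
        this.delayMs = delayMs;
    }

    // Called by the network/reader thread as packets arrive.
    public synchronized void push(byte[] packet, long nowMs) {
        queue.addLast(new Entry(nowMs, packet));
    }

    // Called by the renderer; returns null until the head packet is old enough.
    public synchronized byte[] poll(long nowMs) {
        Entry head = queue.peekFirst();
        if (head == null || nowMs - head.arrivalMs < delayMs) {
            return null;
        }
        return queue.pollFirst().packet;
    }
}
```

Memory-wise this is tractable on the LAN: 30 seconds of a camera stream at, say, 4 Mbit/s is only about 15 MB, though a real implementation should still bound the queue.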
Related
I need to stream RTSP video from an IP camera on the local network to my Android app. It's very easy to use VideoView and play it as a URL, or to use a SurfaceView and play the stream on it with the native MediaPlayer. But when I stream that way, I get a 6-second delay while my phone buffers the video. From what I've read, there is no way to change the buffer size of MediaPlayer. Yet I've seen several apps that stream video from my camera in almost real time. I've read a lot about this, since I'm not the first to encounter the problem, but didn't find any useful information.
Many thanks for any help!
I'm using vlc-android; it works well for playing my cameras' RTSP links:
https://github.com/mrmaffen/vlc-android-sdk#get-it-via-maven-central
The delay is about 1 second.
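For reference, pulling the SDK in via Gradle looks roughly like this; the coordinates come from the README linked above, but the version shown is a placeholder, so check Maven Central for the current one:

```groovy
dependencies {
    implementation "de.mrmaffen:vlc-android-sdk:<latest-version>"
}
```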
I'm using OpenSL ES on Android to decode and play an MP3 file using an AndroidSimpleBufferQueue. This works fine until you connect a Bluetooth headset, at which point performance slows to a crawl, and playback is very slow and stutters.
Any ideas on how to fix this?
edit:
I seem to have narrowed it down to the buffer queue portion of it. If I just load and play the sound file, it seems to work fine, but with a buffer queue in the middle, performance is terrible. I need the buffer queue to manipulate the data, so any help figuring this out is appreciated.
edit2: It appears that the minimum acceptable buffer size changes when Bluetooth is connected. I was using a 4096 buffer, but to get acceptable, uninterrupted audio playback with a Bluetooth headset I had to bump it up to 16384. The resulting latency, however, is unacceptable for my needs, so I'm still looking for suggestions on how to improve this.
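For context, buffer size maps directly onto added latency: each buffer contributes its duration (size divided by sample rate) to the output delay. Assuming the 4096 and 16384 figures are in frames and the stream is 44.1 kHz (both assumptions; the sizes could also be in bytes, which would quarter these numbers for 16-bit stereo), a quick calculation shows why 16384 feels sluggish:

```java
public class BufferLatency {
    // Latency contributed by one buffer, in milliseconds,
    // assuming the size is expressed in frames (samples per channel).
    static double latencyMs(int frames, int sampleRateHz) {
        return 1000.0 * frames / sampleRateHz;
    }

    public static void main(String[] args) {
        System.out.printf("4096 frames  @ 44100 Hz: %.1f ms%n", latencyMs(4096, 44100));
        System.out.printf("16384 frames @ 44100 Hz: %.1f ms%n", latencyMs(16384, 44100));
    }
}
```

Under those assumptions the jump is from roughly 93 ms to roughly 372 ms per buffer, before any Bluetooth (A2DP) transport latency is added on top.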
I am evaluating the possibility of displaying a continuous H.264 live feed (RTSP) on an Android device (2.3+ or even 4.0). I need the delay time (source to display; assume the source encoding adds zero delay) to be within about 1 second. I wonder whether anybody has done this already. What would be a good approach to achieve it?
Thanks in advance
Michael
Maybe you could just give it a try. H.264 is supported by Android and is even hardware-accelerated on the latest generation of upper- and middle-class devices. There is the MediaPlayer class, which can be used to play video files and also supports streaming from HTTP and RTSP URIs.
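A minimal sketch of that approach, assuming you render into a SurfaceView and substituting a placeholder camera URI (the address is hypothetical; use your camera's actual RTSP URL):

```java
import android.media.MediaPlayer;
import android.view.SurfaceHolder;
import java.io.IOException;

public class RtspPlayback {
    // holder: the SurfaceHolder of a SurfaceView in your layout.
    public static MediaPlayer play(SurfaceHolder holder) throws IOException {
        MediaPlayer player = new MediaPlayer();
        player.setDisplay(holder);                        // render into the SurfaceView
        player.setDataSource("rtsp://192.0.2.1/stream");  // placeholder camera URI
        player.setOnPreparedListener(MediaPlayer::start); // start once buffering allows
        player.prepareAsync();                            // keep the UI thread free
        return player;
    }
}
```

Whether this meets the 1-second target is the open question: as the other threads here note, MediaPlayer's internal buffering is fixed by the platform, so measure the actual latency on your target devices before committing to it.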
I've rolled my own RTSP server and am currently streaming device to device; however, the VideoView seems to buffer 10 seconds' worth of data, regardless of the other parameters being set (either in code or in the SDP file).
Is there any way to reduce this?
There's not. It's embedded in the OS and from what I can tell there aren't any alternatives out there.
I'm currently working on a project that involves RTSP streaming from an IP camera to an Android device.
The phone and ip camera are connected to the same access point / router.
The problem is that the stream has a very big delay of about 5 seconds. For watching a stream from the Internet I assume that amount of buffering is fine, but for my kind of application it's just unacceptable; it's not real time anymore, so it's useless. Of course, just to be sure the camera isn't to blame, I tested my IP camera's stream on my PC in VLC with the cache set to 0, and it works perfectly.
I didn't find any property on the VideoView class related to my problem, so I started looking in the OpenCore sources, hoping to find something I could modify to reduce the cache/buffer for RTSP. I tried to understand how they work, but they are very complicated and I didn't manage to.
Now I'm stuck at this point in my project and can't seem to find a way out... and the application's deadline is coming very fast. :|
If anyone has any idea how to get this resolved, please help me, because I'm kind of desperate right now.
As stated here, the buffer size for the standard VideoView is hardcoded into the firmware. So you will have to use a custom library like Vitamio to get some control over the buffer size (more on that particular matter in their tutorial).
Set the buffer size to 1000 before you start playing:
mVideoView.setBufferSize(1000);