I'm trying to develop an Android app that streams video from an IP camera. I finally got the video from the IP camera to display in the app using a VideoView with a MediaController, but this approach seems to have some drawbacks. Please help me, thank you!
The functionalities that I need to improve:
1) Reduce the playback delay between the camera and the video displayed in the app
2) Record the live stream, allowing playback to be viewed from 10 seconds before to 10 seconds after
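For context, a minimal sketch of the VideoView-with-MediaController setup described above might look like this (the layout, view ID, and camera URL are hypothetical):

    import android.app.Activity;
    import android.net.Uri;
    import android.os.Bundle;
    import android.widget.MediaController;
    import android.widget.VideoView;

    public class CameraActivity extends Activity {
        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            setContentView(R.layout.activity_camera);                           // hypothetical layout with a VideoView
            VideoView videoView = (VideoView) findViewById(R.id.video_view);    // hypothetical view ID

            MediaController controller = new MediaController(this);
            controller.setAnchorView(videoView);
            videoView.setMediaController(controller);

            // Hypothetical camera address; the underlying MediaPlayer buffers the
            // stream before playback starts, which is the source of the delay in point 1.
            videoView.setVideoURI(Uri.parse("rtsp://192.168.1.10:554/stream"));
            videoView.start();
        }
    }

VideoView gives no control over the underlying MediaPlayer's buffering, which is why the delay in point 1 is hard to reduce with this setup.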
I have developed a screen-recording application. Now I want to add live streaming while recording the screen, with the live stream going to YouTube. From my research I have only found how to live stream from the camera or how to simply open a live streaming intent.
Can you guys guide me in the right direction? It would be very helpful.
Thank you for your time
You need a media server for live streaming; please read here for details:
Android (publisher) --RTMP--> SRS/Nginx --RTMP/HLS--> Android (player)
                              (YouTube)
The media server (SRS/Nginx) is actually similar to a YouTube-like platform.
For the live streaming publisher or player, you can use FFmpeg, OBS, or ijkplayer; for more information, please read this link.
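For the player side, a rough sketch of playing the RTMP/HLS output with ijkplayer might look like the following (the stream URL and surface wiring are assumptions, not part of the original answer):

    import android.view.SurfaceHolder;
    import java.io.IOException;
    import tv.danmaku.ijk.media.player.IjkMediaPlayer;

    public final class IjkPlayback {
        // Attach an ijkplayer instance to an existing SurfaceHolder and start playback.
        public static IjkMediaPlayer play(SurfaceHolder holder, String url) throws IOException {
            IjkMediaPlayer player = new IjkMediaPlayer();
            player.setDataSource(url);                      // e.g. "rtmp://your-server/live/stream" (hypothetical)
            player.setDisplay(holder);                      // render onto the app's SurfaceView
            player.setOnPreparedListener(mp -> mp.start()); // start as soon as the stream is ready
            player.prepareAsync();                          // prepare off the UI thread
            return player;
        }
    }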
I need to stream RTSP video from an IP camera on the local network to my Android app. It's very easy to use a VideoView and play the stream as a URL, or a SurfaceView and play the stream on it with the native MediaPlayer. But when I stream that way, I get a 6-second delay while my phone buffers the video. As far as I have read, there is no way to change the buffer size of MediaPlayer. But I have seen several apps that stream video from my camera in almost real time. I've read a lot about this, because I'm not the first one to encounter this problem, but I didn't find any useful information.
Many thanks for any help!
I'm using vlc-android; it works well for playing my cameras' RTSP links:
https://github.com/mrmaffen/vlc-android-sdk#get-it-via-maven-central
The delay is about 1 second.
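A minimal sketch of wiring that SDK up for low-latency RTSP playback, assuming the LibVLC 3.x Android API (the caching option value, view ID, and camera URL below are assumptions, not taken from the answer):

    import android.app.Activity;
    import android.net.Uri;
    import android.os.Bundle;
    import android.view.SurfaceView;
    import java.util.ArrayList;
    import org.videolan.libvlc.LibVLC;
    import org.videolan.libvlc.Media;
    import org.videolan.libvlc.MediaPlayer;

    public class VlcStreamActivity extends Activity {
        private LibVLC libVlc;
        private MediaPlayer mediaPlayer;

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            setContentView(R.layout.activity_vlc_stream);                             // hypothetical layout
            SurfaceView surfaceView = (SurfaceView) findViewById(R.id.surface_view);  // hypothetical view ID

            ArrayList<String> options = new ArrayList<>();
            options.add("--network-caching=150"); // small network buffer (ms) to keep latency low

            libVlc = new LibVLC(this, options);
            mediaPlayer = new MediaPlayer(libVlc);
            mediaPlayer.getVLCVout().setVideoView(surfaceView);
            mediaPlayer.getVLCVout().attachViews();

            Media media = new Media(libVlc, Uri.parse("rtsp://192.168.1.10:554/stream")); // hypothetical camera URL
            mediaPlayer.setMedia(media);
            media.release();
            mediaPlayer.play();
        }

        @Override
        protected void onDestroy() {
            super.onDestroy();
            mediaPlayer.stop();
            mediaPlayer.getVLCVout().detachViews();
            mediaPlayer.release();
            libVlc.release();
        }
    }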
I have Wowza Media Server running for live video streaming.
When I view the live video using RTMP and HLS streaming with the Wowza examples, which include live video players for Flash and iOS, I am able to view the video with both. But whenever the camera is moved, the RTMP URL shows the live video without any delay, while the HLS stream shows a delay of 10 seconds.
Then I tried running a mobile application using Cordova (PhoneGap) for iOS devices. I am using the HTML video tag in the Cordova application and I am able to view the live video on the iPad simulator using HLS streaming, but whenever the camera moves there is a delay of 25 seconds while viewing the live video on the iPad.
Can someone please let me know what configuration needs to be done on the Wowza server side to reduce this delay in live video streaming for iOS devices?
And also, can someone please advise any player other than the HTML video tag for the Cordova application?
Three chunks are required by iOS devices before streaming begins, and each chunk is 10 seconds long by default. If you use a keyframe interval of one keyframe per second, you can lower cupertinoChunkDurationTarget to 1 second (1000) and get the latency down to closer to 3 seconds.
Please check here for more: http://www.wowza.com/forums/content.php?88-Cupertino-Streaming-segmenter-parameters-%28iOS
I tried running the code in this link,
but it displays "application has stopped". I have been looking for an example of using a SurfaceView for live video streaming on Android, and I finally found this one, but it is not working.
I need live video streaming: I'm streaming from an IP camera and I would like the stream to be in real time.
Can someone tell me what the problem might be, or help me with another example?
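In case it helps, here is a minimal sketch of playing an RTSP stream on a SurfaceView with the platform MediaPlayer; it is not the code from the linked example, and the camera URL and layout IDs are hypothetical:

    import android.app.Activity;
    import android.media.MediaPlayer;
    import android.os.Bundle;
    import android.view.SurfaceHolder;
    import android.view.SurfaceView;
    import java.io.IOException;

    public class SurfaceStreamActivity extends Activity implements SurfaceHolder.Callback {
        private MediaPlayer mediaPlayer;

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            setContentView(R.layout.activity_surface_stream);                        // hypothetical layout
            SurfaceView surfaceView = (SurfaceView) findViewById(R.id.surface_view); // hypothetical view ID
            surfaceView.getHolder().addCallback(this);                               // wait for the surface before playing
        }

        @Override
        public void surfaceCreated(SurfaceHolder holder) {
            try {
                mediaPlayer = new MediaPlayer();
                mediaPlayer.setDisplay(holder);
                mediaPlayer.setDataSource("rtsp://192.168.1.10:554/stream");  // hypothetical camera URL
                mediaPlayer.setOnPreparedListener(MediaPlayer::start);        // start once buffering finishes
                mediaPlayer.prepareAsync();                                   // don't block the UI thread
            } catch (IOException e) {
                e.printStackTrace();
            }
        }

        @Override
        public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) { }

        @Override
        public void surfaceDestroyed(SurfaceHolder holder) {
            if (mediaPlayer != null) {
                mediaPlayer.release();
                mediaPlayer = null;
            }
        }
    }

Note that the platform MediaPlayer still buffers several seconds of an RTSP stream, so if real time matters, the vlc-android approach mentioned earlier in this thread tends to give lower latency.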
I am in search of some code that streams the live camera feed of an Android device to the VLC player on a computer.
That means the Android device's camera is on and capturing video, and when you start VLC player on the computer at the same time, that video is played there.
Any help would be great; please provide some code if possible.
Thanks.
http://code.google.com/p/spydroid-ipcamera/
Really, no need to provide details about it. It's open-source and it just works.
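If you do want a starting point in code: spydroid is built on the same author's libstreaming library, and a rough sketch of how that library is typically set up (class names and the default port 8086 as described in the fyhertz/libstreaming README; the layout and view ID are hypothetical) looks like this:

    import android.app.Activity;
    import android.content.Intent;
    import android.os.Bundle;
    import net.majorkernelpanic.streaming.SessionBuilder;
    import net.majorkernelpanic.streaming.gl.SurfaceView;
    import net.majorkernelpanic.streaming.rtsp.RtspServer;

    public class PhoneCameraServerActivity extends Activity {
        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            setContentView(R.layout.activity_camera_server);                // hypothetical layout
            SurfaceView preview = (SurfaceView) findViewById(R.id.preview); // libstreaming's own SurfaceView class

            // Describe the session: H.264 video from the camera, no audio.
            SessionBuilder.getInstance()
                    .setContext(getApplicationContext())
                    .setSurfaceView(preview)
                    .setAudioEncoder(SessionBuilder.AUDIO_NONE)
                    .setVideoEncoder(SessionBuilder.VIDEO_H264);

            // Start the RTSP server; VLC on the computer can then open
            // rtsp://<phone-ip>:8086 (8086 is libstreaming's default port).
            startService(new Intent(this, RtspServer.class));
        }
    }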