I'm trying to bring an Android application to Glass, one which lets the user stream video from their camera. Whenever I try to actually start the streaming, however, I get an error back from the MediaRecorder saying, "start failed: -12". Unfortunately there's not a lot of information about that error on Android, and even less on Glass. Any help?
I found a nice example using libstreaming (https://github.com/fyhertz/libstreaming) here:
Google Glass stream video to server
You could try it on your phone first.
I tried the simple RTSP server (example 1) as well as streaming through a Wowza server, which adds some delay.
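In case it helps, the basic libstreaming setup looks roughly like this. This is a sketch based on the library's README; `SessionBuilder` and its constants are from libstreaming itself, but exact names may differ across versions, so treat it as a starting point rather than a drop-in:

```java
// Sketch: configure a libstreaming session (Android).
// Assumes the libstreaming library linked above is on the classpath
// and mSurfaceView is a SurfaceView from your layout.
Session session = SessionBuilder.getInstance()
        .setContext(getApplicationContext())
        .setAudioEncoder(SessionBuilder.AUDIO_AAC)
        .setVideoEncoder(SessionBuilder.VIDEO_H264)
        .setSurfaceView(mSurfaceView)   // camera preview surface
        .build();
```

The library's `RtspServer`/`RtspClient` classes then handle the actual transport, so you avoid driving `MediaRecorder` directly, which is where the "start failed" errors come from.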
I have developed a screen-recording application. Now I want a live-streaming feature while recording the screen, with the live stream going to YouTube. From my research I only found how to live-stream using the camera, or how to simply open a live-streaming intent.
Can you guys guide me in the right direction? It'll be very helpful.
Thank you for your time
You need a media server for live streaming; please read here for details:
Android (publisher) --RTMP--> SRS/Nginx --RTMP/HLS--> Android player (or YouTube)
The media server (SRS or Nginx) plays much the same role as a YouTube-like platform. For the publisher or player side, you can use FFmpeg, OBS, or ijkplayer; for more information, please read this link.
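For the SRS/Nginx box in the pipeline above, a minimal nginx-rtmp-module configuration could look like the following sketch. The application name `live` and the HLS path are assumptions; adjust them to your setup:

```nginx
rtmp {
    server {
        listen 1935;            # default RTMP port
        application live {
            live on;            # accept live publishing from clients
            hls on;             # also repackage the stream as HLS
            hls_path /tmp/hls;  # segments served separately over HTTP
        }
    }
}
```

A publisher then pushes to rtmp://yourserver/live/streamName, and players pull either the same RTMP URL or the HLS playlist.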
I need to stream RTSP video from an IP camera on the local network to my Android app. It's easy to use a VideoView and play the URL, or a SurfaceView and play the stream on it with the native MediaPlayer. But when I stream that way, I get a 6-second delay while my phone buffers the video. From what I've read, there is no way to change MediaPlayer's buffer size. Yet I've seen several apps that stream video from my camera in almost real time. I've read a lot about this (I'm not the first to run into the problem) but didn't find any useful info.
Many thanks for any help!
I'm using vlc-android; it works well for playing my cameras' RTSP links:
https://github.com/mrmaffen/vlc-android-sdk#get-it-via-maven-central
The delay is about 1 second.
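Part of why vlc-android can get the delay down is that its buffering is configurable, unlike the stock MediaPlayer. These are standard VLC options (values are in milliseconds; 300 ms is an assumption, tune it for your network) that can be passed to the player on initialization:

```
--network-caching=300    buffer for network streams, in ms
--live-caching=300       buffer for live capture, in ms
--rtsp-tcp               force RTSP over TCP if UDP packets get dropped
```

Lower caching values reduce delay at the cost of more stutter on a lossy network.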
I want to integrate video broadcasting and streaming in my Android application through a Wowza server. I have tried many different things, like this demo from JavaCV and this one from AndroidHive.
But the only problem with the first one is that it uses the FLV file format to broadcast to the Wowza server. It uses FFmpegFrameRecorder to broadcast live video to the Wowza server (not VOD). To set the format of the video broadcast to the server, it uses the following method:
recorder.setFormat("flv");
So the main problem with this method is that it doesn't work when I use other formats, e.g.:
recorder.setFormat("mpeg"); //or something like mov, etc
The second one was appropriate and had exactly what I needed. But I am facing a weird scenario with it. It works perfectly fine (broadcasting video with audio) for local-network links like rtsp://192.168.1.58:1935/live/myStream, but it fails to broadcast to public links like rtsp://54.208.***.***:1935/live/myStream. The Wowza server shows the stream as playing, but I can neither see video nor hear audio from that link.
Please suggest a way to overcome this problem so that I can get video and audio at my end while broadcasting.
I have used this code to communicate between Wowza and Android for the video-broadcasting part, and completed it successfully. The link gives quite a good explanation of the topic and of the Wowza server configuration needed to make broadcasting from Android work. Note that setFormat("flv") is not an arbitrary choice: RTMP publishing carries FLV-packaged audio/video, so containers like "mpeg" or "mov" cannot work over that protocol.
Try the android-ffmpeg library. This will definitely help you!
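When a stream works for a LAN address but not a public one, the first thing to rule out is plain reachability of port 1935 (firewalls and cloud security groups often block it). A quick check like the one below tells you whether the TCP port is even open before you debug Wowza itself; `isReachable` is a hypothetical helper written here in plain Java for illustration:

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class PortCheck {
    /** Returns true if a TCP connection to host:port succeeds within timeoutMs. */
    public static boolean isReachable(String host, int port, int timeoutMs) {
        try (Socket socket = new Socket()) {
            socket.connect(new InetSocketAddress(host, port), timeoutMs);
            return true;
        } catch (IOException e) {
            return false; // refused, filtered, or timed out
        }
    }

    public static void main(String[] args) {
        // Example: check the Wowza streaming port before blaming the app code.
        System.out.println(isReachable("192.168.1.58", 1935, 2000));
    }
}
```

On Android, run this off the main thread (e.g. in an AsyncTask or background thread), since network calls on the UI thread throw NetworkOnMainThreadException.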
I am using the following code to stream the videos from my server on android phone:
Intent i = new Intent(Intent.ACTION_VIEW);
i.setData(Uri.parse(videoUrl)); // When I hit this URL in web browser on my PC, it works
startActivity(i);
When I run the code, it asks me to select a player. When I select a player, that player opens up and then nothing happens. I've even tried waiting for minutes, but the video never starts.
My question is, is it really as simple to just put the video on my webserver and on android run the above code to get that video streaming?
Do I need to make any changes on my server? Can anyone please help me with this? I cannot figure out whether the problem is on my server or the client side.
It was a problem with the VLC player I was using on my Android phone for testing. I downloaded MX Player and tried it with that. Everything is working fine now :-)
That means not every video player on Google Play is mature enough to take a URL and start streaming the video; you need a player that actually supports this feature.
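One thing that helps marginal players is handing them an explicit MIME type, i.e. calling intent.setDataAndType(Uri.parse(videoUrl), mimeType) instead of setData alone, so the chooser only offers apps that declare support for that type. The mapping can be derived from the file extension; the helper below is a hypothetical sketch in plain Java (on-device, Android's MimeTypeMap does the same job):

```java
public class VideoMime {
    /** Guesses a video MIME type from a URL's file extension; null if unknown. */
    public static String guessMime(String url) {
        String path = url.toLowerCase();
        int q = path.indexOf('?');                 // strip any query string
        if (q >= 0) path = path.substring(0, q);
        if (path.endsWith(".mp4"))  return "video/mp4";
        if (path.endsWith(".3gp"))  return "video/3gpp";
        if (path.endsWith(".webm")) return "video/webm";
        if (path.endsWith(".m3u8")) return "application/vnd.apple.mpegurl"; // HLS playlist
        return null;
    }

    public static void main(String[] args) {
        System.out.println(guessMime("http://example.com/clip.mp4")); // video/mp4
    }
}
```

With the type set, players that can't handle the container simply don't appear in the chooser, which turns "opens and then nothing happens" into an earlier, clearer failure.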
I'm looking for an example or a guide on how to create an app like TuneIn: I want my own list of stations, and when I click on one, a player opens and plays the stream. I basically use AAC+, and I also have Wowza server RTMP links, etc., but I need some guide or documentation that I can follow to get started.
Thanks.
I think you should do it by connecting to the RTMP address using NetConnection and then start playing the stream. I've never done it with AAC+, but I think (maybe I'm wrong) it's the same as with an MP3 stream.
Maybe you could start by reading this http://www.wowza.com/forums/content.php?217-Quick-Start-Guide or, more specifically, the Real Time Messaging Protocol (RTMP - Adobe Flash Player) section: http://www.wowza.com/forums/content.php?217#flashRTMP
Also, when I started experimenting with RTMP, I found very useful articles here: http://www.flashrealtime.com/?s=rtmp. Maybe that should be your starting point.
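Whichever client library you end up with, the Wowza RTMP links split into two parts: NetConnection.connect() takes the server-plus-application URL (e.g. rtmp://host:1935/live) and NetStream.play() takes the remaining stream name (e.g. myStream). The splitter below is a hypothetical helper in plain Java, just to make the division concrete:

```java
public class RtmpUrl {
    /** Splits "rtmp://host:port/app/streamName" into {connectUrl, streamName}. */
    public static String[] split(String url) {
        int schemeEnd = url.indexOf("://") + 3;
        int lastSlash = url.lastIndexOf('/');
        if (lastSlash <= schemeEnd) {
            throw new IllegalArgumentException("expected rtmp://host/app/stream");
        }
        return new String[] {
            url.substring(0, lastSlash),   // pass to NetConnection.connect()
            url.substring(lastSlash + 1)   // pass to NetStream.play()
        };
    }

    public static void main(String[] args) {
        String[] parts = split("rtmp://54.208.0.1:1935/live/myStream");
        System.out.println(parts[0] + " | " + parts[1]);
        // rtmp://54.208.0.1:1935/live | myStream
    }
}
```

Passing the full URL, stream name included, to connect() is a common reason a TuneIn-style player silently fails to start.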