My new surveillance camera just arrived, so I'm trying to write an app to live stream the video from it.
Since it came with basically no documentation, I installed the 'onvifer' Android app, which lets you browse the camera's capabilities. This app works fine: it gets the video, allows PTZ controls, etc. It reports the streaming URL as:
rtsp://192.1.0.193:554/mpeg4
I tested the stream in the VLC windows client, and it's able to stream video from that URL as well. This makes me comfortable that the network is working OK.
The camera states the feed will be 1920x1080; VLC confirms this.
The basic code in my activity:
VideoView videoView = (VideoView)this.findViewById(R.id.VideoView);
videoView.setVideoURI(Uri.parse("rtsp://192.1.0.193:554/mpeg4"));
videoView.requestFocus();
videoView.start();
I've also given the app INTERNET permissions in AndroidManifest.xml, disabled authentication on the camera, and am running on a real device (not the emulator).
When I run the app, LogCat shows this immediately:
setDataSource IOException happend :
java.io.FileNotFoundException: No content provider: rtsp://192.1.0.193:554/mpeg4
at android.content.ContentResolver.openTypedAssetFileDescriptor (ContentResolver.java).
About 15 seconds later, the app shows a "Can't play this video" modal dialog box and this is added to LogCat:
MediaPlayer error (100, 0)
AudioSystem AudioFlinger server died!
MediaPlayer error (100, 0)
VideoView Error: 100,0
I've googled everything I can think of, but haven't found anything useful.
Any thoughts?
A wild-ass guess based on your logcat and the RC=100: there is no SDP file, or no RTSP equivalent of the 'moov atom' block, required to negotiate the details of the stream/container/codec/format. You can get the AOSP code for MediaPlayer/VideoView and grep for the RC value in the source.
RTSP is gnarly to debug (note the tool links below) and is not assured to run inside a NAT'd network due to UDP issues. To get a better result, you may have to look into forcing your config to use a TCP data channel instead of UDP. Or it could be one of many other issues.
If you really want to investigate, some possible tools:
- Command line and a curl client to request your stream
- Android/Java RTSP session management package on Git
- Protocol dumps for CLI RTSP sessions to YouTube RTSP/SDP streams
To pursue the issue, you may need to get into the weeds with debug tools that track the details of the protocol negotiation that precedes the MediaPlayer actually starting to play the stream. That means learning the RFC and the protocol details.
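To save you the grep: in the framework sources, error 100 is MediaPlayer.MEDIA_ERROR_SERVER_DIED (the media server process restarted), which also matches the "AudioFlinger server died!" line in your logcat. A minimal sketch for catching it in your activity, using the standard VideoView/MediaPlayer API:

import android.media.MediaPlayer;
import android.util.Log;

videoView.setOnErrorListener(new MediaPlayer.OnErrorListener() {
    @Override
    public boolean onError(MediaPlayer mp, int what, int extra) {
        if (what == MediaPlayer.MEDIA_ERROR_SERVER_DIED) {
            // 100: the mediaserver process died while negotiating/playing the stream
            Log.e("CamStream", "media server died, extra=" + extra);
        }
        return true; // returning true suppresses the "Can't play this video" dialog
    }
});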
videoView.setVideoURI(Uri.parse("rtsp://192.1.0.193:554/mpeg4")); // setVideoURI takes a Uri, not a String
Try your app on another phone; you may find the problem is with the mobile device itself.
Try this known test stream:
rtsp://218.204.223.237:554/mobile/1/4C024DFE77DC717D/onnuvesj43xj7t26.sdp
and see whether your code has something wrong.
I need to stream audio from an external Bluetooth device and video from the camera to a Wowza server, so that I can then access the live stream through a web app.
I've been able to successfully send other streams to Wowza using the GoCoder library, but as far as I can tell, this library only sends streams that come from the device's camera and mic.
Does anyone have a good suggestion for implementing this?
In the GoCoder Android SDK, the setAudioSource method of WZAudioSource allows you to specify an audio input source other than the default. Here's the relevant API doc for this method:
public void setAudioSource(int audioSource)
Sets the actively configured input device for capturing audio.
Parameters:
audioSource - An identifier for the active audio source. Possible values are those listed at MediaRecorder.AudioSource. The default value is MediaRecorder.AudioSource.CAMCORDER. Note that setting this while audio is actively being captured will have no effect until a new capture session is started. Setting this to an invalid value will cause an error to occur at session begin.
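For example, to capture from a Bluetooth device instead of the built-in mic, you could route the input over Bluetooth SCO with the standard Android AudioManager API and then point the GoCoder source at the MIC input. A rough sketch only; how you obtain the WZAudioSource instance (here audioSource) depends on your broadcast setup, and SCO routing behavior varies by device:

import android.content.Context;
import android.media.AudioManager;
import android.media.MediaRecorder;

// In your Activity:
AudioManager am = (AudioManager) getSystemService(Context.AUDIO_SERVICE);
am.startBluetoothSco();      // requires the MODIFY_AUDIO_SETTINGS permission
am.setBluetoothScoOn(true);  // route mic input through the Bluetooth device

// audioSource is your WZAudioSource instance from the broadcast config:
audioSource.setAudioSource(MediaRecorder.AudioSource.MIC);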
Following this tutorial I have successfully created a GLSurfaceView that displays my local video in my Android app. I am using Pristine's gradle build scripts to use native code for webrtc. The web app works as expected in Chrome.
I have established a connection to my nodejs server via web sockets to join a pre-existing conversation. Kurento is being used to deal with rooms. I believe I need to create an SDP Offer to begin sending and receiving videos between peers at this point. (To begin I simply want the video from the Android device to appear on the web interface.)
However, if I create a PeerConnection, add my local media stream (created with PeerConnectionFactory.createLocalMediaStream), and then call createOffer(), it fails.
The SDPObserver that listens to my connection gets its onCreateFailure called with the message "CreateOffer called with invalid media streams."
Looking at the C code it appears that the streams do not have unique IDs (despite the fact I have only created one stream).
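For reference, the standard local-stream setup with this generation of the Java API looks roughly like this; the labels ("ARDAMS"/"ARDAMSv0", mirroring the AppRTC demo) are arbitrary but must be unique, and exact signatures vary between webrtc revisions:

// after PeerConnectionFactory.initializeAndroidGlobals(...) has succeeded:
PeerConnectionFactory factory = new PeerConnectionFactory();

VideoCapturer capturer =
        VideoCapturerAndroid.create(VideoCapturerAndroid.getNameOfFrontFacingDevice());
VideoSource videoSource = factory.createVideoSource(capturer, new MediaConstraints());

MediaStream stream = factory.createLocalMediaStream("ARDAMS");      // stream label
stream.addTrack(factory.createVideoTrack("ARDAMSv0", videoSource)); // track id

peerConnection.addStream(stream); // then call createOffer(sdpObserver, sdpConstraints)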
I've been trying to figure this out for a day now and don't seem to be making any progress. Any suggestions?
Thanks in advance!
We have an Android radio app which plays various live radio streams. It works pretty well with Google Chromecast as long as the stream is a "normal" MP3 stream. Playback on the Chromecast Styled Media Receiver is NOT working with SHOUTcast streams, e.g. http://46.105.118.14:13500.
After mRemoteMediaPlayer.load(...) I see result status code = 1 in my Android LogCat.
In Chrome debug console I see the following Load metadata error:
[673.080s] [cast.receiver.MediaManager] Load metadata error
cast_receiver.js:18
ib cast_receiver.js:18
gb.Gb cast_receiver.js:18
B.log cast_receiver.js:13
E cast_receiver.js:15
Z.pa cast_receiver.js:71
Eb cast_receiver.js:23
Cb cast_receiver.js:24
(anonymous function) cast_receiver.js:21
Is it possible to play SHOUTcast live streams with Google Chromecast (Styled Media Receiver or Custom Receiver)? If yes, can you give me some hints or point me to an example?
Append a /; after the port of the stream URL so that you get the raw stream data, e.g. http://46.105.118.14:13500/;
I use the LocalCast Android app with this trick to make my Chromecast play SHOUTcast radio.
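On the sender side that looks roughly like this with the Cast SDK v2 API (matching the mRemoteMediaPlayer.load(...) call above); mApiClient is assumed to be your connected GoogleApiClient, and audio/mpeg is an assumption for an MP3 SHOUTcast stream:

import com.google.android.gms.cast.MediaInfo;

MediaInfo mediaInfo = new MediaInfo.Builder("http://46.105.118.14:13500/;") // note the trailing /;
        .setContentType("audio/mpeg")
        .setStreamType(MediaInfo.STREAM_TYPE_LIVE)
        .build();
mRemoteMediaPlayer.load(mApiClient, mediaInfo, true); // true = autoplay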
I am using Android 4.1.2 on a Galaxy S3. Currently the Android MediaPlayer always tries the RTSP UDP (RTP/AVP/UDP) method to connect to an RTSP server.
If the MediaPlayer does not receive data on its UDP ports, it times out and then tries RTSP TCP interleaved (RTP/AVP/TCP). This works, but it introduces a delay of 10 seconds or so. I want to avoid this delay and force the MediaPlayer to always use RTSP TCP interleaved (RTP/AVP/TCP), for all or for specific URLs.
I tried the suggestion given here to send a 461 or 400 error response code to the SETUP request, but it seems the MediaPlayer does not care about the response; it sends the SETUP command for both tracks and then just hangs the connection.
How can I resolve this issue?
I'm using VLC instead of the native one.
Read the Live555 source code, please.
You can specify the Transport: RAW/RAW/UDP field in the SETUP request to choose which protocol to use.
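For illustration, a client that wants interleaved TCP announces it the same way, in the Transport header of its SETUP request (RFC 2326); the URL here is hypothetical:

SETUP rtsp://camera.example.com:554/stream/track1 RTSP/1.0
CSeq: 3
Transport: RTP/AVP/TCP;unicast;interleaved=0-1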
I might be wrong, but AFAIK the Android MediaPlayer does not support RTP over TCP.
What I want is to broadcast video from an Android camera to remote locations, so that anyone can watch it on their mobile or on a website.
I've been successful in unicasting it to the VLC player on my PC.
I tried the red5 server, Adobe Media Server, and an ffmpeg server, but all in vain.
Each of them was only able to broadcast video from a prerecorded file, not from a live stream.
Can anyone suggest what I should do?
I read (I think it was even on Stack Overflow) that you can provide the MediaRecorder with a file handle of a TCP connection. Then you can listen on that connection, read the data, packetize it, and resend it as an RTSP/RTP stream.
If I happen to find the original post, I'll reference it here.
EDIT:
The original post was: Streaming Video From Android
And the part about the file descriptor is from: http://www.mattakis.com/blog/kisg/20090708/broadcasting-video-with-android-without-writing-to-the-file-system
Just in case, I quote the corresponding example from the blog:
import java.net.InetAddress;
import java.net.Socket;
import android.media.MediaRecorder;
import android.os.ParcelFileDescriptor;

String hostname = "your.host.name";
int port = 1234;
Socket socket = new Socket(InetAddress.getByName(hostname), port);
ParcelFileDescriptor pfd = ParcelFileDescriptor.fromSocket(socket);
MediaRecorder recorder = new MediaRecorder();
// Additional MediaRecorder setup (video/audio source, output format, etc.) omitted;
// it must happen before setOutputFile().
recorder.setOutputFile(pfd.getFileDescriptor());
recorder.prepare();
recorder.start();
However, this only sends the video file data over the wire. You can save it and play it back afterwards, but as mentioned, it is not a stream yet.
UPDATE:
You do not even have to use a TCP socket for the first step. I just tripped over LocalSocket (1), which also gets you a file handle to feed the MediaRecorder. These local sockets are "AF_LOCAL/UNIX domain stream sockets". See http://developer.android.com/reference/android/net/LocalSocket.html
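The plumbing would look roughly like this (untested, and the socket name is my own choice):

import java.io.InputStream;
import android.media.MediaRecorder;
import android.net.LocalServerSocket;
import android.net.LocalSocket;
import android.net.LocalSocketAddress;

LocalServerSocket server = new LocalServerSocket("camera-stream");
LocalSocket sender = new LocalSocket();
sender.connect(new LocalSocketAddress("camera-stream"));
LocalSocket receiver = server.accept();

MediaRecorder recorder = new MediaRecorder();
// ... source/format setup omitted, as in the TCP example above ...
recorder.setOutputFile(sender.getFileDescriptor());
recorder.prepare();
recorder.start();

InputStream videoData = receiver.getInputStream(); // read, packetize, and send as RTP from here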
I have not tried all the above myself as of today, but will pretty soon. So maybe I can be of more help in the near future :)
(1) LocalSocket is not usable on newer Android versions for security reasons! See Update from 2015-11-25.
UPDATE 2:
Just saw "OUTPUT_FORMAT_RTP_AVP" in the Android sources. But it is hidden :( So I guess it will be available in future API versions of Android.
https://github.com/android/platform_frameworks_base/blob/master/media/java/android/media/MediaRecorder.java Line 219:
public static final int OUTPUT_FORMAT_RTP_AVP = 7;
I have not tried tricking the @hide annotation by just providing a hardcoded 7... If anybody does, please leave a comment here!
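For clarity, the whole experiment would just be (untested, and the constant is hidden, so it may break on any release):

MediaRecorder recorder = new MediaRecorder();
// ... source setup omitted ...
recorder.setOutputFormat(7); // 7 == OUTPUT_FORMAT_RTP_AVP in the sources linked above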
UPDATE 2015-11-25
I just ran into libstreaming: https://github.com/fyhertz/libstreaming
I did not look into it too deeply, but it seems there is a lot to be learned about streaming from Android from this project (if not simply by using it). I read there that the LocalSocket solution no longer works on newer Android versions :( But they present an alternative: ParcelFileDescriptor.