I am using Android 4.1.2 on a Galaxy S3. Currently the Android MediaPlayer always tries the RTSP UDP (RTP/AVP/UDP) method first when connecting to an RTSP server.
If the MediaPlayer does not receive data on its UDP ports, it times out and then tries RTSP TCP interleaved (RTP/AVP/TCP). This works, but it introduces a delay of 10 seconds or so. I want to avoid this delay and force the MediaPlayer to always use RTSP TCP interleaved (RTP/AVP/TCP), either for all URLs or for specific ones.
I tried the suggestion given here of sending a 461 or 400 error response code to the SETUP request. But it seems the MediaPlayer does not care about the response; it sends the SETUP command for both tracks and then just hangs the connection.
How can I resolve this issue?
I'm using VLC instead of the native MediaPlayer.
Read the live555 source code, please.
You can specify the Transport: RAW/RAW/UDP field in the SETUP request to choose which protocol to use.
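For reference, this is roughly what a SETUP request asking for TCP-interleaved transport looks like on the wire (per RFC 2326; the URL and CSeq value are placeholders):

SETUP rtsp://example.com/stream/track1 RTSP/1.0
CSeq: 2
Transport: RTP/AVP/TCP;unicast;interleaved=0-1

The server echoes the transport it accepted in its reply; whether the stock MediaPlayer can be made to send this without first timing out on UDP is exactly the problem described above.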
I might be wrong, but AFAIK the Android MediaPlayer does not support RTP over TCP.
I am currently using an app that uses the method exemplified in libstreaming-example-1 (libstreaming) to stream the camera from an Android device to an Ubuntu server (using OpenCV and libVLC). This way, my Android device acts as a server and waits for the client (the Ubuntu server) to send the play signal over RTSP, and then starts the streaming over UDP.
The problem I am facing with the streaming is that I am getting a delay of approximately 1.1s during the transmission and I want to get it down to 150ms maximum.
I tried to implement libstreaming-example-2 from libstreaming-examples, but I couldn't: I don't have access to detailed documentation and I couldn't figure out how to get the right signal to display the stream on my server. Other than that, I have been trying to see what I can do with example 1 to get the latency down, but nothing new so far.
PS: I am using a LAN, so network/bandwidth is not the problem.
Here come the questions:
Which way is the best to get the lowest latency possible while streaming video from the camera?
How can I implement example-2?
Is the example-2 method of streaming better for getting the latency down to 150ms?
Is this latency related to the decompression of the video on the server side? (No frames are dropped, FPS: 30)
Thank you!
I had the same issue as you, with a huge stream delay (around 1.5-1.6 sec).
My setup is an Android device which streams its camera over RTSP using libstreaming; the receiving side is an Android device using libVLC as the media player. I found a solution that decreases the delay to 250-300 ms. It was achieved by setting up libVLC with the following parameters:
mLibvlc = new LibVLC();
mLibvlc.setVout(LibVLC.VOUT_ANDROID_WINDOW);
mLibvlc.setDevHardwareDecoder(LibVLC.DEV_HW_DECODER_AUTOMATIC);
mLibvlc.setHardwareAcceleration(LibVLC.HW_ACCELERATION_DISABLED);
mLibvlc.setNetworkCaching(150); // network buffer in ms; the main latency knob
mLibvlc.setFrameSkip(true);     // drop late frames instead of queueing them
mLibvlc.setChroma("YV12");
restartPlayer();

private void restartPlayer() {
    // the settings above only take effect after libVLC is re-initialised
    if (mLibvlc != null) {
        try {
            mLibvlc.destroy();
            mLibvlc.init(this);
        } catch (LibVlcException lve) {
            throw new IllegalStateException("LibVLC initialisation failed: " + LibVlcUtil.getErrorMsg());
        }
    }
}
You can play with setNetworkCaching(int networkCaching) to tune the delay a bit.
Please let me know if this was helpful for you, or if you have found a better solution with this or another environment.
My new surveillance camera just arrived, so I'm trying to write an app to live stream the video from it.
Since it came with basically no documentation, I installed the 'onvifer' Android app, which allows you to browse the camera's capabilities. This app works fine - it gets the video and allows PTZ controls, etc. It reports the streaming URL as:
rtsp://192.1.0.193:554/mpeg4
I tested the stream in the VLC windows client, and it's able to stream video from that URL as well. This makes me comfortable that the network is working OK.
The camera states the feed will be 1920x1080; VLC confirms this.
The basic code in my activity:
VideoView videoView = (VideoView)this.findViewById(R.id.VideoView);
videoView.setVideoURI(Uri.parse("rtsp://192.1.0.193:554/mpeg4"));
videoView.requestFocus();
videoView.start();
I've also given the app INTERNET permissions in AndroidManifest.xml, disabled authentication on the camera, and am running on a real device (not the emulator).
When I run the app, LogCat shows this immediately:
setDataSource IOException happend :
java.io.FileNotFoundException: No content provider: rtsp://192.1.0.193:554/mpeg4
at android.content.ContentResolver.openTypedAssetFileDescriptor (ContentResolver.java).
About 15 seconds later, the app shows a "Can't play this video" modal dialog box and this is added to LogCat:
MediaPlayer error (100, 0)
AudioSystem AudioFlinger server died!
MediaPlayer error (100, 0)
VideoView Error: 100,0
I've googled everything I can think of, but haven't found anything useful.
Any thoughts?
A wild-ass guess based on your logcat and the RC=100: no SDP file, or no RTSP equivalent of the 'moov atom' block required to negotiate details of the stream / container / codec / format. You can get the AOSP code for MediaPlayer/VideoView and grep for the RC value in the source.
RTSP is gnarly to debug (note the tools links) and not assured to run inside a NAT'd network due to UDP issues. So, to get a better result, you may have to look into forcing your config to do the data channel over TCP and not UDP. Or it could be other issues, of which there are many.
If you really want to investigate, some possible tools below:
Use command line and CURL client to request your stream:
Android - Java RTSP Session Mgmt package on Git
Protocol dumps for CLI RTSP sessions to Youtube RTSP/SDP streams
To pursue the issue, you may need to get into the weeds with debug tools that track details of the protocol negotiation that precedes the MediaPlayer actually starting to play the stream. That would include learning the RFC and the protocol details.
videoView.setVideoURI(Uri.parse("rtsp://192.1.0.193:554/mpeg4"));
Try your app on another phone.
You may find the problem is with the mobile device itself.
Try this test path:
rtsp://218.204.223.237:554/mobile/1/4C024DFE77DC717D/onnuvesj43xj7t26.sdp
to see whether your own code has something wrong.
I want to create an Android application that is capable of receiving an audio stream. I thought of using the A2DP profile, but it seems Android doesn't support the A2DP sink role. It looks like there are a lot of people searching for a solution to this problem. But what about receiving an ordinary bit stream, and then converting the data into audio in the application? I was thinking of receiving a PCM or MP3 data stream via RFCOMM (the SPP Bluetooth profile), and then playing it using AudioTrack.
First, how do I receive a bit stream on my Android phone via the RFCOMM? And is it possible to receive a bit stream via RFCOMM as a PCM or Mp3 stream?
Second, if it isn't possible to receive a bit stream via RFCOMM as a PCM or Mp3 stream, how do I convert the received bit stream into audio?
Third, how do I convert the received data into audio AND play the audio simultaneously, in "real time"? Can I just use onDataReceived?
To be clear, I'm not interested in using the A2DP profile! I want to stream the data via RFCOMM (the SPP Bluetooth profile). The received data stream will be PCM or MP3. I thought of writing my own app, but if anyone knows of an app that solves this I'd be glad to hear about it! I'm using Android 2.3 Gingerbread.
/Johnny
No. Trying to write an Android application that handles this will not be the solution. At least if you want to use A2DP Sink role.
The fact is that Android, as you mentioned, does not implement the API calls to BlueZ (the Bluetooth stack Android used until Jelly Bean 4.1) regarding A2DP sink capabilities. You have to implement them yourself. I will try to guide you, as I was also interested in doing this myself in the recent past.
Your Bluetooth-enabled Android device advertises itself as an A2DP source device by default. You have to change this first, so nearby devices may recognize your device as a sink. To do this, you must modify the audio.conf file (usually located in /etc/bluetooth/) and make sure the Enable key exists with the value Source attached to it, so you get something like:
Enable=Source
Reboot, nearby devices should now recognize your device as an A2DP sink.
Now you will have to interact with BlueZ to react appropriately when an A2DP source device will start to stream audio to your phone.
Android and BlueZ are talking to each other via D-BUS. In fact, Android connects to the DBUS_SYSTEM channel and listens to every BlueZ advertisement, such as events, file descriptors ...
I remember successfully binding a native application to this D-Bus channel myself and getting access to the various events BlueZ was posting. This is relatively easy to achieve using the BlueZ API, available here, as a reference. If you go this way, you will have to build a native application (C/C++) and compile it for your platform. You should be able to do this using the Android NDK.
If you find it difficult to use D-BUS, you can try this Java library I just found that handles the communication to D-BUS for you : http://jbluez.sourceforge.net/. I have never used it but it is worth a try in my opinion.
What you really have to do is find out when an A2DP source device is paired to your phone and when it starts to stream music. You can retrieve these events through D-Bus. Once somebody tries to stream music, you need to tell BlueZ that your native application is going to handle it. There is a pretty good document that explains the flow of events you should handle to do this. This document is accessible here. The part you're interested in starts on page 7. The sink application in the given example is PulseAudio, but it could be your application as well.
BlueZ will forward you a UNIX socket when you call the org.bluez.MediaTransport.Acquire method. Reading from this socket will give you the data that is currently being streamed by the remote device. But I remember being told by a guy working on the BlueZ stack that the data read from this socket is not pure PCM audio, but encoded audio content instead. The data is generally encoded in a format called SBC (Low Complexity Subband Coding).
Decoding SBC is not very difficult, you can find a decoder right here.
The ultimate step would be to forward the PCM audio to your speakers.
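A minimal sketch of that last step, assuming the SBC decoder hands you 16-bit PCM in a byte array (the sample rate and channel layout below are assumptions; they depend on what the source device negotiated):

int sampleRate = 44100; // assumption: read the real rate from the SBC stream parameters
int bufSize = AudioTrack.getMinBufferSize(sampleRate,
        AudioFormat.CHANNEL_OUT_STEREO, AudioFormat.ENCODING_PCM_16BIT);
AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRate,
        AudioFormat.CHANNEL_OUT_STEREO, AudioFormat.ENCODING_PCM_16BIT,
        bufSize, AudioTrack.MODE_STREAM);
track.play();
// pcm is a hypothetical byte[] filled by your SBC decoder
track.write(pcm, 0, pcm.length); // call this in a loop as decoded buffers arrive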
To prevent you from getting stuck, and in order to test your application more easily, you can use the dbus-send binary that should be available on your Android system. It is located in /system/bin.
Quick tests you can make before doing anything of the above might be :
Get Devices list :
dbus-send --system --dest=org.bluez --print-reply / org.bluez.Manager.GetProperties
This returns an array of adapters with their paths. Once you have these path(s) you can retrieve the list of all the bluetooth devices paired with your adapter(s).
Get paired devices :
dbus-send --system --print-reply --dest=org.bluez /org/bluez/{pid}/hci0 org.bluez.Adapter.GetProperties
This gives you the list of paired devices within the Devices array field.
Once you have the list of devices paired to your Bluetooth Adapter, you can know if it is connected to the AudioSource interface.
Get the devices connected to the AudioSource interface :
dbus-send --system --print-reply --dest=org.bluez /org/bluez/{pid}/hci0/dev_XX_XX_XX_XX_XX_XX org.bluez.AudioSource.GetProperties
Hope this helps.
Another workaround is using the Hands-Free Profile.
In Android, BluetoothHeadset handles that.
Wait until the status changes to BluetoothHeadset.STATE_AUDIO_CONNECTED;
then you can record audio from the Bluetooth headset.
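A rough sketch of that wait, assuming the headset is already paired and the SCO audio link is brought up elsewhere (for example via AudioManager.startBluetoothSco()):

BroadcastReceiver receiver = new BroadcastReceiver() {
    @Override
    public void onReceive(Context context, Intent intent) {
        int state = intent.getIntExtra(BluetoothProfile.EXTRA_STATE,
                BluetoothHeadset.STATE_AUDIO_DISCONNECTED);
        if (state == BluetoothHeadset.STATE_AUDIO_CONNECTED) {
            startRecording(); // hypothetical helper wrapping the recorder code below
        }
    }
};
registerReceiver(receiver, new IntentFilter(BluetoothHeadset.ACTION_AUDIO_STATE_CHANGED));

Once the audio link is up, the recording itself looks like this: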
mMediaRecorder = new MediaRecorder();
mMediaRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
mMediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
mMediaRecorder.setOutputFile(mFilename);
mMediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
try {
    mMediaRecorder.prepare();
} catch (IllegalStateException e) {
    // prepare() called before the recorder was fully configured
    e.printStackTrace();
} catch (IOException e) {
    // the output file could not be opened
    e.printStackTrace();
}
mMediaRecorder.start();
[Irrelevant but works] This hack serves only MP3 streaming via a WiFi hotspot (I use it in my car, which has only an AUX input):
Install the app AirSong,
Turn on wifi hotspot,
Connect the other device to that hotspot,
Access 192.168.43.1:8088 from the device's browser and you are on.
(Wondering why 192.168.43.1? Because that's the default gateway of any device connected to an Android hotspot.)
audio.conf seems to be missing in Android 4.2.2?
To receive a PCM audio stream via RFCOMM, you can use the code flow explained in "Reading Audio file in C and forwarding over bluetooth to play in Android Audio track" as a hint, with one change: change the frequency used while initializing from 44100 to 22050:
AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, 22050, AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_8BIT, 10000, AudioTrack.MODE_STREAM);
Note: this streaming still contains some noise, but your "receiving a PCM data stream via the RFCOMM (SPP Bluetooth profile), and then playing it using AudioTrack" will work.
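Putting the pieces together, a minimal receive-and-play loop might look like the sketch below. Assumptions: device is an already-paired BluetoothDevice obtained elsewhere, the UUID is the standard SPP UUID, and exception handling is omitted:

UUID sppUuid = UUID.fromString("00001101-0000-1000-8000-00805F9B34FB"); // standard SPP UUID
BluetoothSocket socket = device.createRfcommSocketToServiceRecord(sppUuid);
socket.connect();
InputStream in = socket.getInputStream();

AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, 22050,
        AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_8BIT,
        10000, AudioTrack.MODE_STREAM);
track.play();

byte[] buf = new byte[1024];
int n;
while ((n = in.read(buf)) > 0) {
    track.write(buf, 0, n); // raw PCM bytes go straight into the track
}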
I have an audio streaming app which runs a local proxy server. The local proxy server makes an HTTP connection to an internet streaming source, and gets and buffers the streaming data locally.
Then, inside in the app, I use MediaPlayer to connect to the local proxy server, using the method
mediaPlayer.setDataSource(...); // the url of the local proxy server
Everything was fine (with plenty of Android devices and different OS versions - 1.5...4.0), until the Nexus 7 release.
In Nexus 7, the media player refuses to play the source from the local proxy server.
When I took a look at the logs, seems like the MediaPlayer uses range requests internally.
My local proxy server doesn't handle that. It returns HTTP/1.0 200 OK and the data.
However, the media player doesn't like that and throws an exception:
Caused by: libcore.io.ErrnoException
?:??: W/?(?): [ 07-18 00:08:35.333 4962: 5149 E/radiobee ]
?:??: W/?(?): : sendto failed: ECONNRESET (Connection reset by peer)
?:??: W/?(?): at libcore.io.Posix.sendtoBytes(Native Method)
?:??: W/?(?): at libcore.io.Posix.sendto(Posix.java:146)
?:??: W/?(?): at libcore.io.BlockGuardOs.sendto(BlockGuardOs.java:
?:??: W/?(?): at libcore.io.IoBridge.sendto(IoBridge.java:473)
We requested a content range, but server didn't support that. (responded with 200)
According to the HTTP specs, if the server responds with HTTP/1.0 instead of 1.1, the client must not fire a range request (1.0 doesn't support that anyway).
Also, if the server doesn't support range requests, it should be fine for it to respond with 200 OK (and this is what I'm doing), but the MediaPlayer implementation on the Nexus 7 doesn't like that.
I took a look at this thread:
HTTP: How should I respond to "Range: bytes=" when Range is unsupported?
where they claim that a response with 200 OK must be good enough, but unfortunately it doesn't help.
I'm not sure if this is a problem with Jelly Bean, or a problem with Nexus 7 implementation specifically, but it's still a problem for me which I have to resolve.
Again, there are NO range requests on plenty of other Android devices using the same app. For some reason these range requests are happening now on the Nexus 7. (They may happen on other Android devices as well, but it has never happened to me so far.)
Any possible way to disable the range requests for MediaPlayer?
If there are none, can anybody suggest a quick fix for my proxy server logic (what exactly it has to return, if it receive this range request?), without changing my other logic, if possible?
It seems like maybe I have to return something like "HTTP/1.0 206 OK\r\nPartial Content\r\n\r\n", but there should probably be some value at the end of the Partial Content part - I'm not sure what it should be.
Your help would be appreciated.
Thanks..
We've finally solved this in a clean way. In our case we have full control over the streaming server, but I guess you could do this with a local proxy as well. Since Build.VERSION_CODES.ICE_CREAM_SANDWICH it has been possible to set headers on the MediaPlayer object. As our app enables the user to seek inside the audio stream, we've implemented the Range header on the client. If the server responds with the proper headers, the MediaPlayer will not make multiple requests for the stream.
That's what our server headers look like now for the android client:
Content-Type: audio/mpeg
Accept-Ranges: bytes
Content-Length: XXXX
Content-Range: bytes XXX-XXX/XXX
Status: 206
The important part is the 206 status code (partial content). If you don't send this header, the android client will try to re-request the source, no matter what.
If your player doesn't allow seeking in the stream, you could simply always set the Range header to 0-<some arbitrarily large number>.
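On the client side, setting that header looks roughly like this (a sketch; the local URL is a placeholder, and setDataSource(Context, Uri, Map) is available from ICE_CREAM_SANDWICH on):

Map<String, String> headers = new HashMap<String, String>();
headers.put("Range", "bytes=0-999999999"); // arbitrarily large end if you never seek
MediaPlayer mp = new MediaPlayer();
try {
    mp.setDataSource(context, Uri.parse("http://127.0.0.1:8080/stream"), headers);
    mp.prepareAsync();
} catch (IOException e) {
    e.printStackTrace();
}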
I work on a large scale audio streaming app (that uses a local host HTTP proxy to stream audio to MediaPlayer) and I ran into this issue as soon as I got a JellyBean device in my hands at Google I/O 2012.
When quality testing our application on different devices (and with information received from our automated crash logs and user submitted logs), we noticed that certain MediaPlayer implementations behaved in what I would call an erratic (and sometimes downright psychotic) manner.
Without going into too much detail, this is what I saw: some implementations would make multiple requests (sometimes 5+) for the same URL. These requests were all slightly different from each other in that each one was for a different byte range (usually for the first or last 128 bytes). The conclusion was that the MediaPlayer was trying to find embedded metadata; after some point it would give up and just make a regular non-range request.
That is not what the stock JellyBean MediaPlayer implementation is doing; it is just an example of the wackiness and general fragmentation of the media framework on Android. However, the solution to the above situation was also the solution to the JellyBean problem, and that is:
Have your local proxy respond with chunked encoding
This replaces the Content-Length header with a Transfer-Encoding: chunked header. This means the requesting client cannot know the total length of the resource and thus cannot make a range request; it just has to deal with the chunks as they are received.
Like I said, this is a hack, but it works. It is not without its side effects: one of them is that the media player's buffer progress will be incorrect, since it doesn't know the length of the audio. To get around that you will have to use your own buffering computation (you are streaming from somewhere through your proxy to the MediaPlayer, right? So you will know the total length).
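For illustration, the proxy's chunked response could be written like this (a simplified sketch: clientSocket is the connection accepted from the MediaPlayer, upstream is the InputStream from the remote source, and IOException handling is omitted; note that chunked encoding is an HTTP/1.1 feature, so the status line must say HTTP/1.1):

OutputStream out = clientSocket.getOutputStream();
out.write(("HTTP/1.1 200 OK\r\n"
        + "Content-Type: audio/mpeg\r\n"
        + "Transfer-Encoding: chunked\r\n\r\n").getBytes());

byte[] buf = new byte[4096];
int n;
while ((n = upstream.read(buf)) > 0) {
    out.write((Integer.toHexString(n) + "\r\n").getBytes()); // chunk size in hex
    out.write(buf, 0, n);                                    // chunk payload
    out.write("\r\n".getBytes());
}
out.write("0\r\n\r\n".getBytes()); // terminating zero-length chunk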
When you seek or skip, or the connection is lost and the MediaPlayer keeps reconnecting to the proxy server, you must send this response with status 206 after you get the request and the range (an int) from the client:
String headers = "HTTP/1.0 206 Partial Content\r\n";
headers += "Content-Type: audio/mpeg\r\n";
headers += "Accept-Ranges: bytes\r\n";
headers += "Content-Length: " + (fileSize - range) + "\r\n";
headers += "Content-Range: bytes " + range + "-" + (fileSize - 1) + "/" + fileSize + "\r\n"; // end byte is inclusive
headers += "\r\n";
I recently compiled ffmpeg and live555 for Android and built my own media client wrapper. The whole system works perfectly on all other systems (Windows and Linux), but not on Android: no UDP packets are ever received. The RTSP communication, which uses a TCP connection, works fine. The session starts successfully and keeps running on the server. After searching similar topics, it seems that I first have to acquire a multicast permission for wifi. So I did:
- put permissions in AndroidManifest.xml
<uses-permission android:name="android.permission.CHANGE_NETWORK_STATE""/>
<uses-permission android:name="android.permission.INTERNET"/>
<uses-permission android:name="android.permission.CHANGE_WIFI_MULTICAST_STATE"/>
- put the following Java code in the Activity's onCreate()
WifiManager wm = (WifiManager) getSystemService(Context.WIFI_SERVICE);
if (wm != null) {
    mMCLock = wm.createMulticastLock(TAG);
    mMCLock.acquire();
}
But it still doesn't work; the results are all the same in the emulator, on a Galaxy S2 phone and on a Galaxy Tab 10.1. Even if I deactivate the live555 module and just use ffmpeg (ffmpeg also has a built-in RTSP client, but it is not as stable as live555, which is why I ported live555 to Android), the results are the same: RTSP OK, RTP not, where RTP uses UDP as the underlying carrier.
In DDMS an error is registered:
Address Family not supported by protocol
I think the problem is that the UDP port is still blocked. Maybe acquiring the multicast lock in Java is not enough for native code running in user space on Android.
Does anyone have an idea?
Steven
The UDP problem in my RTSP client has been solved; it had nothing to do with permissions or the multicast lock. It is a bug inside the Android STL implementation, shipped in both android-ndk-r7 and android-ndk-r8. Anyone who wants to use gnu-libstdc++.so has to keep in mind: don't use string, especially string::c_str(); it leaves a dangling pointer on your stack and damages everything. After I threw out everything to do with the STL, everything works fine, TCP and UDP.
A little off-topic: inside live555 there are at least 20 bugs, and the most fatal one is this: they use non-blocking RTP over TCP, so most packets are lost before they ever reach the network interface, and on the RTSP client side the RTP/TCP socket will never receive those lost packets; the RTSP session then enters an endless receiving loop and hangs.
I faced the same problem.
I think in your bind you are binding to a specific IP; use htonl(INADDR_ANY) as s_addr instead.
Not sure whether this helps your cause, but it seemed to solve my problem.