What I want is to broadcast Android camera video to remote locations, so that anyone can watch it on their mobile device or a website.
I've been successful in unicasting it to the VLC player on my PC.
I tried the Red5 server, Adobe Media Server, and FFmpeg's server, but all in vain.
Each of them was only able to broadcast video from a prerecorded file, but not from a live stream.
Can anyone suggest what I should do?
I read (I think it was even on Stack Overflow) that you can provide the MediaRecorder with a FileDescriptor of a TCP connection. Then you can listen on that connection, read the data, packetize it, and resend it as an RTSP/RTP stream.
If I happen to find the original post, I'll reference it here.
EDIT:
The original Post was: Streaming Video From Android
And the part about the FileDescriptor is from: http://www.mattakis.com/blog/kisg/20090708/broadcasting-video-with-android-without-writing-to-the-file-system
Just in case, here is the relevant example from the blog:
String hostname = "your.host.name";
int port = 1234;
Socket socket = new Socket(InetAddress.getByName(hostname), port);
ParcelFileDescriptor pfd = ParcelFileDescriptor.fromSocket(socket);
MediaRecorder recorder = new MediaRecorder(); // Additional MediaRecorder setup (output format ... etc.) omitted
recorder.setOutputFile(pfd.getFileDescriptor());
recorder.prepare();
recorder.start();
However, this only sends the raw video file data over the wire. You can save it and then play it back, but as mentioned, it is not a stream yet.
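For the receiving/"listen to that connection" part, a minimal sketch of what I have in mind could look like this (untested; port, file name and buffer size are just examples, and the RTSP/RTP packetizing would replace the file write):

// Minimal receiving-side sketch: accept the MediaRecorder's TCP connection
// and dump the incoming bytes to a file (needs java.net.* and java.io.* imports)
ServerSocket server = new ServerSocket(1234);
Socket client = server.accept();
InputStream in = client.getInputStream();
FileOutputStream out = new FileOutputStream("received.mp4");
byte[] buf = new byte[4096];
int len;
while ((len = in.read(buf)) > 0) {
    out.write(buf, 0, len);
}
out.close();
client.close();
server.close();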
UPDATE:
You do not even have to use a TCP socket for the first step. I just tripped over "LocalSocket"(1), which also gets you a FileDescriptor to feed the MediaRecorder. Those local sockets are "AF_LOCAL/UNIX domain stream socket"s. See http://developer.android.com/reference/android/net/LocalSocket.html
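A minimal sketch of that LocalSocket idea (untested by me; the socket name is arbitrary and, as above, most of the MediaRecorder setup is omitted):

// Receiving side: create a named local server socket (name is arbitrary)
LocalServerSocket server = new LocalServerSocket("camera-stream");

// Sending side: connect and hand the FileDescriptor to the MediaRecorder
LocalSocket sender = new LocalSocket();
sender.connect(new LocalSocketAddress("camera-stream"));
MediaRecorder recorder = new MediaRecorder();
// ... source, output format, encoder setup omitted ...
recorder.setOutputFile(sender.getFileDescriptor());
recorder.prepare();
recorder.start();

// Receiving side again: accept() returns once the sender has connected;
// the recorded data can then be read from the InputStream (usually on its own thread)
LocalSocket receiver = server.accept();
InputStream in = receiver.getInputStream();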
I have not tried all the above myself as of today, but will pretty soon. So maybe I can be of more help in the near future :)
(1) LocalSocket is not usable on newer Android versions for security reasons! See Update from 2015-11-25.
UPDATE 2:
I just saw "OUTPUT_FORMAT_RTP_AVP" in the Android sources. But it is hidden :( So I guess it will become available in future Android API versions.
https://github.com/android/platform_frameworks_base/blob/master/media/java/android/media/MediaRecorder.java Line 219:
public static final int OUTPUT_FORMAT_RTP_AVP = 7;
I have not tried working around the @hide annotation by simply providing a hardcoded 7 ... If anybody does, please leave a comment here!
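For reference, the untested idea would be nothing more than this (purely speculative, since the constant is hidden and the behaviour is undocumented):

// Speculative: 7 is the hidden OUTPUT_FORMAT_RTP_AVP value from the AOSP sources
recorder.setOutputFormat(7);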
UPDATE 2015-11-25
I just ran into libstreaming: https://github.com/fyhertz/libstreaming
I did not look into it too deeply, but it seems there is a lot to be learned about streaming from Android from this project (if not by simply using it). I read there that the LocalSocket solution is invalid for newer Android versions :( But they present an alternative: ParcelFileDescriptor.
Related
I am currently using an app that uses the method exemplified on libstreaming-example-1 (libstreaming) to stream the camera from an Android Device to an Ubuntu Server (using openCV and libVLC). This way, my Android device acts like a Server and waits for the Client (Ubuntu Server) to send the play signal over RTSP and then start the streaming over UDP.
The problem I am facing with the streaming is that I am getting a delay of approximately 1.1s during the transmission and I want to get it down to 150ms maximum.
I tried to implement libstreaming-example-2 from libstreaming-examples, but I couldn't: there is no detailed documentation available, and I couldn't figure out how to send the right signal to display the stream on my server. Other than that, I was trying to see what I could do with example 1 to bring the latency down, but nothing new so far.
PS: I am using a LAN, so network/bandwidth is not the problem.
Here come the questions:
Which is the best way to get the lowest possible latency while streaming video from the camera?
How can I implement example-2?
Is the example-2 method of streaming better for getting the latency down to 150 ms?
Is this latency related to the decompression of the video on the server side? (No frames are dropped; FPS: 30)
Thank you!
I had the same issue as you, with a huge stream delay (around 1.5-1.6 sec).
My setup is an Android device which streams its camera over RTSP using libstreaming; the receiving side is an Android device using libVLC as the media player. I found a solution that decreases the delay to 250-300 ms. It was achieved by setting up libVLC with the following parameters.
mLibvlc = new LibVLC();
mLibvlc.setVout(LibVLC.VOUT_ANDROID_WINDOW);
mLibvlc.setDevHardwareDecoder(LibVLC.DEV_HW_DECODER_AUTOMATIC);
mLibvlc.setHardwareAcceleration(LibVLC.HW_ACCELERATION_DISABLED);
mLibvlc.setNetworkCaching(150);
mLibvlc.setFrameSkip(true);
mLibvlc.setChroma("YV12");
restartPlayer();
private void restartPlayer() {
    if (mLibvlc != null) {
        try {
            mLibvlc.destroy();
            mLibvlc.init(this);
        } catch (LibVlcException lve) {
            throw new IllegalStateException("LibVLC initialisation failed: " + LibVlcUtil.getErrorMsg());
        }
    }
}
You can play with setNetworkCaching(int networkCaching) to tune the delay a bit.
Please let me know whether this was helpful for you, or whether you found a better solution with this or another setup.
My new surveillance camera just arrived, so I'm trying to write an app to live stream the video from it.
Since it came with basically no documentation, I installed the 'onvifer' Android app, which allows you to browse the camera's capabilities. This app works fine - it gets the video and allows PTZ controls, etc. It reports the streaming URL as:
rtsp://192.1.0.193:554/mpeg4
I tested the stream in the VLC windows client, and it's able to stream video from that URL as well. This makes me comfortable that the network is working OK.
The camera states the feed will be 1920x1080; VLC confirms this.
The basic code in my activity:
VideoView videoView = (VideoView)this.findViewById(R.id.VideoView);
videoView.setVideoURI(Uri.parse("rtsp://192.1.0.193:554/mpeg4"));
videoView.requestFocus();
videoView.start();
I've also given the app INTERNET permissions in AndroidManifest.xml, disabled authentication on the camera, and am running on a real device (not the emulator).
When I run the app, LogCat shows this immediately:
setDataSource IOException happend :
java.io.FileNotFoundException: No content provider: rtsp://192.1.0.193:554/mpeg4
at android.content.ContentResolver.openTypedAssetFileDescriptor (ContentResolver.java).
About 15 seconds later, the app shows a "Can't play this video" modal dialog box and this is added to LogCat:
MediaPlayer error (100, 0)
AudioSystem AudioFlinger server died!
MediaPlayer error (100, 0)
VideoView Error: 100,0
I've googled everything I can think of, but haven't found anything useful.
Any thoughts?
Wild-ass guess based on your logcat and the RC=100... No SDP file, or no RTSP equivalent of the 'moov atom' block required to negotiate the details of the stream / container / codec / format... You can get the AOSP code for MediaPlayer/VideoView and grep for the RC value in the source.
RTSP is gnarly to debug (note the tools links) and is not assured to run inside a NAT'd network due to UDP issues. So, to get a better result, you may have to look into forcing your config to do the data channel over TCP and not UDP. Or it could be other issues, of which there are many.
If you really want to investigate, some possible tools below:
Use command line and CURL client to request your stream:
Android - Java RTSP Session Mgmt package on Git
Protocol dumps for CLI RTSP sessions to Youtube RTSP/SDP streams
To pursue the issue, you may need to get into the weeds with debug tools that track details of the protocol negotiation that precedes the MediaPlayer actually starting to play the stream. That would include learning the RFC and the protocol details.
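To at least see the error codes in your own code rather than only in LogCat, you could hang an error listener on the VideoView. Rough sketch (the log tag and the handling are just placeholders):

videoView.setOnErrorListener(new MediaPlayer.OnErrorListener() {
    @Override
    public boolean onError(MediaPlayer mp, int what, int extra) {
        // what == 100 corresponds to MediaPlayer.MEDIA_ERROR_SERVER_DIED
        Log.e("RtspTest", "MediaPlayer error: what=" + what + " extra=" + extra);
        return true; // returning true suppresses the "Can't play this video" dialog
    }
});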
videoView.setVideoURI(Uri.parse("rtsp://192.1.0.193:554/mpeg4"));
Try your app on another phone.
You may find the problem is with the mobile device itself.
Also try this test path:
"rtsp://218.204.223.237:554/mobile/1/4C024DFE77DC717D/onnuvesj43xj7t26.sdp"
and see whether something is wrong with your code.
I'm creating an Android application for live video streaming between two Android phones. I've already established a socket connection between the devices. I'm capturing video on one device and sending the stream to the other, but currently I just want to receive it on the receiver-side mobile device and save it. I'm recording with MediaRecorder on one device, so to stream to the receiver I'm using a ParcelFileDescriptor object as the output.
Client side code
mediaRecorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
mediaRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
mediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
mediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
mediaRecorder.setVideoEncoder(MediaRecorder.VideoEncoder.H263);
mediaRecorder.setOutputFile(pfd.getFileDescriptor());
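For context, pfd on the client side comes from the already-connected socket, roughly like this (hostname/port are whatever the existing connection uses):

// Client-side pfd setup
Socket socket = new Socket(InetAddress.getByName(hostname), port);
ParcelFileDescriptor pfd = ParcelFileDescriptor.fromSocket(socket);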
Receiver side code
pfd = ParcelFileDescriptor.fromSocket(s);
InputStream in = new FileInputStream(pfd.getFileDescriptor());
DataInputStream clientData = new DataInputStream(in);
OutputStream newDatabase = new FileOutputStream(file);
// Note: in.available() can return 0 before any data has arrived, which would
// give a zero-length buffer, so a fixed-size buffer is used instead
byte[] buffer = new byte[4096];
int length;
while ((length = in.read(buffer)) > 0) {
    newDatabase.write(buffer, 0, length);
}
newDatabase.close();
The video file is being created on the receiver-side mobile, but it's not able to receive any bytes. Do I have to decode the incoming stream on the receiver side, since the video stream sent is encoded while recording? If so, how can I decode the stream that is received? I found some solutions like MediaExtractor and MediaCodec... but will this work with live video capture? Moreover, I'm testing on Android version 2.3.6 (Gingerbread).
Is it possible to decode the video stream with MediaCodec on 2.3.6, or is some other method available?
The video file is being created on the receiver side mobile, but it's not able to receive any bytes.
If I understand you right, you are getting no data from the socket. That is a separate problem, which has nothing to do with the video format, decoding or encoding.
To debug your sockets, it may be helpful to use a separate application which just dumps the received data. Once the data looks fine, you can go to the next step - decoding the video.
Second part of the problem is the video format. You are using mp4, which is not usable for streaming. Here is more info about the format structure. You can use mp4 to record a video into a local file and then transfer the whole file over socket somewhere, but true realtime streaming cannot be done because of the non-seekable nature of the socket (as described in the linked article). There is a block of metadata at the beginning of the file, which acts as a "table of contents" and without it, the previous data are just junk. The problem is, you can assemble a "table of contents" only after you got all the contents. But at that moment, the data was already sent through the socket and you cannot insert anything at its beginning.
There are a few workarounds, but that's just for your future research; I haven't used them myself yet.
The most intuitive way would be to switch from mp4 to mpeg-ts, a container designed for streaming. Take a look at a hidden constant in MediaRecorder.OutputFormat with value 8.
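A speculative sketch of that idea (the literal 8 is taken from the hidden constant in the AOSP sources; whether it actually works on your API level is untested):

MediaRecorder recorder = new MediaRecorder();
// ... sources and encoders as in your client code ...
recorder.setOutputFormat(8); // hidden MPEG-2 TS output format value in MediaRecorder.OutputFormat
recorder.setOutputFile(pfd.getFileDescriptor());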
Another option is to pack the raw H.264 data into RTP/RTCP packets, which is again a protocol designed for streaming. Your application would then also be able to stream to any device that supports this protocol (for example a PC running VLC). For further research, take a look at the Spydroid IP camera, which does exactly this.
I want to create an Android application that is capable of receiving an audio stream. I thought of using the A2DP profile, but it seems as if Android doesn't support the A2DP sink role. It looks like there are a lot of people searching for a solution to this problem. But what about receiving an ordinary bit stream, and then converting the data into audio in the application? I was thinking of receiving a PCM or MP3 data stream via RFCOMM (the SPP Bluetooth profile), and then playing it using AudioTrack.
First, how do I receive a bit stream on my Android phone via RFCOMM? And is it possible to receive it as a PCM or MP3 stream?
Second, if it isn't possible to receive the bit stream via RFCOMM as a PCM or MP3 stream, how do I convert the received bit stream into audio?
Third, how do I convert the received data into audio AND play the audio simultaneously, in "real time"? Can I just use onDataReceived?
To be clear, I'm not interested in using the A2DP profile! I want to stream the data via RFCOMM (the SPP Bluetooth profile). The received data stream will be PCM or MP3. I thought of writing my own app, but if anyone knows of an app that solves this, I'd be glad to hear about it! I'm using Android 2.3 Gingerbread.
/Johnny
No. Trying to write an Android application that handles this will not be the solution. At least if you want to use A2DP Sink role.
The fact is that Android, as you mentioned it, does not implement the API calls to BlueZ (the bluetooth stack Android uses till Jelly Bean 4.1) regarding A2DP sink capabilities. You have to implement them yourself. I will try to guide you, as I was also interested in doing this my self in the near past.
Your Bluetooth-enabled Android device advertises itself as an A2DP source device by default. You have to change this first, so that nearby devices may recognize your device as a sink. To do this, you must modify the audio.conf file (usually located in /etc/bluetooth/) and make sure the Enable key exists with the value Source attached to it, so you end up with something like:
Enable=Source
Reboot, nearby devices should now recognize your device as an A2DP sink.
Now you will have to interact with BlueZ to react appropriately when an A2DP source device will start to stream audio to your phone.
Android and BlueZ are talking to each other via D-BUS. In fact, Android connects to the DBUS_SYSTEM channel and listens to every BlueZ advertisement, such as events, file descriptors ...
I remember having successfully bound myself to this D-Bus channel using a native application and getting access to the various events BlueZ was posting. This is relatively easy to achieve using the BlueZ API (available here) as a reference. If you go this way, you will have to build a native application (C/C++) and compile it for your platform. You should be able to do this using the Android NDK.
If you find it difficult to use D-BUS, you can try this Java library I just found that handles the communication to D-BUS for you : http://jbluez.sourceforge.net/. I have never used it but it is worth a try in my opinion.
What you really have to do is find out when an A2DP source device is paired to your phone and when it starts to stream music. You can retrieve these events through D-Bus. Once somebody tries to stream music, you need to tell BlueZ that your native application is going to handle it. There is a pretty good document that explains the flow of events that you should handle to do this. This document is accessible here. The part you're interested in starts on page 7. The sink application in the given example is PulseAudio, but it could be your application as well.
BlueZ will forward you a UNIX socket when you call the org.bluez.MediaTransport.Acquire method. Reading from this socket will give you the data that is currently being streamed by the remote device. But I remember having been told by a guy working on the BlueZ stack that the data read from this socket is not pure PCM audio, but encoded audio content instead. The data is generally encoded in a format called SBC (Low Complexity Subband Coding).
Decoding SBC is not very difficult, you can find a decoder right here.
The ultimate step would be to forward the PCM audio to your speakers.
To prevent you from getting stuck, and in order to test your application more easily, you can use the dbus-send binary that should be available on your Android system. It is located in /system/bin.
Quick tests you can make before doing anything of the above might be :
Get Devices list :
dbus-send --system --dest=org.bluez --print-reply / org.bluez.Manager.GetProperties
This returns an array of adapters with their paths. Once you have these path(s) you can retrieve the list of all the bluetooth devices paired with your adapter(s).
Get paired devices :
dbus-send --system --print-reply --dest=org.bluez /org/bluez/{pid}/hci0 org.bluez.Adapter.GetProperties
This gives you the list of paired devices within the Devices array field.
Once you have the list of devices paired to your Bluetooth adapter, you can check whether a device is connected to the AudioSource interface.
Get the devices connected to the AudioSource interface :
dbus-send --system --print-reply --dest=org.bluez /org/bluez/{pid}/hci0/dev_XX_XX_XX_XX_XX_XX org.bluez.AudioSource.GetProperties
Hope this helps.
Another workaround is using the Hands-Free Profile.
In Android, the BluetoothHeadset class handles this.
Wait until the status changes to BluetoothHeadset.STATE_AUDIO_CONNECTED (see the sketch after the code below);
then you can record audio from the Bluetooth headset.
mMediaRecorder = new MediaRecorder();
mMediaRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
mMediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
mMediaRecorder.setOutputFile(mFilename);
mMediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
try {
    mMediaRecorder.prepare();
} catch (IllegalStateException e) {
    // prepare() was called in an invalid state (setup not complete)
    e.printStackTrace();
} catch (IOException e) {
    // output file could not be opened
    e.printStackTrace();
}
mMediaRecorder.start();
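A rough sketch of the "wait until STATE_AUDIO_CONNECTED" part mentioned above (illustrative only; startRecording() is a hypothetical helper that runs the MediaRecorder code above, and unregistering the receiver is left out):

BroadcastReceiver headsetReceiver = new BroadcastReceiver() {
    @Override
    public void onReceive(Context context, Intent intent) {
        int state = intent.getIntExtra(BluetoothHeadset.EXTRA_STATE,
                BluetoothHeadset.STATE_AUDIO_DISCONNECTED);
        if (state == BluetoothHeadset.STATE_AUDIO_CONNECTED) {
            // headset audio channel is up: safe to start recording
            startRecording(); // hypothetical helper wrapping the MediaRecorder code above
        }
    }
};
registerReceiver(headsetReceiver, new IntentFilter(BluetoothHeadset.ACTION_AUDIO_STATE_CHANGED));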
[Irrelevant but works] This hack serves only MP3 streaming via a Wi-Fi hotspot (I use it in my car, which has only an AUX input):
Install the app AirSong,
Turn on wifi hotspot,
Connect the other device to that hotspot,
Access 192.168.43.1:8088 from the device's browser and you are on.
(Wondering why only "192.168.43.1"? Because that's the default gateway of any device connected to the Android hotspot.)
audio.conf seems to be missing in Android 4.2.2?
To receive a PCM audio stream via RFCOMM, you can use the code flow explained in "Reading Audio file in C and forwarding over bluetooth to play in Android Audio track" as a hint, with one change: change the frequency used when initializing from 44100 to 22050.
AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC,22050,AudioFormat.CHANNEL_OUT_MONO,AudioFormat.ENCODING_PCM_8BIT,10000, AudioTrack.MODE_STREAM);
Note: this streaming still contains some noise, but your
"receiving a PCM data stream via RFCOMM (SPP Bluetooth profile), and then playing it using AudioTrack"
will work.
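A minimal sketch of the receive-and-play loop with those AudioTrack parameters (the SPP/RFCOMM BluetoothSocket connection itself is assumed to already exist and is not shown):

// 'socket' is an already connected BluetoothSocket (SPP/RFCOMM)
InputStream in = socket.getInputStream();
AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, 22050,
        AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_8BIT,
        10000, AudioTrack.MODE_STREAM);
track.play();
byte[] buffer = new byte[1024];
int read;
while ((read = in.read(buffer)) > 0) {
    track.write(buffer, 0, read); // feed the received PCM bytes to the playback stream
}
track.stop();
track.release();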
To recognize speech via the Google server, I use the SpeechRecognizer class in combination with RecognitionListener, as suggested in Stephan's answer to this question. In addition, I try to capture the audio signal being recognized by using the onBufferReceived() callback from RecognitionListener, like:
byte[] sig = new byte[500000];
int sigPos = 0;
...
public void onBufferReceived(byte[] buffer) {
    System.arraycopy(buffer, 0, sig, sigPos, buffer.length);
    sigPos += buffer.length;
}
...
This seems to work fine, except when SpeechRecognizer fails to connect to the Google server: then a chunk of audio is not copied into the above-mentioned sig array, and an HTTP connection time-out exception is thrown. SpeechRecognizer eventually connects to the Google server, and the recognition results indicate that a complete audio signal was received; only the sig array is missing some audio chunk(s).
Does anybody experience the same problem? Any hints for a solution? Thank you!
I tend to say this might be an inconsistency in the behavior of the recognition service, maybe even a bug in the Android version you use. However, the documentation states that it is not guaranteed that this method is called, so it would still fit the specification. What I noticed so far is the following (on Android 2.3.4): I get the bytes while recording, but if there is, for example, a SocketTimeout, it tries to resend the data to the server after some time, but without calling onBufferReceived again for the same data. The code used to test this was the same as the one you linked in your post.
Why do you think some chunks are missing from the audio you received in the method? If only a few chunks were missing, it might even be the case that the recognition worked although those chunks were missing.
In modern versions, onBufferReceived does not work; you can check "record/save audio from voice recognition intent" instead.
The best way to achieve this is the other way round. Capture your audio data using AudioRecord (I'd recommend using VOICE_COMMUNICATION rather than MIC as the input source so you get really clean audio), then pass it through to the SpeechRecognizer. :)
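A minimal capture sketch along those lines (the sample rate and the single read are just an example; how you then hand the data over for recognition is up to you):

int sampleRate = 16000; // example rate
int bufferSize = AudioRecord.getMinBufferSize(sampleRate,
        AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
AudioRecord record = new AudioRecord(MediaRecorder.AudioSource.VOICE_COMMUNICATION,
        sampleRate, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, bufferSize);
record.startRecording();
byte[] audioBuffer = new byte[bufferSize];
int read = record.read(audioBuffer, 0, audioBuffer.length); // pull PCM data from the mic
// ... accumulate/process audioBuffer as needed ...
record.stop();
record.release();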