I'm creating an Android application for live video streaming between two Android phones. I've already established a socket connection between the devices. I'm capturing video on one device and sending the stream to the other, but for now I just want to receive the stream on the receiving device and save it to a file. I'm recording with MediaRecorder on the sending device, so to stream to the receiver I'm using a ParcelFileDescriptor object as the output.
Client side code
mediaRecorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
mediaRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
mediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
mediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
mediaRecorder.setVideoEncoder(MediaRecorder.VideoEncoder.H263);
// pfd is a ParcelFileDescriptor created from the socket to the receiver
mediaRecorder.setOutputFile(pfd.getFileDescriptor());
Receiver side code
pfd= ParcelFileDescriptor.fromSocket(s);
InputStream in = new FileInputStream(pfd.getFileDescriptor());
DataInputStream clientData = new DataInputStream(in);
OutputStream newDatabase = new FileOutputStream(file);
int available=in.available();
byte[] buffer = new byte[available];
int length;
while((length = in.read(buffer)) > 0)
{
newDatabase.write(buffer, 0, length);
}
newDatabase.close();
The video file is created on the receiver-side phone, but it doesn't receive any bytes. Do I have to decode the incoming stream on the receiver side, since the video stream was encoded while recording? If so, how can I decode the received stream? I found solutions like MediaExtractor and MediaCodec, but will they work with live video capture? Moreover, I'm testing on Android 2.3.6 (Gingerbread).
Is it possible to decode the video stream with MediaCodec on 2.3.6, or is some other method available?
The video file is created on the receiver-side phone, but it doesn't receive any bytes.
If I understand you right, you are getting no data from the socket. That is a separate problem, which has nothing to do with the video format, decoding or encoding.
To debug your sockets, it may be helpful to use a separate application which just dumps the received data. Once the data looks fine, you can go to the next step - decoding the video.
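For reference, in the receiver code above the buffer is sized with in.available(), which can legitimately return 0 right after the connection is opened; a zero-length buffer makes read() return 0 and the loop exits without writing anything. A minimal dump loop with a fixed-size buffer (just a sketch, reusing the socket s and destination file from the question) could look like this:
ParcelFileDescriptor pfd = ParcelFileDescriptor.fromSocket(s);
InputStream in = new FileInputStream(pfd.getFileDescriptor());
OutputStream dump = new FileOutputStream(file);
byte[] buffer = new byte[4096];            // fixed-size buffer instead of in.available()
int length;
while ((length = in.read(buffer)) != -1) { // -1 signals end of stream
    dump.write(buffer, 0, length);
}
dump.flush();
dump.close();
in.close();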
Second part of the problem is the video format. You are using mp4, which is not usable for streaming. Here is more info about the format structure. You can use mp4 to record a video into a local file and then transfer the whole file over a socket somewhere, but true realtime streaming cannot be done because of the non-seekable nature of the socket (as described in the linked article). There is a block of metadata at the beginning of the file which acts as a "table of contents", and without it the preceding data is just junk. The problem is, you can assemble a "table of contents" only after you have all the contents, but at that point the data has already been sent through the socket and you cannot insert anything at its beginning.
There are a few workarounds, but those are just pointers for your future research; I haven't used them myself yet.
The most intuitive way would be to switch from mp4 to mpeg-ts, a container designed for streaming. Take a look at a hidden constant in MediaRecorder.OutputFormat with value 8.
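A sketch of what the sender setup could look like with that hidden value (untested; the raw 8 stands in for the hidden MPEG-TS constant, the AAC/H.264 encoder choices are assumptions, and support varies by device and Android version):
MediaRecorder recorder = new MediaRecorder();
recorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
recorder.setOutputFormat(8); // hidden MediaRecorder.OutputFormat value for MPEG-TS
recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
recorder.setOutputFile(pfd.getFileDescriptor()); // pfd from ParcelFileDescriptor.fromSocket(socket)
recorder.prepare();
recorder.start();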
Another option is to pack the raw H.264 data into RTP/RTCP packets, which is again a protocol designed for streaming. Your application would then also be able to stream to any device that supports this protocol (for example a PC running VLC). For further research, take a look at Spydroid IP camera, which does exactly this.
Related
I want to stream live audio from one device to many devices. I am recording my voice on Android and, while it is recording, I send the bytes to a server and receive those bytes on the other devices. What I get is an array of bytes, and I receive many such arrays every second. Now I want to play those bytes as audio. MediaPlayer requires a file to play, but I can't save the data to a file because it is still arriving, so I am very confused; maybe I am doing this the wrong way. Actually I want to make two apps: in one app we speak, and in the other app we can listen to what is being said on the other side in real time.
The AudioTrack class allows streaming of PCM audio buffers via write(byte[] audioData, int offsetInBytes, int sizeInBytes) (among other methods).
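As a rough illustration (a sketch only; the 16 kHz mono 16-bit format is an assumption that must match what the recording side actually sends, and nextChunkFromNetwork() is a hypothetical helper returning one received byte array):
int sampleRate = 16000; // must match the recorder's sample rate
int minBuf = AudioTrack.getMinBufferSize(sampleRate,
        AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT);
AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRate,
        AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT,
        minBuf, AudioTrack.MODE_STREAM);
track.play();
byte[] chunk;
while ((chunk = nextChunkFromNetwork()) != null) { // hypothetical: one byte[] per received packet
    track.write(chunk, 0, chunk.length);           // blocks until the PCM data is queued
}
track.stop();
track.release();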
I need to stream audio from external bluetooth device and video from camera to wowza server so that I can then access the live stream through a web app.
I've been able to successfully send other streams to Wowza using the GoCoder library, but as far as I can tell, this library only sends streams that come from the device's camera and mic.
Does anyone have a good suggestion for implementing this?
In the GoCoder Android SDK, the setAudioSource method of WZAudioSource allows you to specify an audio input source other than the default. Here's the relevant API doc for this method:
public void setAudioSource(int audioSource)
Sets the actively configured input device for capturing audio.
Parameters:
audioSource - An identifier for the active audio source. Possible values are those listed at MediaRecorder.AudioSource. The default value is MediaRecorder.AudioSource.CAMCORDER. Note that setting this while audio is actively being captured will have no effect until a new capture session is started. Setting this to an invalid value will cause an error to occur at session begin.
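So, assuming goCoderAudioSource is the WZAudioSource instance already attached to your broadcast configuration, a call along these lines (a sketch only, not verified against the SDK, and whether a given source actually picks up Bluetooth audio is device dependent) switches the capture source:
goCoderAudioSource.setAudioSource(MediaRecorder.AudioSource.MIC); // instead of the CAMCORDER default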
I'm creating an input stream to buffer and stream an MP3 from the cloud.
URL url = new URL("http://xxxx.yyy.com/Demo.mp3");
InputStream inputStream = url.openStream();
Now how do I play back the MP3 with MediaPlayer without using a temporary file to store it and read it back? I'm developing for Android Lollipop.
I'm pretty sure the MediaPlayer can handle remote URLs. Take a look at this example. Check the setDataSource method from the MediaPlayer class as well.
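A minimal sketch of that approach (exception handling omitted; the URL is the one from the question):
MediaPlayer player = new MediaPlayer();
player.setAudioStreamType(AudioManager.STREAM_MUSIC);
player.setDataSource("http://xxxx.yyy.com/Demo.mp3");
player.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
    @Override
    public void onPrepared(MediaPlayer mp) {
        mp.start();
    }
});
player.prepareAsync(); // buffers in the background, no temporary file involved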
EDIT: Since you really, really want to use an InputStream, I think you'll need to go low-level. Check the AudioTrack class. This SO answer might help. There are also a couple of issues here and here that might be relevant.
This problem persists even today. Check these links out: https://code.google.com/p/android/issues/detail?id=29870 and
http://www.piterwilson.com/blog/2014/03/11/android-mediaplayer-not-quite-there-yet/ .
There is absolutely no way either to get access to and control over the MediaPlayer buffer, or to feed MP3 content buffered in a byte array into MediaPlayer as an argument and have it play. So people either convert the MP3 buffer to PCM and play it with AudioTrack, or write the byte array from the input stream into a local socket and have MediaPlayer read it back through the socket file descriptor, as described in the following link: Audio stream buffering
The solution I'm using to feed binary data directly to MediaPlayer is to use ParcelFileDescriptor#createPipe() (API level 9) and MediaPlayer#setDataSource(java.io.FileDescriptor).
Here's sample code (untested):
ParcelFileDescriptor[] pipe = ParcelFileDescriptor.createPipe(); // pipe[0] = read end, pipe[1] = write end
FileDescriptor fd = pipe[0].getFileDescriptor();
mediaPlayer.setDataSource(fd); // MediaPlayer reads from the pipe
OutputStream out = new ParcelFileDescriptor.AutoCloseOutputStream(pipe[1]); // your code writes here
From this point on, whatever you write in the output stream will be received by the MediaPlayer. This is pretty fast since it uses a kernel FIFO to transfer data (no sockets, no TCP) and as far as I understand is fully in RAM (no actual files are used).
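As a usage sketch (assuming inputStream is the HTTP stream from the question and out is the pipe's write end created above; exception handling is reduced to a single catch block):
new Thread(new Runnable() {
    @Override
    public void run() {
        try {
            byte[] buffer = new byte[8192];
            int n;
            while ((n = inputStream.read(buffer)) != -1) {
                out.write(buffer, 0, n); // MediaPlayer receives this through the pipe
            }
            out.close();
        } catch (IOException e) {
            // log or handle the error
        }
    }
}).start();
mediaPlayer.prepareAsync(); // then call start() in onPrepared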
What I want is to broadcast Android camera video to remote locations, so that anyone can watch it on their mobile or on a website.
I've been able to unicast it to the VLC player on my PC.
I tried the Red5 server, Adobe Media Server and an ffmpeg server, but all in vain.
Each of them was only able to broadcast video from a prerecorded file, not from a live stream.
Can anyone suggest what I should do?
I read (I think it was even on stackoverflow) that you can provide the MediaRecorder with a FileHandle of a TCP-Connection. Then you can listen to that connection, read the data, packetize it and resend it as a RTSP/RTP-Stream.
If I happen to find the original post, I'll reference it here.
EDIT:
The original Post was: Streaming Video From Android
And the part about the Filedescriptor is from: http://www.mattakis.com/blog/kisg/20090708/broadcasting-video-with-android-without-writing-to-the-file-system
Just in case, I quote the corresponding example from the blog:
String hostname = "your.host.name";
int port = 1234;
Socket socket = new Socket(InetAddress.getByName(hostname), port);
ParcelFileDescriptor pfd = ParcelFileDescriptor.fromSocket(socket);
MediaRecorder recorder = new MediaRecorder(); // Additional MediaRecorder setup (output format ... etc.) omitted
recorder.setOutputFile(pfd.getFileDescriptor());
recorder.prepare();
recorder.start();
However, this only sends the video file data over the wire. You can save it and then play it back. But as mentioned, it is not a stream yet.
UPDATE:
You do not even have to use a TCP socket for the first step. I just stumbled over "LocalSocket"(1), which also gets you a FileDescriptor to feed the MediaRecorder. Those local sockets are "AF_LOCAL/UNIX domain stream sockets". See http://developer.android.com/reference/android/net/LocalSocket.html
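For illustration, the socket-pair setup could look roughly like this (a sketch only; the name "camera_stream" is arbitrary, the usual MediaRecorder source/format/encoder calls are omitted, and as noted below this path is blocked on newer Android versions):
LocalServerSocket server = new LocalServerSocket("camera_stream");
LocalSocket receiver = new LocalSocket();
receiver.connect(new LocalSocketAddress("camera_stream"));
LocalSocket sender = server.accept();

MediaRecorder recorder = new MediaRecorder();
// audio/video source, output format and encoder setup omitted
recorder.setOutputFile(sender.getFileDescriptor());
recorder.prepare();
recorder.start();

// On a worker thread: read the encoded bytes from the other end of the
// socket pair, packetize them (e.g. into RTP) and send them onward.
InputStream encoded = receiver.getInputStream();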
I have not tried all the above myself as of today, but will pretty soon. So maybe I can be of more help in the near future :)
(1) LocalSocket is not usable on newer Android versions for security reasons! See Update from 2015-11-25.
UPDATE 2:
Just saw in the Android Sources the "OUTPUT_FORMAT_RTP_AVP". But it is hidden :( So I guess it will be available in future API versions of Android.
https://github.com/android/platform_frameworks_base/blob/master/media/java/android/media/MediaRecorder.java Line 219:
public static final int OUTPUT_FORMAT_RTP_AVP = 7;
I have not tried simply bypassing the @hide by passing a hardcoded 7... If anybody does, please leave a comment here!
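If someone wants to experiment, the call would simply be (untested, and the behaviour of the hidden constant may differ between releases):
recorder.setOutputFormat(7); // hidden OUTPUT_FORMAT_RTP_AVP, @hide in the SDK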
UPDATE 2015-11-25
I just ran into libstreaming: https://github.com/fyhertz/libstreaming
I did not look into it too deeply, but it seems there is a lot to be learned about streaming from Android from this project (if not only using it). I read there that the LocalSocket solution is invalid for newer Android versions :( But they present an alternative: ParcelFileDescriptor.
I am trying to record voice from the mic using the MediaRecorder class. That class only offers a setOutputFile method to set the output file, but I need to get a buffer of the recorded voice; I mean I need something like a callback method that returns a block of recorded bytes as they are produced, so that I can send those bytes to another device...
Actually I want to stream the recorded voice through a socket to another device while it is being recorded, not save the recording and then read the file and send it, because that results in an unexpected delay...
Alireza,
This can be done pretty easily. All you have to do is set up a socket, create a ParcelFileDescriptor from that socket, and then pass its file descriptor to setOutputFile. That takes care of the streaming part, but you will then have some formatting issues with the file on the other end. This is because MediaRecorder reserves the header space of the file but only writes the header after the stream has finished. In order to have a functional file on the server side, you will have to parse the header and write it to the beginning of the file (or buffer). A sketch of the streaming part follows below.
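(Sketch only: host and port are placeholders, the audio-only 3GP/AMR settings are assumptions, and the server-side header fix-up is not shown.)
Socket socket = new Socket(InetAddress.getByName("server.example.com"), 5000); // placeholder host/port
ParcelFileDescriptor pfd = ParcelFileDescriptor.fromSocket(socket);

MediaRecorder recorder = new MediaRecorder();
recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
recorder.setOutputFile(pfd.getFileDescriptor()); // encoded bytes go straight to the socket
recorder.prepare();
recorder.start();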
Good luck,
B-Rad