Play audio from array of bytes in Android?

I want to stream live audio from one device to many devices. I am recording my voice on Android, and while it is recording I send the bytes to a server; other devices then receive those bytes. What I end up with is many byte arrays arriving every second, and I now want to play them back as audio. MediaPlayer requires a file to play, but I can't save the stream to a file because data is still coming in, so I am not sure whether I am approaching this the wrong way. Essentially I want to build two apps: in one app we speak, and in the other app we can listen in real time to what is being said on the other side.

The AudioTrack class allows streaming of PCM audio buffers, via write(byte[] audioData, int offsetInBytes, int sizeInBytes) (among other methods).
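A minimal playback sketch, assuming the sender produces 16 kHz mono 16-bit PCM (the sample rate and format are assumptions here and must match the recording side):
int sampleRate = 16000; // assumed; must match the AudioRecord on the sender
int minBuf = AudioTrack.getMinBufferSize(sampleRate,
        AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT);
AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRate,
        AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT,
        minBuf, AudioTrack.MODE_STREAM);
track.play();
// For each byte[] chunk received from the network:
track.write(chunk, 0, chunk.length); // blocks until the data is queued
In MODE_STREAM the track plays while you keep feeding it, so no intermediate file is needed.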

Related

Distinguish source when capturing audio with AudioRecord

The AudioRecord class allows recording of phone calls with one of the following options as the recording source:
VOICE_UPLINK: The audio transmitted from your end to the other party. IOW, what you speak into the microphone.
VOICE_DOWNLINK: The audio transmitted from the other party to your end.
VOICE_CALL: VOICE_UPLINK + VOICE_DOWNLINK.
I'd like to build an App that records both VOICE_UPLINK & VOICE_DOWNLINK and identify the source of the voice.
When using VOICE_CALL as the AudioSource option, the UP/DOWN-LINK streams are bundled together into the received data buffer, which makes it hard to identify the source of the voice.
Using two AudioRecords with VOICE_UPLINK & VOICE_DOWNLINK does not work - the second AudioRecord fails to start because the first AudioRecord locks the recording stream.
Is there any creative way to bypass the locking problem presented at case (2), thus enable recording of the VOICE_UPLINK & VOICE_DOWNLINK streams simultaneously and easily identifying the source?
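For reference, a minimal capture sketch of the single-recorder case described above (the sample rate is an assumption):
int sampleRate = 8000; // assumed
int minBuf = AudioRecord.getMinBufferSize(sampleRate,
        AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
AudioRecord recorder = new AudioRecord(MediaRecorder.AudioSource.VOICE_CALL,
        sampleRate, AudioFormat.CHANNEL_IN_MONO,
        AudioFormat.ENCODING_PCM_16BIT, minBuf);
recorder.startRecording();
byte[] buf = new byte[minBuf];
int n = recorder.read(buf, 0, buf.length); // uplink and downlink arrive mixed here
A second AudioRecord created with VOICE_DOWNLINK while this one is active fails to start, which is the locking problem described in case (2).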

How to decode the H.264 video stream received from a ParcelFileDescriptor

I'm creating an Android application for live video streaming between two Android phones. I've already established a socket connection between the devices. I'm capturing video on one device and sending the stream to the other, but currently I just want to receive it on the second device and save it to a file. I'm recording with MediaRecorder on the first device, and to stream to the receiver I'm using a ParcelFileDescriptor object as the output:
Client side code
pfd = ParcelFileDescriptor.fromSocket(socket); // send encoder output straight to the socket
mediaRecorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
mediaRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
mediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
mediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
mediaRecorder.setVideoEncoder(MediaRecorder.VideoEncoder.H263);
mediaRecorder.setOutputFile(pfd.getFileDescriptor());
mediaRecorder.prepare();
mediaRecorder.start();
Receiver side code
pfd = ParcelFileDescriptor.fromSocket(s);
InputStream in = new FileInputStream(pfd.getFileDescriptor());
OutputStream out = new FileOutputStream(file);
// available() can return 0 on a socket that has no data yet, which would
// create a zero-length buffer; use a fixed-size buffer instead.
byte[] buffer = new byte[8192];
int length;
while ((length = in.read(buffer)) > 0)
{
    out.write(buffer, 0, length);
}
out.close();
in.close();
The video file is created on the receiver-side phone, but it never receives any bytes. Do I have to decode the incoming stream on the receiver side, since the stream was encoded while recording? If so, how can I decode it? I found solutions like MediaExtractor and MediaCodec, but will those work with live video capture? Moreover, I'm testing on Android 2.3.6 Gingerbread.
Is it possible to decode the video stream with MediaCodec on 2.3.6, or is some other method available?
The video file is created on the receiver-side phone, but it never receives any bytes.
If I understand you right, you are getting no data from the socket. That is a separate problem, which has nothing to do with the video format, decoding or encoding.
To debug your sockets, it may help to use a separate application that just dumps the received data. Once the data looks fine, you can move to the next step: decoding the video.
The second part of the problem is the video format. You are using MP4, which is not usable for streaming; here is more info about the format structure. You can use MP4 to record video into a local file and then transfer the whole file over a socket somewhere, but true real-time streaming cannot be done, because a socket is non-seekable (as described in the linked article). There is a block of metadata at the beginning of the file that acts as a "table of contents", and without it the preceding data is just junk. The problem is that you can only assemble the "table of contents" after you have all the contents, but by that point the data has already been sent through the socket and you cannot insert anything at its beginning.
There are a few workarounds, but that's just for your future research; I haven't used them myself.
The most intuitive way would be to switch from MP4 to MPEG-TS, a container designed for streaming. Take a look at the hidden constant in MediaRecorder.OutputFormat with the value 8, as sketched below.
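A sketch of that switch, using the recorder setup from the question; support for this format varies by device and OS version, so treat it as an experiment:
// 8 corresponds to MediaRecorder.OutputFormat.MPEG_2_TS, which only became a
// public constant in later SDK versions; older platforms hide it, so the raw
// value is passed directly.
mediaRecorder.setOutputFormat(8);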
Another option is to pack the raw H.264 data into RTP/RTCP packets, again a protocol designed for streaming. Your application would then also be able to stream to any device that supports this protocol (for example a PC running VLC). For further research, take a look at the Spydroid IP camera, which does exactly this.

Unexpected noise when using the AudioTrack + AudioRecord classes in Android

I am building an Android IP phone application on Android 2.1.
My application provides a simple IP phone function. My program consists of a listener thread which receives commands and polls the user to start the call.
The AudioTrack class receives audio data and plays it back,
while AudioRecord records audio data and streams it out to the other side.
When only one user is streaming to the other, the sound quality is good. However, when the receiver side also starts recording and streaming, both sides hear a weird sound and loud noise, though both sides can still hear what the other says.
Is it not suitable to use the AudioTrack and AudioRecord classes on the same side? I cannot figure out the problem. Can anyone suggest a solution?
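For context, a minimal sketch of the duplex setup being described, with assumed parameters (8 kHz mono 16-bit PCM); capture and playback would each run on their own thread:
// Capture side: read from the mic and send to the peer.
int rate = 8000; // assumed
int recBuf = AudioRecord.getMinBufferSize(rate,
        AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
AudioRecord rec = new AudioRecord(MediaRecorder.AudioSource.MIC, rate,
        AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, recBuf);
// Playback side: write packets received from the peer to the speaker.
int playBuf = AudioTrack.getMinBufferSize(rate,
        AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT);
AudioTrack play = new AudioTrack(AudioManager.STREAM_VOICE_CALL, rate,
        AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT,
        playBuf, AudioTrack.MODE_STREAM);
rec.startRecording();
play.play();
Running both classes on one device is supported; note that when the speaker output leaks back into the microphone, each side re-streams the other's audio, which is one common cause of feedback noise like that described above.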

Raw audio packets to WAV/GSM_MS compliant file on Android

I'm looking for logic or a code snippet that can convert my raw audio packets to a WAV/GSM_MS-compliant audio file. I'm able to capture data from the Android device mic and store it in a buffer or file.
Assuming your raw data is already interleaved PCM, all you need is to prepend a WAV header. The WAV header format is described here: https://ccrma.stanford.edu/courses/422/projects/WaveFormat/
When you create a new WAV file, always write the header first, with the data-length fields set to zero (since at the start of recording you don't yet know the total size of the data you will write). Then write your data immediately after the header, and once you are done, seek back to the beginning and update the length fields.
Here is code for the same: http://www.codeproject.com/Articles/129173/Writing-a-Proper-Wave-File
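A sketch of that write-then-patch approach for plain PCM (a GSM_MS file would need a different fmt chunk); the field layout follows the Stanford page linked above, and since RandomAccessFile writes big-endian, every multi-byte field is byte-swapped to WAV's little-endian order:
import java.io.IOException;
import java.io.RandomAccessFile;

public class WavWriter {
    // Write a 44-byte PCM WAV header with zeroed length fields.
    static void writeHeader(RandomAccessFile f, int sampleRate,
                            int channels, int bitsPerSample) throws IOException {
        int byteRate = sampleRate * channels * bitsPerSample / 8;
        f.writeBytes("RIFF");
        f.writeInt(Integer.reverseBytes(0));                // ChunkSize, patched later
        f.writeBytes("WAVE");
        f.writeBytes("fmt ");
        f.writeInt(Integer.reverseBytes(16));               // Subchunk1Size for PCM
        f.writeShort(Short.reverseBytes((short) 1));        // AudioFormat 1 = PCM
        f.writeShort(Short.reverseBytes((short) channels));
        f.writeInt(Integer.reverseBytes(sampleRate));
        f.writeInt(Integer.reverseBytes(byteRate));
        f.writeShort(Short.reverseBytes((short) (channels * bitsPerSample / 8))); // BlockAlign
        f.writeShort(Short.reverseBytes((short) bitsPerSample));
        f.writeBytes("data");
        f.writeInt(Integer.reverseBytes(0));                // Subchunk2Size, patched later
    }

    // Call after all PCM data has been appended to the file.
    static void patchSizes(RandomAccessFile f) throws IOException {
        int dataLen = (int) (f.length() - 44);
        f.seek(4);
        f.writeInt(Integer.reverseBytes(36 + dataLen));     // ChunkSize
        f.seek(40);
        f.writeInt(Integer.reverseBytes(dataLen));          // Subchunk2Size
    }
}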

CallBack for the recorded block in MediaRecorder

I am trying to record voice from the mic using the MediaRecorder class. That class only has a setOutputFile method to set the output file, but I need to get a buffer of the recorded voice as it is produced; I mean something like a callback that returns a block of recorded bytes, which I would then send to another device...
Actually I want to stream the recorded voice through a socket to another device as it is recorded, not save the recording and then read the file and send it, since that introduces an unexpected delay...
Alireza,
This can be done pretty easily. All you have to do is set up a socket, create a ParcelFileDescriptor from that socket, and then pass this file descriptor to setOutputFile. That sets up the streaming part, but you will have some formatting issues with the file afterwards. This is because MediaRecorder reserves the header space of the file but only writes the header after the stream has finished. In order to have a functional file on the server side, you will have to parse the header and write it to the beginning of the file (or buffer); a sketch of the sender side follows.
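A sketch of the sender side under that scheme (the peer address, port, and the 3GP/AMR format choice here are assumptions):
// Hypothetical peer address; the socket must be connected before recording starts.
Socket socket = new Socket("192.168.0.10", 8888);
ParcelFileDescriptor pfd = ParcelFileDescriptor.fromSocket(socket);

MediaRecorder recorder = new MediaRecorder();
recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
recorder.setOutputFile(pfd.getFileDescriptor()); // bytes leave over the socket as they are encoded
recorder.prepare();
recorder.start();
The receiver still has to deal with the deferred header described above before the captured bytes form a playable file.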
Good luck,
B-Rad
