How to play video streamed from an Android device to a server?

I have an application that streams video/audio from an Android device to a server.
Streaming works fine, but when I save the data streamed from MediaRecorder, I can't play the resulting file.
Android code:

String hostname = "000.000.000.000";
int port = 0000;

Socket socket = null;
try {
    socket = new Socket(InetAddress.getByName(hostname), port);
} catch (UnknownHostException e1) {
    e1.printStackTrace();
} catch (IOException e1) {
    e1.printStackTrace();
}

ParcelFileDescriptor pfd = ParcelFileDescriptor.fromSocket(socket);

MediaRecorder recorder = new MediaRecorder();
recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
recorder.setOutputFile(pfd.getFileDescriptor());
try {
    recorder.prepare(); // prepare() is required before start()
} catch (IOException e) {
    e.printStackTrace();
}
recorder.start();
Server side:

Socket userSocket = socket.accept();
DataInputStream dis = new DataInputStream(userSocket.getInputStream());
while (true) {
    dis.read(buf, 0, buf.length);
    saveBufferToFile(buf);
}
Right now I save the buffer using the FileOutputStream.write() method, but the output file can't be played at all.
After some research I understand that I need to add the MP4 headers to the file before I write the data to it, but I don't know how to do this.
Regards,

Firstly, the MPEG-4 container is more than just "headers" prepended to video data.
Secondly, the server-side code does not look correct to me. I assume it is Java. Specifically:
DataInputStream dis = new DataInputStream(userSocket.getInputStream());
while (true) {
    dis.read(buf, 0, buf.length);
    saveBufferToFile(buf);
}
The problem is that DataInputStream.read() does not necessarily fill the entire buffer. The JavaDocs say:
An attempt is made to read as many as len bytes, but a smaller number
may be read, possibly zero.
I suspect the file you are writing to disk is corrupted. You can confirm this by comparing the number of bytes sent by the client with the size of the file on disk; my gut says the file will be significantly larger.
To fix it, you'll need to take note of the number of bytes returned by read(), and write only that many bytes to disk. The rest of the buffer is effectively garbage; ignore it.
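For example, a minimal corrected loop might look like this (assuming fos is the FileOutputStream that saveBufferToFile writes to):

byte[] buf = new byte[4096];
int bytesRead;
// read() returns the number of bytes actually read, or -1 at end of stream
while ((bytesRead = dis.read(buf, 0, buf.length)) != -1) {
    fos.write(buf, 0, bytesRead); // write only the bytes that were read
}
fos.flush();
fos.close();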

Related

Stream video frame Android-to-android

I'm currently working on an app where I use the phone camera and OpenCV to process the frames. Now I thought it would be cool to be able to send the frames to another Android client. I thought sending them frame by frame over a stream could work, but I don't know how to set up the host, or whether that approach is efficient. Any suggestions?
If you just want to send each frame as a raw set of data you can use sockets.
The code below is old now but it worked fine when last tested - it sends an entire video, but you can use the same approach to send whatever file you want:
// Send the video file to the helper over a Socket connection so the helper
// can compress the video file
Socket helperSocket = null;
try {
    Log.d("VideoChunkDistributeTask doInBackground", "connecting to: " + helperIPAddress + ":" + helperPort);
    helperSocket = new Socket(helperIPAddress, helperPort);
    byte[] buffer = new byte[4096];

    // Open the video chunk file
    File videoChunkFile = new File(videoChunkFileName);
    BufferedInputStream chunkFileIS = new BufferedInputStream(new FileInputStream(videoChunkFile));

    // First send a long with the file length - wrap the socket's output stream
    // in a DataOutputStream to allow us to send a long directly
    DataOutputStream helperSocketDOS = new DataOutputStream(
            new BufferedOutputStream(helperSocket.getOutputStream()));
    long chunkLength = videoChunkFile.length();
    helperSocketDOS.writeLong(chunkLength);
    Log.d("VideoChunkDistributeTask doInBackground", "chunkLength: " + chunkLength);

    // Now loop through the video chunk file, sending it to the helper via the
    // socket - note this simply does nothing if the file is empty
    int readCount = 0;
    int totalReadCount = 0;
    while (totalReadCount < chunkLength) {
        // write only the bytes actually read to the socket's output stream
        readCount = chunkFileIS.read(buffer);
        helperSocketDOS.write(buffer, 0, readCount);
        totalReadCount += readCount;
    }
    Log.d("VideoChunkDistributeTask doInBackground", "file sent");

    chunkFileIS.close();
    helperSocketDOS.flush();
} catch (UnknownHostException e) {
    Log.d("VideoChunkDistributeTask doInBackground", "unknown host");
    e.printStackTrace();
    return null;
} catch (IOException e) {
    Log.d("VideoChunkDistributeTask doInBackground", "IO exception");
    e.printStackTrace();
    return null;
}
The full source code is at: https://github.com/mickod/ColabAndroid/tree/master/src/com/amodtech/colabandroid
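For reference, the receiving side of this length-prefixed protocol might look something like the sketch below (clientSocket and outFile are assumed names, not part of the original project):

DataInputStream dis = new DataInputStream(
        new BufferedInputStream(clientSocket.getInputStream()));
long length = dis.readLong(); // the long the sender writes first
byte[] buffer = new byte[4096];
FileOutputStream fos = new FileOutputStream(outFile);
long received = 0;
while (received < length) {
    int n = dis.read(buffer, 0, (int) Math.min(buffer.length, length - received));
    if (n == -1) break; // peer closed the connection early
    fos.write(buffer, 0, n);
    received += n;
}
fos.close();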
You may also find that there are more up-to-date socket libraries available which might be better for you to use, but the general principles should be similar.
If you want to stream your video so that the other app can play it like a regular video streamed from the web, then you would want to set up a web server on the 'sending' device. At that point it might be easier to send it to a server and stream from there instead.

Android MediaRecorder and FileDescriptor

I am developing an app that will allow users to cast the screen of an Android phone directly to another phone. It is based on the libstreaming library with a few modifications.
I have set up MediaRecorder, but the output is not sent to a file but to a file descriptor. I have two file descriptors created with ParcelFileDescriptor.createPipe():
mMediaRecorder = new MediaRecorder();
...
FileDescriptor fd = mParcelWrite.getFileDescriptor();
mMediaRecorder.setOutputFile(fd);
... // here I continue with prepare, start, etc.
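(For reference, the pipe itself is created roughly like this; a minimal sketch of the createPipe() call mentioned above - the first descriptor in the returned array is the read side, the second the write side:)

ParcelFileDescriptor[] pipe = ParcelFileDescriptor.createPipe();
ParcelFileDescriptor mParcelRead = pipe[0];  // read end
ParcelFileDescriptor mParcelWrite = pipe[1]; // write end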
Then the output is read from the second file descriptor. The stream is processed so that the headers, which don't exist yet in the MP4 file, can be skipped and the video can still be streamed:
InputStream is = new ParcelFileDescriptor.AutoCloseInputStream(mParcelRead);
try {
    byte[] buffer = new byte[4];
    // Skip all atoms preceding the mdat atom
    while (!Thread.interrupted()) {
        while (is.read() != 'm');
        is.read(buffer, 0, 3);
        if (buffer[0] == 'd' && buffer[1] == 'a' && buffer[2] == 't') break;
    }
} catch (IOException e) {
    Log.e(TAG, "Couldn't skip mp4 header :/");
    stop();
    throw e;
}
(After this point the stream is sent over the network)
The problem is that, apparently, since API 23 Android doesn't allow non-seekable file descriptors here.
Any idea how I can overcome this problem?
Thanks.

Transferring large amounts of data over bluetooth on Android Gingerbread

I'm trying to transfer about a megabyte of arbitrary data at a time from one Android phone to another. Currently, I write the size, a command code, and the data to a DataOutputStream wrapped around a BufferedOutputStream, which wraps the OutputStream returned from bluetoothSocketInstance.getOutputStream().
The receiving phone reads the size and command code and then reads from the input stream until it has gotten all the data it is expecting. This works for short strings, but for larger files not all the data is transferred. Running the app in the debugger shows that the write returns without any exceptions and the read reads a fraction of the bytes expected and then blocks indefinitely. It also does not throw any exceptions.
Is there a buffer somewhere that is filling up? Is there something else I need to do to ensure that all the data gets transferred?
My code for the sender and receiver are below:
Sender:
try {
    DataOutputStream d = new DataOutputStream(
            new BufferedOutputStream(mmOutStream, buffer.length + 8));
    d.writeInt(buffer.length);
    d.writeInt(command);
    d.write(buffer);
    d.flush();
} catch (IOException e) {
    Log.e(TAG, "Exception during write", e);
}
Receiver:
try {
    // Read from the InputStream
    int messageSize = inStream.readInt();
    int messageCode = inStream.readInt();
    bytes = 0;
    buffer = new byte[messageSize];
    while (bytes < messageSize) {
        // note: read() returns -1 at end of stream; production code should check for it
        bytes += inStream.read(buffer, bytes, messageSize - bytes);
    }
    message = bytes;
} catch (IOException e) {
    Log.e(TAG, "disconnected", e);
    connectionLost();
    break;
}
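(As an aside, DataInputStream.readFully() does this accumulation loop for you; a sketch, assuming inStream is the DataInputStream above:)

int messageSize = inStream.readInt();
int messageCode = inStream.readInt();
byte[] buffer = new byte[messageSize];
inStream.readFully(buffer, 0, messageSize); // blocks until all bytes arrive; throws EOFException if the stream ends early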
After some more testing on my end, I changed my sending code to look like this:
for (int i = 0; i < buffer.length; i += BIG_NUM) {
    int b = ((i + BIG_NUM) < buffer.length) ? BIG_NUM : buffer.length - i;
    d.write(buffer, i, b);
    d.flush();
}
The files now get sent. Does anyone have an idea why? Does the call to flush() block until the data has actually been transferred? Is there any documentation about the size of the send and receive buffers that would help me to decide how large I can safely make BIG_NUM?
I had a similar problem: when sending a file, some parts were missing. I tried BufferedOutputStream but the problem still existed.
Finally I found a simple solution:
You don't need to send the buffer length. Just split the data you send into chunks (for example, byte arrays of 8192), and on the receiving side make sure the receive buffer is larger, by about 4 or 8 times, than the sending buffer. This worked for me and the file arrived complete.
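A minimal sketch of that chunked send, where fileIn and btOut are assumed names for the file's input stream and the Bluetooth socket's output stream:

byte[] chunk = new byte[8192];
int n;
while ((n = fileIn.read(chunk)) != -1) {
    btOut.write(chunk, 0, n); // send only the bytes actually read
    btOut.flush();            // flush each chunk rather than one huge write
}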

Android AudioRecord send over RTP

Background
I am creating a VoIP app. I know that there are plenty of ones out already, but I have my reasons. Due to commercial implications I cannot just fork SipDroid, although it is a quality app. This app is aimed at API Level 10, Gingerbread 2.3.3.
Problem
I have created a simple Activity which creates an AudioRecord instance, and then begins a loop:
int timestamp = 0;
int seqNr = 12;
while (true) {
    byte[] buffer = new byte[bufferSize];
    int num = recorder.read(buffer, 0, bufferSize);
    try {
        byte[] pcm = new byte[bufferSize];
        //
        // presumably here I convert the byte[] from PCM into G711??
        //
        RTPStream.Write(pcm, seqNr, timestamp);
        timestamp += num;
        seqNr++;
    } catch (IOException e) {
        e.printStackTrace();
    }
}
Question
How do I turn the PCM 44kHz 16-bit mono byte[]s into G.711 u-law/A-law byte[]s?
AudioGroup is available internally; it is what the native SipAudioCall uses. There is a way to use the internal API, but the class became public in API 12, so you should use it.
Try using AudioStream instead: set the codec via setCodec(AudioCodec) and acquire the audio via an AudioGroup.
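A rough sketch of that android.net.rtp approach (API 12+); the addresses and port below are placeholders, and the AudioGroup handles the microphone capture and G.711 encoding internally:

AudioStream stream = new AudioStream(InetAddress.getByName("192.168.0.2")); // local interface to bind
stream.setCodec(AudioCodec.PCMU);         // G.711 u-law
stream.setMode(RtpStream.MODE_SEND_ONLY);
stream.associate(InetAddress.getByName("192.168.0.10"), 5004); // remote peer
AudioGroup group = new AudioGroup();
group.setMode(AudioGroup.MODE_NORMAL);
stream.join(group);                       // capture and encoding happen inside the AudioGroup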

Using AudioTrack in Android to play a WAV file

I'm working with Android, trying to make my AudioTrack application play a Windows .wav file (Tada.wav). Frankly, it shouldn't be this hard, but I'm hearing a lot of strange stuff. The file is saved on my phone's mini SD card and reading the contents doesn't seem to be a problem, but when I play the file (with parameters I'm only PRETTY SURE are right), I get a few seconds of white noise before the sound seems to resolve itself into something that just may be right.
I have successfully recorded and played my own voice back on the phone -- I created a .pcm file according to the directions in this example:
http://emeadev.blogspot.com/2009/09/raw-audio-manipulation-in-android.html
(without the backwards masking)...
Anybody got some suggestions, or know of an example on the web, for playing a .wav file on Android?
Thanks,
R.
I stumbled on the answer (frankly, by trying &^#! I didn't think would work), in case anybody's interested... In my original code (which is derived from the example in the link in the original post), the data is read from the file like so:
InputStream is = new FileInputStream(file);
BufferedInputStream bis = new BufferedInputStream(is, 8000);
DataInputStream dis = new DataInputStream(bis); // Create a DataInputStream to read the audio data from the saved file

int i = 0; // Read the file into the "music" array
while (dis.available() > 0) {
    music[i] = dis.readShort(); // This assignment does not reverse the order
    i++;
}
dis.close(); // Close the input stream
In this version, music[] is an array of SHORTs. So the readShort() method would seem to make sense here, since the data is 16-bit PCM. However, on Android that seems to be the problem. I changed that code to the following:
music = new byte[(int) file.length()]; // size & length of the file
InputStream is = new FileInputStream(file);
BufferedInputStream bis = new BufferedInputStream(is, 8000);
DataInputStream dis = new DataInputStream(bis); // Create a DataInputStream to read the audio data from the saved file

int i = 0; // Read the file into the "music" array
while (dis.available() > 0) {
    music[i] = dis.readByte(); // This assignment does not reverse the order
    i++;
}
dis.close(); // Close the input stream
In this version, music[] is an array of BYTES. I'm still telling the AudioTrack that it's 16-bit PCM data, and my Android doesn't seem to have a problem with writing an array of bytes into an AudioTrack thus configured... Anyway, it finally sounds right, so if anyone else wants to play Windows sounds on their Android, for some reason, that's the solution. Ah, endianness...
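(The underlying issue is indeed byte order: DataInputStream.readShort() reads big-endian, while WAV sample data is little-endian, so every 16-bit sample came back byte-swapped. If you do want shorts, a sketch of reading them in the right order, with bytes being the raw file contents:)

ByteBuffer bb = ByteBuffer.wrap(bytes).order(ByteOrder.LITTLE_ENDIAN);
short sample = bb.getShort(); // two bytes interpreted as one little-endian 16-bit sample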
R.
I found a lot of long answers to this question. My final solution, which given all the cutting and pasting is hardly mine, comes down to:
public boolean play() {
    int i = 0;
    byte[] music = null;
    InputStream is = mContext.getResources().openRawResource(R.raw.noise);
    at = new AudioTrack(AudioManager.STREAM_MUSIC, 44100,
            AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_16BIT,
            minBufferSize, AudioTrack.MODE_STREAM);
    try {
        music = new byte[512];
        at.play();
        while ((i = is.read(music)) != -1)
            at.write(music, 0, i);
    } catch (IOException e) {
        e.printStackTrace();
    }
    at.stop();
    at.release();
    return STOPPED;
}
STOPPED is just a "true" sent back as a signal to reset the pause/play button.
And in the class initializer:
public Mp3Track(Context context) {
    mContext = context;
    minBufferSize = AudioTrack.getMinBufferSize(44100,
            AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_16BIT);
}
Context is just "this" from the calling activity.
You can use a FileInputStream for files on the SD card, etc.; my files are in res/raw.
Are you skipping the first 44 bytes of the file before you dump the rest of the file's data into the buffer? The first 44 bytes are the WAVE header and they would sound like random noise if you tried to play them.
Also, are you sure you are creating the AudioTrack with the same properties as the WAVE you are trying to play (sample rate, bit rate, number of channels, etc.)? Windows actually does a good job of giving this information to you in the file's Properties page.
As Aaron C said, you have to skip the initial 44 bytes, or (as I prefer) read the first 44 bytes, which are the WAVE header. That way you know how many channels, bits per sample, length, etc. the WAVE contains.
Here you can find a good implementation of a WAVE header parser/writer.
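For a canonical 44-byte PCM header with no extra chunks, the interesting fields can be pulled out in a few lines; a sketch using java.nio, with offsets per the canonical WAVE layout:

byte[] header = new byte[44];
new DataInputStream(new FileInputStream(file)).readFully(header);
ByteBuffer bb = ByteBuffer.wrap(header).order(ByteOrder.LITTLE_ENDIAN);
short channels      = bb.getShort(22); // 1 = mono, 2 = stereo
int   sampleRate    = bb.getInt(24);   // e.g. 44100
short bitsPerSample = bb.getShort(34); // e.g. 16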
Please don't perpetuate terrible parsing code. WAV parsing is trivial to implement:
http://soundfile.sapp.org/doc/WaveFormat/
and you will thank yourself by being able to parse things such as the sampling rate, bit depth, and number of channels.
Also, x86 and ARM (at least by default) are both little-endian, so native-endian WAV files should be fine without any shuffling.
Just confirm that you have AudioTrack.MODE_STREAM and not AudioTrack.MODE_STATIC in the AudioTrack constructor:

AudioTrack at = new AudioTrack(
    AudioManager.STREAM_MUSIC,
    sampleRate,
    AudioFormat.CHANNEL_OUT_STEREO, // CHANNEL_IN_* constants are for AudioRecord, not AudioTrack
    AudioFormat.ENCODING_PCM_16BIT,
    outputBufferSize, // buffer length in bytes
    AudioTrack.MODE_STREAM
);
Sample wav file:
http://www.mauvecloud.net/sounds/pcm1644m.wav
Sample Code:
public class AudioTrackPlayer {
    Context mContext;
    int minBufferSize;
    AudioTrack at;
    boolean STOPPED;

    public AudioTrackPlayer(Context context) {
        Log.d("------", "init");
        mContext = context;
        minBufferSize = AudioTrack.getMinBufferSize(44100,
                AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_16BIT);
    }

    public boolean play() {
        Log.d("------", "play");
        int i = 0;
        byte[] music = null;
        InputStream is = mContext.getResources().openRawResource(R.raw.pcm1644m);
        at = new AudioTrack(AudioManager.STREAM_MUSIC, 44100,
                AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_16BIT,
                minBufferSize, AudioTrack.MODE_STREAM);
        try {
            music = new byte[512];
            at.play();
            while ((i = is.read(music)) != -1)
                at.write(music, 0, i);
        } catch (IOException e) {
            e.printStackTrace();
        }
        at.stop();
        at.release();
        return STOPPED;
    }
}
