I searched and found code for voice streaming / audio calling between two Android phones over plain sockets. I have implemented it, but it is not working: I am unable to hear any voice.
Receiver code:
private int sampleRate = 44100;
private int channelConfig = AudioFormat.CHANNEL_CONFIGURATION_MONO;
private int audioFormat = AudioFormat.ENCODING_PCM_16BIT;
int minBufSize = AudioRecord.getMinBufferSize(sampleRate, channelConfig, audioFormat);

public void startReceiving() {
    Thread receiveThread = new Thread(new Runnable() {
        @Override
        public void run() {
            try {
                DatagramSocket socket = new DatagramSocket(50005);
                Log.d("VR", "Socket Created");
                byte[] buffer = new byte[256];
                speaker = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRate, channelConfig, audioFormat, minBufSize, AudioTrack.MODE_STREAM);
                while (status == true) {
                    try {
                        DatagramPacket packet = new DatagramPacket(buffer, buffer.length);
                        socket.receive(packet);
                        Log.d("VR", "Packet Received");
                        // reading content from packet
                        buffer = packet.getData();
                        Log.d("VR", "Packet data read into buffer");
                        // sending data to the AudioTrack obj i.e. speaker
                        speaker.write(buffer, 0, minBufSize);
                        Log.d("VR", String.valueOf(buffer));
                        speaker.play();
                    } catch (IOException e) {
                        Log.e("VR", "IOException");
                    }
                }
            } catch (SocketException e) {
                Log.e("VR", "SocketException");
            }
        }
    });
    receiveThread.start();
}
Sender code:
public void startStreaming() {
    Thread streamThread = new Thread(new Runnable() {
        @Override
        public void run() {
            try {
                DatagramSocket socket = new DatagramSocket();
                Log.d("VS", "Socket Created");
                byte[] buffer = new byte[minBufSize];
                Log.d("VS", "Buffer created of size " + minBufSize);
                DatagramPacket packet;
                final InetAddress destination = InetAddress.getByName("192.168.0.216");
                Log.d("VS", "Address retrieved");
                recorder = new AudioRecord(MediaRecorder.AudioSource.MIC, sampleRate, channelConfig, audioFormat, minBufSize * 10);
                Log.d("VS", "Recorder initialized");
                recorder.startRecording();
                while (status == true) {
                    // reading data from MIC into buffer
                    minBufSize = recorder.read(buffer, 0, buffer.length);
                    // putting buffer in the packet
                    packet = new DatagramPacket(buffer, buffer.length, destination, port);
                    socket.send(packet);
                }
            } catch (UnknownHostException e) {
                Log.e("VS", "UnknownHostException");
            } catch (IOException e) {
                e.printStackTrace();
                Log.e("VS", "IOException");
            }
        }
    });
    streamThread.start();
}
I debugged the code: packets are transmitted successfully and speaker.play() is also called, but there is no voice.
I have implemented this code in a single application and activity, with two buttons: Start Listening and Start Streaming.
getMinBufferSize() may not return the same value for AudioRecord and AudioTrack (learned that the hard way). Make sure you are using the larger of the two.
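A defensive sketch of that advice. chooseBufferSize() is a hypothetical helper; on Android the two inputs would come from AudioRecord.getMinBufferSize() and AudioTrack.getMinBufferSize(), which return negative error codes (ERROR, ERROR_BAD_VALUE) when the parameters are unsupported:

```java
// Sketch: pick one buffer size that satisfies both AudioRecord and AudioTrack.
// chooseBufferSize() is a hypothetical helper; on Android, negative inputs
// correspond to the AudioRecord/AudioTrack error codes ERROR (-1) and
// ERROR_BAD_VALUE (-2).
public class AudioBufSize {
    public static int chooseBufferSize(int recordMin, int trackMin) {
        if (recordMin <= 0 || trackMin <= 0) {
            // one of the getMinBufferSize() calls rejected the format
            throw new IllegalArgumentException(
                    "getMinBufferSize failed: " + recordMin + ", " + trackMin);
        }
        return Math.max(recordMin, trackMin); // larger of the two works for both sides
    }

    public static void main(String[] args) {
        // e.g. values returned for 44100 Hz mono PCM-16 on some device
        System.out.println(chooseBufferSize(3584, 3528));
    }
}
```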
Everything is working fine for me. I just want to ask how to send packets to a public IP like 119.43.214.5. I have made two apps, and they can send packets on localhost; the client has the IP address of the server app. The problem is that the IP is a public IP, and the client is not sending data to that app.
I want to send live camera video from phone 1 to phone 2 via Wi-Fi. What is the easiest way to do this?
I can send my voice from phone 1 to phone 2 successfully, but I cannot send live camera video to phone 2.
I found libstreaming and Sipdroid, which are used for streaming communication (audio or live camera video) between two devices, but I do not understand them and they are very difficult.
Is there an easier way to do it?
The voice receiving code:
Thread receiveThread = new Thread(new Runnable() {
    @Override
    public void run() {
        // minimum buffer size. need to be careful. might cause problems. try setting manually if any problems faced
        int minBufSize = AudioRecord.getMinBufferSize(sampleRate, channelConfig, audioFormat) * 2;
        speaker = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRate, channelConfig, audioFormat, 1600, AudioTrack.MODE_STREAM);
        speaker.play();
        try {
            DatagramSocket socket1 = new DatagramSocket(9000);
            log2("Socket Created");
            byte[] buffer = new byte[1600];
            while (true) {
                DatagramPacket packet = new DatagramPacket(buffer, 1600);
                socket1.receive(packet);
                log2("Packet data read into buffer");
                speaker.write(packet.getData(), 0, packet.getLength());
                log2("Writing buffer content to speaker");
            }
        } catch (SocketException e) {
            log2("SocketException");
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
});
receiveThread.start();
The voice sending code:
Thread streamThread = new Thread(new Runnable() {
    @Override
    public void run() {
        try {
            int minBufSize = AudioRecord.getMinBufferSize(sampleRate, channelConfig, audioFormat) * 10;
            log2("Socket Created");
            DatagramPacket packet;
            final InetAddress destination = InetAddress.getByName("192.168.49.1");
            recorder = new AudioRecord(MediaRecorder.AudioSource.MIC, sampleRate, AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_16BIT, minBufSize);
            log2("Recorder initialized");
            byte[] buffer = new byte[1600];
            recorder.startRecording();
            int bytes_read = 0;
            DatagramSocket socket1 = new DatagramSocket();
            while (true) {
                // reading data from MIC into buffer
                bytes_read = recorder.read(buffer, 0, 1600);
                // putting buffer in the packet
                packet = new DatagramPacket(buffer, bytes_read, destination, 9000);
                socket1.send(packet);
            }
        } catch (IOException e) {
            // Log.e("VS", "IOException");
        }
    }
});
streamThread.start();
The above code is for sending and receiving voice, but I want a simple way to send live camera video from one phone to another. Please help me.
I'm trying to build an audio recorder app for Android Wear. Right now, I'm able to capture audio on the watch, stream it to the phone, and save it to a file. However, the audio file has gaps or cropped parts.
I found these answered questions related to my problem (link1, link2), but they couldn't help me.
Here is my code:
First, on the watch side, I create the channel using the ChannelApi and successfully send the audio captured on the watch to the smartphone.
// here are the variable values that I used
// 44100Hz is currently the only rate that is guaranteed to work on all devices,
// but other rates such as 22050, 16000, and 11025 may work on some devices.
private static final int RECORDER_SAMPLE_RATE = 44100;
private static final int RECORDER_CHANNELS = AudioFormat.CHANNEL_IN_MONO;
private static final int RECORDER_AUDIO_ENCODING = AudioFormat.ENCODING_PCM_16BIT;
int BufferElements2Rec = 1024;
int BytesPerElement = 2;

// start the process of recording audio
private void startRecording() {
    recorder = new AudioRecord(MediaRecorder.AudioSource.MIC,
            RECORDER_SAMPLE_RATE, RECORDER_CHANNELS,
            RECORDER_AUDIO_ENCODING, BufferElements2Rec * BytesPerElement);
    recorder.startRecording();
    isRecording = true;
    recordingThread = new Thread(new Runnable() {
        public void run() {
            writeAudioDataToPhone();
        }
    }, "AudioRecorder Thread");
    recordingThread.start();
}

private void writeAudioDataToPhone() {
    short sData[] = new short[BufferElements2Rec];
    ChannelApi.OpenChannelResult result = Wearable.ChannelApi.openChannel(googleClient, nodeId, "/mypath").await();
    channel = result.getChannel();
    Channel.GetOutputStreamResult getOutputStreamResult = channel.getOutputStream(googleClient).await();
    OutputStream outputStream = getOutputStreamResult.getOutputStream();
    while (isRecording) {
        // gets the voice output from microphone to byte format
        recorder.read(sData, 0, BufferElements2Rec);
        try {
            byte bData[] = short2byte(sData);
            outputStream.write(bData);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
    try {
        outputStream.close();
    } catch (IOException e) {
        e.printStackTrace();
    }
}
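For reference, the short2byte() helper used above isn't shown in the question. A little-endian implementation, consistent with what ENCODING_PCM_16BIT produces on Android, might look like this (hypothetical sketch, not the asker's original code):

```java
// Sketch of a short2byte() helper: packs 16-bit samples little-endian,
// matching Android's ENCODING_PCM_16BIT byte order.
public class PcmUtil {
    public static byte[] short2byte(short[] sData) {
        byte[] bytes = new byte[sData.length * 2];
        for (int i = 0; i < sData.length; i++) {
            bytes[i * 2] = (byte) (sData[i] & 0xFF);            // low byte first
            bytes[i * 2 + 1] = (byte) ((sData[i] >> 8) & 0xFF); // then high byte
        }
        return bytes;
    }

    public static void main(String[] args) {
        byte[] b = short2byte(new short[] { 0x1234, -1 });
        System.out.println(java.util.Arrays.toString(b));
    }
}
```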
Then, on the smartphone side, I receive the audio data from the channel and write it to a PCM file.
public void onChannelOpened(Channel channel) {
    if (channel.getPath().equals("/mypath")) {
        Channel.GetInputStreamResult getInputStreamResult = channel.getInputStream(mGoogleApiClient).await();
        inputStream = getInputStreamResult.getInputStream();
        writePCMToFile(inputStream);
        MainActivity.this.runOnUiThread(new Runnable() {
            public void run() {
                Toast.makeText(MainActivity.this, "Audio file received!", Toast.LENGTH_SHORT).show();
            }
        });
    }
}

public void writePCMToFile(InputStream inputStream) {
    OutputStream outputStream = null;
    try {
        // write the inputStream to a FileOutputStream
        outputStream = new FileOutputStream(new File("/sdcard/wearRecord.pcm"));
        int read = 0;
        byte[] bytes = new byte[1024];
        while ((read = inputStream.read(bytes)) != -1) {
            outputStream.write(bytes, 0, read);
        }
        System.out.println("Done writing PCM to file!");
    } catch (Exception e) {
        e.printStackTrace();
    } finally {
        if (inputStream != null) {
            try {
                inputStream.close();
            } catch (Exception e) {
                e.printStackTrace();
            }
        }
        if (outputStream != null) {
            try {
                // outputStream.flush();
                outputStream.close();
            } catch (Exception e) {
                e.printStackTrace();
            }
        }
    }
}
What am I doing wrong or what are your suggestions to achieve a perfect gapless audio file on the smartphone? Thanks in advance.
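A side note on playback: the file written above is headerless raw PCM, so most desktop players can't open it directly. Prepending a standard 44-byte WAV header makes it playable. buildWavHeader() below is a hypothetical helper (not part of the original code), with parameters matching the recording (44100 Hz, mono, 16-bit):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;

// Sketch: build a standard 44-byte RIFF/WAVE header for raw PCM data.
// buildWavHeader() is a hypothetical helper; write its output to the file
// before the PCM bytes to make the recording playable in normal players.
public class WavHeader {
    public static byte[] buildWavHeader(int pcmBytes, int sampleRate, int channels, int bitsPerSample) throws IOException {
        int byteRate = sampleRate * channels * bitsPerSample / 8;
        int blockAlign = channels * bitsPerSample / 8;
        ByteArrayOutputStream out = new ByteArrayOutputStream(44);
        out.write(new byte[] { 'R', 'I', 'F', 'F' });
        writeIntLE(out, 36 + pcmBytes);          // RIFF chunk size
        out.write(new byte[] { 'W', 'A', 'V', 'E' });
        out.write(new byte[] { 'f', 'm', 't', ' ' });
        writeIntLE(out, 16);                     // fmt chunk size (plain PCM)
        writeShortLE(out, 1);                    // audio format 1 = PCM
        writeShortLE(out, channels);
        writeIntLE(out, sampleRate);
        writeIntLE(out, byteRate);
        writeShortLE(out, blockAlign);
        writeShortLE(out, bitsPerSample);
        out.write(new byte[] { 'd', 'a', 't', 'a' });
        writeIntLE(out, pcmBytes);               // data chunk size
        return out.toByteArray();
    }

    private static void writeIntLE(ByteArrayOutputStream out, int v) {
        out.write(v & 0xFF); out.write((v >> 8) & 0xFF);
        out.write((v >> 16) & 0xFF); out.write((v >> 24) & 0xFF);
    }

    private static void writeShortLE(ByteArrayOutputStream out, int v) {
        out.write(v & 0xFF); out.write((v >> 8) & 0xFF);
    }

    public static void main(String[] args) throws IOException {
        byte[] header = buildWavHeader(44100 * 2, 44100, 1, 16); // 1 second of mono PCM-16
        System.out.println(header.length);
    }
}
```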
I noticed in your code that you are reading everything into a short[] array, and then converting it to a byte[] array for the Channel API to send. Your code also creates a new byte[] array through each iteration of the loop, which will create a lot of work for the garbage collector. In general, you want to avoid allocations inside loops.
I would allocate one byte[] array at the top, and let the AudioRecord class store it directly into the byte[] array (just make sure you allocate twice as many bytes as you did shorts), with code like this:
mAudioTemp = new byte[bufferSize];
int result;
while ((result = mAudioRecord.read(mAudioTemp, 0, mAudioTemp.length)) > 0) {
    try {
        mAudioStream.write(mAudioTemp, 0, result);
    } catch (IOException e) {
        Log.e(Const.TAG, "Write to audio channel failed: " + e);
    }
}
I also tested this with a 1 second audio buffer, using code like this, and it worked nicely. I'm not sure what the minimum buffer size is before it starts to have problems:
int bufferSize = Math.max(
        AudioTrack.getMinBufferSize(44100, AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT),
        44100 * 2);
I'm developing a voice chat app, streaming voice between a client and a server.
I tried DatagramSocket but had a lot of problems.
I'm trying TCP this time, but I couldn't transfer the data. What's wrong with my code?
Client:
public void startStreaming() {
    Thread streamThread = new Thread(new Runnable() {
        @Override
        public void run() {
            try {
                int minBufSize = AudioRecord.getMinBufferSize(sampleRate, channelConfig, audioFormat);
                Log.d("VS", "Socket Created.c");
                byte[] buffer = new byte[256];
                Log.d("VS", "Buffer created of size .c" + minBufSize);
                DatagramPacket packet;
                final InetAddress destination = InetAddress.getByName(target.getText().toString());
                port = Integer.parseInt(target_port.getText().toString());
                Socket socket = new Socket(destination, port);
                Log.d("VS", "Address retrieved.c");
                if (minBufSize != AudioRecord.ERROR_BAD_VALUE) {
                    recorder = new AudioRecord(MediaRecorder.AudioSource.MIC, sampleRate, channelConfig, audioFormat, minBufSize *= 10);
                    Log.d("VS", "Recorder initialized.c");
                    if (recorder.getState() == AudioRecord.STATE_INITIALIZED)
                        recorder.startRecording();
                }
                while (status == true) {
                    // reading data from MIC into buffer
                    minBufSize = recorder.read(buffer, 0, buffer.length);
                    Log.d("", "" + buffer.length);
                    // encoding to base64
                    // String buffer1 = Base64.encodeToString(buffer, Base64.DEFAULT);
                    // putting buffer in the packet
                    ObjectOutputStream out = new ObjectOutputStream(socket.getOutputStream());
                    out.writeObject(buffer);
                    // packet = new DatagramPacket(buffer, buffer.length, destination, port);
                    // socket.send(packet);
                }
            } catch (UnknownHostException e) {
                e.printStackTrace();
            } catch (IOException e) {
                Log.e("IOException message:", e.getMessage().toString());
            }
        }
    });
    streamThread.start();
}
Server:
public void startReceiving() {
    Thread receiveThread = new Thread(new Runnable() {
        @Override
        public void run() {
            try {
                int minBufSize = 4096; // recorder.getMinBufferSize(sampleRate, channelConfig, audioFormat);
                ServerSocket serversocket = new ServerSocket(50005);
                // DatagramSocket socket = new DatagramSocket(50005);
                byte[] buffer = new byte[2560];
                if (minBufSize != AudioRecord.ERROR_BAD_VALUE) {
                    speaker = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRate, channelConfig, audioFormat, minBufSize, AudioTrack.MODE_STREAM);
                    speaker.play();
                    Log.d("VR", "speaker playing...");
                }
                // minimum buffer size. need to be careful. might cause problems. try setting manually if any problems faced
                // int minBufSize = AudioRecord.getMinBufferSize(sampleRate, channelConfig, audioFormat);
                Log.d("VR", "" + status);
                InputStream is;
                ObjectInputStream ois;
                while (status == true) {
                    // DatagramPacket packet = new DatagramPacket(buffer, buffer.length);
                    socket = serversocket.accept();
                    is = socket.getInputStream();
                    ois = new ObjectInputStream(is);
                    // socket.receive(packet);
                    Log.d("VR", "Packet Received.s");
                    // reading content from packet
                    // buffer = packet.getData();
                    Log.d("VR", "Packet data read into buffer.s");
                    // sending data to the AudioTrack obj i.e. speaker
                    speaker.write(ois.toString().getBytes(), 0, minBufSize);
                    Log.d("VR", "Writing buffer content to speaker.s");
                }
            } catch (SocketException e) {
                e.printStackTrace();
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    });
    receiveThread.start();
}
I think the problem is in transferring the voice.
Please give me your best help.
Client side:
BufferedWriter input;
while (status == true) {
    int bufferReadResult = recorder.read(buffer, 0, buffer.length);
    dos.write(buffer, 0, bufferReadResult);
    dos.flush();
}
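The receiving side should mirror this with a plain InputStream read loop rather than ObjectInputStream, whose serialization framing means nothing to AudioTrack. A self-contained loopback sketch with hypothetical names; on Android, the received bytes would go to speaker.write(buffer, 0, n) instead of the ByteArrayOutputStream:

```java
import java.io.ByteArrayOutputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.net.ServerSocket;
import java.net.Socket;

// Loopback sketch: stream raw bytes over TCP with DataOutputStream on the
// sending side and a plain read() loop on the receiving side. The demo
// payload and ephemeral port are arbitrary; on Android the received bytes
// would be handed to speaker.write(buffer, 0, n).
public class RawTcpAudio {
    public static byte[] roundTrip(byte[] pcm) throws Exception {
        ByteArrayOutputStream received = new ByteArrayOutputStream();
        try (ServerSocket server = new ServerSocket(0)) {
            final int port = server.getLocalPort();
            Thread sender = new Thread(() -> {
                try (Socket s = new Socket("127.0.0.1", port);
                     DataOutputStream dos = new DataOutputStream(s.getOutputStream())) {
                    dos.write(pcm, 0, pcm.length); // send the raw PCM bytes as-is
                    dos.flush();
                } catch (IOException e) {
                    e.printStackTrace();
                }
            });
            sender.start();
            try (Socket client = server.accept();
                 InputStream is = client.getInputStream()) {
                byte[] buffer = new byte[4096];
                int n;
                while ((n = is.read(buffer)) != -1) {
                    received.write(buffer, 0, n); // on Android: speaker.write(buffer, 0, n)
                }
            }
            sender.join();
        }
        return received.toByteArray();
    }

    public static void main(String[] args) throws Exception {
        byte[] echoed = roundTrip(new byte[] { 1, 2, 3, 4 });
        System.out.println(echoed.length);
    }
}
```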
I am developing an application which enables users to make a voice call within a LAN (Wi-Fi) by streaming raw PCM audio, but I am unable to play the audio track being received.
Here is my receiver code:
private AudioTrack speaker;

public void startReceiving() {
    Thread receiveThread = new Thread(new Runnable() {
        @Override
        public void run() {
            try {
                DatagramSocket socket = new DatagramSocket(1127);
                Log.d("VR", "Socket Created");
                byte[] buffer = new byte[256];
                // minimum buffer size. need to be careful. might cause problems. try setting manually if any problems faced
                int minBufSize = AudioRecord.getMinBufferSize(sampleRate, channelConfig, audioFormat);
                speaker = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRate, channelConfig, audioFormat, minBufSize, AudioTrack.MODE_STREAM);
                speaker.play();
                while (status == true) {
                    try {
                        DatagramPacket packet = new DatagramPacket(buffer, buffer.length);
                        socket.receive(packet);
                        Log.d("VR", "Packet Received");
                        // reading content from packet
                        buffer = packet.getData();
                        Log.d("VR", "Packet data read into buffer");
                        // sending data to the AudioTrack obj i.e. speaker
                        speaker.write(buffer, 0, minBufSize);
                        Log.d("VR", "Writing buffer content to speaker");
                    } catch (IOException e) {
                        Log.e("VR", "IOException");
                    }
                }
                speaker.play();
            } catch (SocketException e) {
                Log.e("VR", "SocketException");
            }
        }
    });
    receiveThread.start();
}
and here is my sender code:
public void startStreaming() {
    Thread streamThread = new Thread(new Runnable() {
        @Override
        public void run() {
            try {
                DatagramSocket socket = new DatagramSocket();
                Log.d("VS", "Socket Created");
                byte[] buffer = new byte[minBufSize];
                Log.d("VS", "Buffer created of size " + minBufSize);
                Log.d("VS", "Address retrieved");
                recorder = new AudioRecord(MediaRecorder.AudioSource.MIC, sampleRate, channelConfig, audioFormat, minBufSize);
                Log.d("VS", "Recorder initialized");
                recorder.startRecording();
                InetAddress IPAddress = InetAddress.getByName("192.168.0.101");
                byte[] sendData = new byte[1024];
                byte[] receiveData = new byte[1024];
                while (status == true) {
                    DatagramPacket sendPacket = new DatagramPacket(sendData, sendData.length, IPAddress, 1127);
                    socket.send(sendPacket);
                }
            } catch (UnknownHostException e) {
                Log.e("VS", "UnknownHostException");
            } catch (IOException e) {
                Log.e("VS", "IOException");
                e.printStackTrace();
            }
        }
    });
    streamThread.start();
}
I can't use SIP; it's just plain audio streaming.
Can anyone help me with this, please?
Thanks.
You have two issues:
Your receiver thread doesn't seem to play anything: it is always stuck in the loop and never reaches speaker.play().
Your sending thread never sends any recorded data; you never fill sendData from the recorder. For example:
recorder.read(sendData, 0, sendData.length);
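In other words, fill the buffer from the recorder first, then send only the bytes actually read. A loopback sketch of the corrected loop (hypothetical names; an InputStream stands in for the AudioRecord here, and the ports are ephemeral):

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;

// Loopback sketch of the corrected send loop: read into the buffer from a
// source (on Android: recorder.read(sendData, 0, sendData.length)), then
// send exactly the number of bytes read, not the whole stale buffer.
public class UdpSendFix {
    public static int pump(InputStream mic, DatagramSocket out, InetAddress dest, int port) throws IOException {
        byte[] sendData = new byte[1024];
        int total = 0;
        int bytesRead;
        while ((bytesRead = mic.read(sendData, 0, sendData.length)) > 0) {
            // packet length = bytes actually read this iteration
            out.send(new DatagramPacket(sendData, bytesRead, dest, port));
            total += bytesRead;
        }
        return total;
    }

    public static void main(String[] args) throws Exception {
        try (DatagramSocket rx = new DatagramSocket(0);
             DatagramSocket tx = new DatagramSocket()) {
            InputStream fakeMic = new ByteArrayInputStream(new byte[1500]); // stand-in for AudioRecord
            int sent = pump(fakeMic, tx, InetAddress.getLoopbackAddress(), rx.getLocalPort());
            byte[] buf = new byte[2048];
            DatagramPacket p = new DatagramPacket(buf, buf.length);
            rx.receive(p);
            System.out.println(sent + " total, first packet " + p.getLength() + " bytes");
        }
    }
}
```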
I'm trying to broadcast an audio file from my Android device via UDP to the local Wi-Fi network and have clients on the network listen to it via VLC's network streaming option. I can broadcast it and receive it on any device connected to the network if I use my own receive code, but I want VLC to be able to play it. Is there any specific encoding or formatting that needs to be done before I send the datagram packet?
My sending code:
public void SendAudio() {
    Thread thrd = new Thread(new Runnable() {
        @Override
        public void run() {
            Log.e(LOG_TAG, "start send thread, thread id: "
                    + Thread.currentThread().getId());
            long file_size = 0;
            int bytes_read = 0;
            int bytes_count = 0;
            File audio = new File(AUDIO_FILE_PATH);
            FileInputStream audio_stream = null;
            file_size = audio.length();
            byte[] buf = new byte[BUF_SIZE];
            try {
                InetAddress addr = InetAddress.getByName("192.168.1.255");
                DatagramSocket sock = new DatagramSocket();
                while (true) {
                    bytes_count = 0;
                    audio_stream = new FileInputStream(audio);
                    while (bytes_count < file_size) {
                        bytes_read = audio_stream.read(buf, 0, BUF_SIZE);
                        DatagramPacket pack = new DatagramPacket(buf, bytes_read,
                                addr, AUDIO_PORT);
                        sock.send(pack);
                        bytes_count += bytes_read;
                        Log.d(LOG_TAG, "bytes_count : " + bytes_count);
                        Thread.sleep(SAMPLE_INTERVAL, 0);
                    }
                }
            } catch (InterruptedException ie) {
                Log.e(LOG_TAG, "InterruptedException");
            } catch (FileNotFoundException fnfe) {
                Log.e(LOG_TAG, "FileNotFoundException");
            } catch (SocketException se) {
                Log.e(LOG_TAG, "SocketException");
            } catch (UnknownHostException uhe) {
                Log.e(LOG_TAG, "UnknownHostException");
            } catch (IOException ioe) {
                Log.e(LOG_TAG, "IOException");
            }
        } // end run
    });
    thrd.start();
}
My receiving code:
public void RecvAudio() {
    Thread thrd = new Thread(new Runnable() {
        @Override
        public void run() {
            Log.e(LOG_TAG, "start recv thread, thread id: "
                    + Thread.currentThread().getId());
            AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC,
                    SAMPLE_RATE, AudioFormat.CHANNEL_CONFIGURATION_MONO,
                    AudioFormat.ENCODING_PCM_16BIT, BUF_SIZE,
                    AudioTrack.MODE_STREAM);
            track.play();
            try {
                DatagramSocket sock = new DatagramSocket(AUDIO_PORT);
                byte[] buf = new byte[BUF_SIZE];
                while (true) {
                    DatagramPacket pack = new DatagramPacket(buf, BUF_SIZE);
                    sock.receive(pack);
                    Log.d(LOG_TAG, "recv pack: " + pack.getLength());
                    track.write(pack.getData(), 0, pack.getLength());
                }
            } catch (SocketException se) {
                Log.e(LOG_TAG, "SocketException: " + se.toString());
            } catch (IOException ie) {
                Log.e(LOG_TAG, "IOException" + ie.toString());
            }
        } // end run
    });
    thrd.start();
}
Once again, using this I can send from one Android device and listen on another just fine with the receive code I've given, but I want VLC to play it via its Open Network Stream command (h77p://#:port) and hear the audio.
Thanks again :)
You can use the code from this tutorial:
StrictMode.ThreadPolicy policy = new StrictMode.ThreadPolicy.Builder().permitAll().build();
StrictMode.setThreadPolicy(policy);
try {
    AudioManager audio = (AudioManager) getSystemService(Context.AUDIO_SERVICE);
    audio.setMode(AudioManager.MODE_IN_COMMUNICATION);
    AudioGroup audioGroup = new AudioGroup();
    audioGroup.setMode(AudioGroup.MODE_NORMAL);
    AudioStream audioStream = new AudioStream(InetAddress.getByAddress(getLocalIPAddress()));
    audioStream.setCodec(AudioCodec.PCMU);
    audioStream.setMode(RtpStream.MODE_NORMAL);
    // set receiver (VLC player) machine IP address (please update with your machine IP)
    audioStream.associate(InetAddress.getByAddress(new byte[] { (byte) 192, (byte) 168, (byte) 1, (byte) 19 }), 22222);
    audioStream.join(audioGroup);
} catch (Exception e) {
    Log.e("----------------------", e.toString());
    e.printStackTrace();
}
}

public static byte[] getLocalIPAddress() {
    byte ip[] = null;
    try {
        for (Enumeration<NetworkInterface> en = NetworkInterface.getNetworkInterfaces(); en.hasMoreElements();) {
            NetworkInterface intf = en.nextElement();
            for (Enumeration<InetAddress> enumIpAddr = intf.getInetAddresses(); enumIpAddr.hasMoreElements();) {
                InetAddress inetAddress = enumIpAddr.nextElement();
                if (!inetAddress.isLoopbackAddress()) {
                    ip = inetAddress.getAddress();
                }
            }
        }
    } catch (SocketException ex) {
        Log.i("SocketException ", ex.toString());
    }
    return ip;
}
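For VLC to play the resulting RTP stream, it typically needs an SDP description rather than a bare URL. A minimal .sdp file matching the code above might look like this (PCMU is RTP payload type 0 at 8000 Hz; the IP address repeats the receiver address from the code and is an assumption about your setup):

```
v=0
o=- 0 0 IN IP4 192.168.1.19
s=Android PCMU Stream
c=IN IP4 192.168.1.19
t=0 0
m=audio 22222 RTP/AVP 0
a=rtpmap:0 PCMU/8000
```

Opening this file in VLC (Media > Open File) should start playback, assuming the RTP packets actually reach that address and port.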