I have a question: I want to read bytes from a video stored on the SD card in chunks of 1024 bytes, meaning I have to read 1024 bytes from the file at a time. I am able to fetch the total number of bytes from the video, but I can't read it in chunks, and I don't know how to achieve this. Please suggest the right way to do this.
Thanks in advance.
import java.io.*;
public class FileUtil {
private final int BUFFER_SIZE = 1024;
public void readFile(String fileName) {
BufferedInputStream in = null;
try {
in = new BufferedInputStream(new FileInputStream(fileName));
} catch (FileNotFoundException e) {
e.printStackTrace();
return;
}
byte[] buffer = new byte[BUFFER_SIZE];
try {
int n = 0;
while ((n = in.read(buffer, 0, BUFFER_SIZE)) > 0) {
/* do whatever you want with buffer here */
}
}
catch(Exception e) {
e.printStackTrace();
}
finally { // always close input stream
try {
in.close();
} catch (IOException e) {
e.printStackTrace();
}
}
}
}
Based on the code from http://www.xinotes.org/notes/note/648/
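For the original question (a video on the SD card), a minimal sketch of calling this utility could look like the following; the file name here is only a hypothetical example:
// Hypothetical usage on Android: build a path to a video on external storage
// and read it in 1024-byte chunks with the FileUtil above.
File video = new File(Environment.getExternalStorageDirectory(), "myvideo.mp4");
new FileUtil().readFile(video.getAbsolutePath());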
I use JNI to develop my app, and there are two .dat files used as input files in the C++ layer. At present, I push these two files onto the device through adb before opening the app. I think there should be a better solution that avoids having to push the two files manually.
After trying several solutions, I solved it by combining three of them; the code is shown below. Before using the code, you need to create a folder called "assets" parallel to the "res" folder. This way, the input files you need are bundled into the APK, and the first time the app runs it automatically copies them to a specific path on the target device.
public class CameraPreviewActivity extends AppCompatActivity
implements CameraPermissionHelper.CameraPermissionCallback {
SharedPreferences prefs = null;
public static String TAG = "CameraPreviewActivity";
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
prefs = getSharedPreferences("com.yourcompany.yourapp", MODE_PRIVATE);
}
@Override
protected void onDestroy() {
super.onDestroy();
}
@Override
protected void onResume() {
super.onResume();
if(prefs.getBoolean("firstrun", true)){
prefs.edit().putBoolean("firstrun", false).commit();
try {
final InputStream input = getResources().getAssets().open("face_model.dat");
try {
File UPLOAD_DIR = new File("/sdcard");
File file = new File(UPLOAD_DIR, "face_model.dat");
OutputStream output = new FileOutputStream(file);
try {
try {
byte[] buffer = new byte[4 * 1024]; // or other buffer size
int read;
while ((read = input.read(buffer)) != -1) {
output.write(buffer, 0, read);
}
output.flush();
} finally {
output.close();
}
} catch (Exception e) {
e.printStackTrace(); // handle exception, define IOException and others
}
} finally {
input.close();
}
}catch (IOException e){
e.printStackTrace();
}
try {
final InputStream input = getResources().getAssets().open("shape_pred.dat");
try {
File UPLOAD_DIR = new File("/sdcard");
File file = new File(UPLOAD_DIR, "shape_pred.dat");
OutputStream output = new FileOutputStream(file);
try {
try {
byte[] buffer = new byte[4 * 1024]; // or other buffer size
int read;
while ((read = input.read(buffer)) != -1) {
output.write(buffer, 0, read);
}
output.flush();
} finally {
output.close();
}
} catch (Exception e) {
e.printStackTrace(); // handle exception, define IOException and others
}
} finally {
input.close();
}
}catch (IOException e){
e.printStackTrace();
}
}
}
}
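Since the two copy blocks above are identical apart from the file name, they can also be factored into a single helper. A sketch, assuming try-with-resources is available (API 19+ on Android); copyAssetToDir is a made-up name:
private void copyAssetToDir(String assetName, File dir) {
    // Copy one bundled asset to the given directory using a 4 KB buffer,
    // mirroring the inline blocks above.
    try (InputStream input = getAssets().open(assetName);
         OutputStream output = new FileOutputStream(new File(dir, assetName))) {
        byte[] buffer = new byte[4 * 1024];
        int read;
        while ((read = input.read(buffer)) != -1) {
            output.write(buffer, 0, read);
        }
        output.flush();
    } catch (IOException e) {
        e.printStackTrace();
    }
}
The first-run branch in onResume() then reduces to two calls: copyAssetToDir("face_model.dat", new File("/sdcard")) and copyAssetToDir("shape_pred.dat", new File("/sdcard")).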
I have a socket that receives images via one InputStream that doesn't get closed. I want to send images continuously that way. But the images arrive with a delay of one image (the first one only after I send the second one, the second one after I send the third one, and so on). What am I doing wrong?
Server
public static void readImages(InputStream stream) throws IOException {
stream = new BufferedInputStream(stream);
BufferedImage image = null;
int j = 0;
while (true) {
stream.mark(MAX_IMAGE_SIZE);
ImageInputStream imgStream = ImageIO.createImageInputStream(stream);
Iterator<ImageReader> i = ImageIO.getImageReaders(imgStream);
if (!i.hasNext()) {
System.out.println("No more image readers");
break;
}
ImageReader reader = i.next();
reader.setInput(imgStream);
image = reader.read(0);
ImageIO.write(image,"jpg",new File("current" + j + ".jpg"));
System.out.println("Save an image " + j);
if (image == null) {
System.out.println("Image is null");
break;
}
long bytesRead = imgStream.getStreamPosition();
stream.reset();
stream.skip(bytesRead);
j++;
}
}
Client
new Thread(new Runnable() {
@Override
public void run() {
try {
OutputStream outputStream = server.getOutputStream();
ByteArrayOutputStream stream = new ByteArrayOutputStream();
bitmapToSend = Bitmap.createScaledBitmap(bitmapToSend, 900, 800, true);
bitmapToSend.compress(Bitmap.CompressFormat.JPEG, 100, stream);
byte[] byteArray = stream.toByteArray();
outputStream.write(byteArray);
outputStream.flush();
} catch (IOException e) {
System.out.println("Socket not created");
e.printStackTrace();
}
}
}).start();
Note: I don't close the client's output stream, so I can keep sending pictures.
Using ImageIO.getImageReaders(imgStream) doesn't seem like a good fit for socket streams, since it probably expects all of the image data to be available at once. That may be the reason for your delay.
Secondly, for decompressing images on Android there is a simpler method, BitmapFactory.decodeStream(), sketched below.
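A sketch of that approach on an Android receiver, assuming one image per connection (decodeStream may read past the end of a single image on a buffered stream, so it is not suited to the multi-image protocol used here):
// Hedged sketch: decode a single JPEG straight from the socket on Android.
InputStream in = socket.getInputStream();
Bitmap received = BitmapFactory.decodeStream(in);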
Thirdly, since the client already produces JPEG data, the server just needs to store it. You only need to send the number of bytes before each image, and a zero after all images have been sent.
Client:
new Thread(new Runnable() {
@Override
public void run() {
try {
ByteArrayOutputStream memoryStream = new ByteArrayOutputStream();
// bitmapToSend is assumed to be the same field used in the question's code;
// redeclaring it locally would not compile (it would reference itself in its own initializer)
bitmapToSend =
    Bitmap.createScaledBitmap(bitmapToSend, 900, 800, true);
bitmapToSend.compress(
Bitmap.CompressFormat.JPEG,100, memoryStream);
byte[] byteArray = memoryStream.toByteArray();
memoryStream = null;
DataOutputStream outputStream =
new DataOutputStream(server.getOutputStream());
outputStream.writeInt(byteArray.length);
outputStream.write(byteArray);
outputStream.flush();
} catch (IOException e) {
System.out.println("Socket not created");
e.printStackTrace();
}
}
}).start();
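To match the "zero after all images have been sent" convention described above, the client could additionally write a zero length once it has finished sending. A sketch; where exactly this goes depends on how your app knows it is done:
// Hedged sketch: tell the server's readInt() loop that no more images follow.
DataOutputStream out = new DataOutputStream(server.getOutputStream());
out.writeInt(0);
out.flush();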
Server:
public static void readImages(InputStream stream) {
DataInputStream imgInput = new DataInputStream(stream);
int index = 0;
int byteLength;
try {
while ((byteLength = imgInput.readInt())>0) {
byte[] buffer = new byte[byteLength];
imgInput.readFully(buffer);
OutputStream imgOutput = new FileOutputStream("current" + (index++) + ".jpg");
imgOutput.write(buffer);
imgOutput.close();
}
} catch (IOException ex) {
// .............
} finally {
try {
imgInput.close();
} catch (IOException ex1) {
//...........
}
}
}
I want to transfer a file over a socket connection between two Android devices, using the Wi-Fi hotspot IP address and MAC address. I am able to transfer a text file smaller than 1 KB, but I am unable to send larger files or files with other extensions over the socket. Below is the code for the sender side:
Socket socket = null;
File file = new File(
Environment.getExternalStorageDirectory(),
"test.mp3");
byte[] bytes = new byte[(int) file.length()];
BufferedInputStream bis;
try {
socket = new Socket(dstAddress, dstPort);
bis = new BufferedInputStream(new FileInputStream(file));
bis.read(bytes, 0, bytes.length);
OutputStream os = socket.getOutputStream();
os.write(bytes, 0, bytes.length);
os.flush();
if (socket != null) {
socket.close();
}
final String sentMsg = "File Sent.....";
((Activity)context_con).runOnUiThread(new Runnable() {
@Override
public void run() {
Toast.makeText(context_con,
sentMsg,
Toast.LENGTH_LONG).show();
}});
}catch (ConnectException e) {
e.printStackTrace();
}
catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
} finally {
try {
if (socket != null) {
socket.close();
}
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
This is the code for the receiver end:
try {
File file = new File(
Environment.getExternalStorageDirectory(),
"test.mp3");
byte[] bytes = new byte[1024];
InputStream is = socket.getInputStream();
FileOutputStream fos = new FileOutputStream(file);
BufferedOutputStream bos = new BufferedOutputStream(fos);
int bytesRead = is.read(bytes, 0, bytes.length);
bos.write(bytes, 0, bytesRead);
bos.close();
socket.close();
final String sentMsg = "File Received...";
Main.this.runOnUiThread(new Runnable() {
@Override
public void run() {
Toast.makeText(Main.this,
sentMsg,
Toast.LENGTH_LONG).show();
}});
} catch (FileNotFoundException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} finally {
try {
socket.close();
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
I want to transfer bigger files such as an MP3, but it only creates a 1 KB file on the receiver end instead of the full 2.1 MB. Please help me find where I am wrong in this implementation.
The main problem is that the receiver calls read() only once, so at most one buffer (about 1 KB) ever gets written; both ends need to loop until the stream is exhausted. I put together the following but haven't tested it; it should give you some hints.
//Server side snippet
public void server(int port) throws IOException {
try (ServerSocket serverSocket = new ServerSocket(port); Socket socket = serverSocket.accept()) {
try (InputStream in = socket.getInputStream(); OutputStream out = new FileOutputStream("test.mp3")) {
byte[] bytes = new byte[2 * 1024];
int count;
while ((count = in.read(bytes)) > 0) {
out.write(bytes, 0, count);
}
}
}
}
//client side
public static void client(String dstAddress, int dstPort) throws IOException {
try (Socket socket = new Socket(dstAddress, dstPort)) {
File file = new File(Environment.getExternalStorageDirectory(), "test.mp3");
// Get the size of the file
long length = file.length();
if (length > 0) {
byte[] bytes = new byte[2 * 1024];
InputStream in = new FileInputStream(file);
OutputStream out = socket.getOutputStream();
int count;
while ((count = in.read(bytes)) > 0) {
out.write(bytes, 0, count);
}
out.close();
in.close();
}
}
}
You can wrap the resources in try-with-resources as I've done, or manage them manually. Consider adjusting the buffer size to suit your needs. Please try that.
Can I record sound from the microphone to a file while the YandexSpeechKit Recognizer is running?
I need simultaneous speech recognition (using the Recognizer class) and recording of sound from the device's microphone to a file. Using the standard MediaRecorder mechanism is not possible, because MediaRecorder and the YandexSpeechKit both use native methods and the same resource, which causes one of the two (MediaRecorder or Recognizer) to crash.
I'm trying to use RecognizerListener -> onSoundDataRecorded(Recognizer recognizer, byte[] bytes); the code is below:
@Override
public void onSoundDataRecorded(Recognizer recognizer, byte[] bytes) {
Logger.d(TAG, "onSoundDataRecorded");
write(bytes);
}
public void write(byte[] bytes) {
File file = getTmpFile();
FileOutputStream fos = null;
try {
fos = new FileOutputStream(file, true);
fos.write(bytes);
} catch (IOException e1) {
e1.printStackTrace();
} finally {
if(fos != null) {
try {
fos.flush();
fos.close();
} catch(IOException e) {
}
}
}
}
But the resulting file cannot be played.
Can somebody help me?
Thanks!
Yandex SpeechKit returns raw PCM data (16 kHz, mono, 16-bit). You should add a WAV header or play it as raw PCM. For example, on a Unix-like OS via sox:
play -r 16000 -b 16 -c 1 -e signed-integer filename.pcm
To add the WAV header you can use this class https://github.com/MohammadAG/Android-SoundRecorder/blob/master/src/com/mohammadag/soundrecorder/WavConverter.java with the following parameters:
private static final long SAMPLE_RATE = 16000;
private static final int RECORDER_BPP = 16;
private static final int CHANNELS = 1;
private static final long BYTE_RATE = RECORDER_BPP * SAMPLE_RATE * CHANNELS/8;
@Override
public void onRecognizerRecordingBegin() {
try {
tempFileName = getFilename();
os = new FileOutputStream(tempFileName, true);
} catch (FileNotFoundException e) {
e.printStackTrace();
}
}
@Override
public void onRecognizerRecordingDone() {
try {
os.close();
} catch (IOException e) {
e.printStackTrace();
}
int bufferSize = AudioRecord.getMinBufferSize(
16000, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
WavConverter.copyWaveFile(tempFileName, getFilename(), bufferSize);
deleteTempFile();
}
@Override
public void onRecognizerSoundDataRecorded(byte[] bytes) {
try {
os.write(bytes);
} catch (IOException e) {
e.printStackTrace();
}
}
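If you prefer not to pull in the WavConverter class, the standard 44-byte WAV header can also be built by hand. A sketch, assuming the 16 kHz / mono / 16-bit parameters above (wavHeader is a made-up helper name); write this header to the final file before appending the raw PCM bytes:
private static byte[] wavHeader(int pcmDataLength) {
    // Canonical RIFF/WAVE header for raw PCM, little-endian throughout.
    final int sampleRate = 16000;
    final int channels = 1;
    final int bitsPerSample = 16;
    final int byteRate = sampleRate * channels * bitsPerSample / 8;
    final int blockAlign = channels * bitsPerSample / 8;
    java.nio.ByteBuffer b = java.nio.ByteBuffer.allocate(44)
            .order(java.nio.ByteOrder.LITTLE_ENDIAN);
    b.put("RIFF".getBytes());            // chunk ID
    b.putInt(36 + pcmDataLength);        // total chunk size
    b.put("WAVE".getBytes());            // format
    b.put("fmt ".getBytes());            // sub-chunk 1 ID
    b.putInt(16);                        // sub-chunk 1 size (PCM)
    b.putShort((short) 1);               // audio format: 1 = PCM
    b.putShort((short) channels);
    b.putInt(sampleRate);
    b.putInt(byteRate);
    b.putShort((short) blockAlign);
    b.putShort((short) bitsPerSample);
    b.put("data".getBytes());            // sub-chunk 2 ID
    b.putInt(pcmDataLength);             // size of the PCM data that follows
    return b.array();
}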
I'm trying to build an audio recorder app for Android Wear. Right now, I'm able to capture the audio on the watch, stream it to the phone, and save it to a file. However, the audio file has gaps or cropped parts.
I found these answered questions related to my problem (link1, link2), but they couldn't help me.
Here is my code:
First, on the watch side, I create the channel using the ChannelApi and successfully send the audio captured on the watch to the smartphone.
//here are the variables values that I used
//44100Hz is currently the only rate that is guaranteed to work on all devices
//but other rates such as 22050, 16000, and 11025 may work on some devices.
private static final int RECORDER_SAMPLE_RATE = 44100;
private static final int RECORDER_CHANNELS = AudioFormat.CHANNEL_IN_MONO;
private static final int RECORDER_AUDIO_ENCODING = AudioFormat.ENCODING_PCM_16BIT;
int BufferElements2Rec = 1024;
int BytesPerElement = 2;
//start the process of recording audio
private void startRecording() {
recorder = new AudioRecord(MediaRecorder.AudioSource.MIC,
RECORDER_SAMPLE_RATE, RECORDER_CHANNELS,
RECORDER_AUDIO_ENCODING, BufferElements2Rec * BytesPerElement);
recorder.startRecording();
isRecording = true;
recordingThread = new Thread(new Runnable() {
public void run() {
writeAudioDataToPhone();
}
}, "AudioRecorder Thread");
recordingThread.start();
}
private void writeAudioDataToPhone(){
short sData[] = new short[BufferElements2Rec];
ChannelApi.OpenChannelResult result = Wearable.ChannelApi.openChannel(googleClient, nodeId, "/mypath").await();
channel = result.getChannel();
Channel.GetOutputStreamResult getOutputStreamResult = channel.getOutputStream(googleClient).await();
OutputStream outputStream = getOutputStreamResult.getOutputStream();
while (isRecording) {
// gets the voice output from microphone to byte format
recorder.read(sData, 0, BufferElements2Rec);
try {
byte bData[] = short2byte(sData);
outputStream.write(bData);
} catch (IOException e) {
e.printStackTrace();
}
}
try {
outputStream.close();
} catch (IOException e) {
e.printStackTrace();
}
}
Then, on the smartphone side, I receive the audio data from the channel and write it to a PCM file.
public void onChannelOpened(Channel channel) {
if (channel.getPath().equals("/mypath")) {
Channel.GetInputStreamResult getInputStreamResult = channel.getInputStream(mGoogleApiClient).await();
inputStream = getInputStreamResult.getInputStream();
writePCMToFile(inputStream);
MainActivity.this.runOnUiThread(new Runnable() {
public void run() {
Toast.makeText(MainActivity.this, "Audio file received!", Toast.LENGTH_SHORT).show();
}
});
}
}
public void writePCMToFile(InputStream inputStream) {
OutputStream outputStream = null;
try {
// write the inputStream to a FileOutputStream
outputStream = new FileOutputStream(new File("/sdcard/wearRecord.pcm"));
int read = 0;
byte[] bytes = new byte[1024];
while ((read = inputStream.read(bytes)) != -1) {
outputStream.write(bytes, 0, read);
}
System.out.println("Done writing PCM to file!");
} catch (Exception e) {
e.printStackTrace();
} finally {
if (inputStream != null) {
try {
inputStream.close();
} catch (Exception e) {
e.printStackTrace();
}
}
if (outputStream != null) {
try {
// outputStream.flush();
outputStream.close();
} catch (Exception e) {
e.printStackTrace();
}
}
}
}
What am I doing wrong, or what would you suggest to achieve a perfect, gapless audio file on the smartphone? Thanks in advance.
I noticed in your code that you are reading everything into a short[] array, and then converting it to a byte[] array for the Channel API to send. Your code also creates a new byte[] array through each iteration of the loop, which will create a lot of work for the garbage collector. In general, you want to avoid allocations inside loops.
I would allocate one byte[] array at the top, and let the AudioRecord class store it directly into the byte[] array (just make sure you allocate twice as many bytes as you did shorts), with code like this:
mAudioTemp = new byte[bufferSize];
int result;
while ((result = mAudioRecord.read(mAudioTemp, 0, mAudioTemp.length)) > 0) {
try {
mAudioStream.write(mAudioTemp, 0, result);
} catch (IOException e) {
Log.e(Const.TAG, "Write to audio channel failed: " + e);
}
}
I also tested this with a 1 second audio buffer, using code like this, and it worked nicely. I'm not sure what the minimum buffer size is before it starts to have problems:
int bufferSize = Math.max(
AudioTrack.getMinBufferSize(44100, AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT),
44100 * 2);
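For completeness, a sketch of wiring this together on the recording side; note that it uses AudioRecord.getMinBufferSize with CHANNEL_IN_MONO (the recording counterpart of the AudioTrack call above), and the variable names are only illustrative:
// Hedged sketch: a roughly 1-second recording buffer and a single reusable byte[].
int bufferSize = Math.max(
        AudioRecord.getMinBufferSize(44100, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT),
        44100 * 2);
AudioRecord mAudioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC,
        44100, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, bufferSize);
byte[] mAudioTemp = new byte[bufferSize];
mAudioRecord.startRecording();
// ...then read into mAudioTemp and write to the channel's OutputStream as in the loop above.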