I want Java code to split a video into parts of a specific size.
For example, given a 20 MB video and a part size of 5 MB, I should get 4 parts.
I have used the code below, but the .MP4-named parts it creates are not playable video files.
public static void divideFile(File f) {
    int partCounter = 1; // parts are named 001, 002, 003, ...
                         // start from 0 if you want 000, 001, 002, ...
    int sizeOfFiles = 1024 * 1024; // 1 MB per part
    byte[] buffer = new byte[sizeOfFiles];
    String fileName = f.getName();
    // try-with-resources to ensure the streams are closed
    try (FileInputStream fis = new FileInputStream(f);
         BufferedInputStream bis = new BufferedInputStream(fis)) {
        int bytesAmount = 0;
        while ((bytesAmount = bis.read(buffer)) > 0) {
            // write each chunk of data into a separate, numbered file
            String filePartName = String.format("%s.%03d", fileName, partCounter++);
            File newFile = new File(f.getParent(), filePartName);
            try (FileOutputStream out = new FileOutputStream(newFile)) {
                out.write(buffer, 0, bytesAmount);
            }
        }
    } catch (IOException e) {
        e.printStackTrace();
    }
}
I think you want each part to be a playable video itself - in this case each part needs the correct metadata headers to allow a player to handle it properly.
Breaking the video up just by bytes will mean that metadata is not present or not correct in any of the chunks.
You can use ffmpeg to do this correctly with the following command (for mp4):
ffmpeg -i videoPath -ss startTime -t duration -c copy outputVideoChunk.mp4
(Note that -i is lowercase and -t takes a duration, not an end time; use -to to specify an end time instead.)
There are several ways to use ffmpeg within an Android app but one of the easiest is to use a well supported wrapper library like:
https://github.com/WritingMinds/ffmpeg-android-java
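For splitting into many chunks in one go, ffmpeg's segment muxer can also be scripted. Below is a rough sketch of building that command from plain Java; the helper and file names are illustrative, and on Android you would invoke ffmpeg through a wrapper library rather than ProcessBuilder. Note that ffmpeg splits by duration rather than by size, so a 5 MB target has to be approximated as target size divided by the video's bitrate:

```java
import java.util.ArrayList;
import java.util.List;

public class VideoSplitter {
    // Build an ffmpeg command that splits a video into chunks of roughly
    // equal duration without re-encoding (-c copy), using the segment muxer.
    // Input path, duration, and output pattern are placeholders.
    static List<String> buildSegmentCommand(String input, int segmentSeconds, String outputPattern) {
        List<String> cmd = new ArrayList<>();
        cmd.add("ffmpeg");
        cmd.add("-i");
        cmd.add(input);
        cmd.add("-c");
        cmd.add("copy");
        cmd.add("-f");
        cmd.add("segment");
        cmd.add("-segment_time");
        cmd.add(String.valueOf(segmentSeconds));
        cmd.add("-reset_timestamps");
        cmd.add("1");
        cmd.add(outputPattern); // e.g. "part_%03d.mp4"
        return cmd;
    }

    public static void main(String[] args) throws Exception {
        List<String> cmd = buildSegmentCommand("input.mp4", 30, "part_%03d.mp4");
        // On a desktop JVM with ffmpeg installed you could run it like this:
        // new ProcessBuilder(cmd).inheritIO().start().waitFor();
        System.out.println(String.join(" ", cmd));
    }
}
```

Because `-c copy` avoids re-encoding, splits can only happen on keyframes, so chunk durations will be approximate.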
I am trying to use the /system/bin/screencapture tool in Android in my program. I want to have the screenshot in a Bitmap object. (I know about the other methods; however, my program uses a SurfaceView and I cannot change that, so none of the approaches I could find online worked.)
I have found that using the -p option to encode the capture into a PNG file takes too much time, so I want to use the output without the -p option. However, I am unable to figure out what format that output uses. I have tried reading it into a byte array and passing it to BitmapFactory.decodeByteArray(), but that doesn't seem to work (the method just returns null).
TL;DR: What format does /system/bin/screencapture use when not given the -p option (or a file name that ends with ".png")?
Here's the relevant code:
Bitmap bitmap = null;
try {
    Runtime.getRuntime().exec("/system/bin/screencap /storage/emulated/0/storage/screencap");
    try {
        Thread.sleep(45); // give screencap time to finish (Process.waitFor() would be safer)
    } catch (InterruptedException e) {
        e.printStackTrace();
    }
    Log.v("findMe", "Finished writing file");
    byte[] data = new byte[0];
    // read the captured output file, not the screencap binary itself
    FileInputStream fis = new FileInputStream("/storage/emulated/0/storage/screencap");
    while (fis.available() > 0) {
        data = append_to_byte_arr(data, (byte) fis.read());
    }
    Log.d("findMe", data.length + " is data.length");
    bitmap = BitmapFactory.decodeByteArray(data, 0, data.length);
} catch (IOException ioe) {
    Log.e("findMe", "you failed", ioe);
    return;
}
//...
private byte[] append_to_byte_arr(byte[] arr, byte item) {
    // copies the whole array on every append -- O(n) per byte, slow for large captures
    byte[] temp = new byte[arr.length + 1];
    System.arraycopy(arr, 0, temp, 0, arr.length);
    temp[arr.length] = item;
    return temp;
}
Thanks for the help.
Without the "-p" option the format depends on the filename extension: if it ends with ".png" the screenshot is automatically saved as PNG. Without any filename the screenshot is written to stdout. I tried many ways, and the best is to write to stdout and read the Bitmap pixels directly from there, because the other methods make screencap save to a file that your app then has to read back, wasting a lot of time (nearly 1.5 seconds for the two operations).
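For reference, on many Android versions the raw (non-PNG) screencap output begins with a small binary header: three little-endian 32-bit integers giving width, height, and pixel format (commonly 1, i.e. PixelFormat.RGBA_8888), followed by the raw pixel data. Newer Android builds append a fourth 32-bit field, so this layout is version-dependent; treat the sketch below as an assumption to verify on your device:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class ScreencapHeader {
    public final int width, height, pixelFormat;

    // Parse the 12-byte header that (on many Android versions) precedes
    // the raw pixel data in screencap's stdout output. Newer builds add
    // an extra 32-bit field after these three -- check your version.
    public ScreencapHeader(byte[] raw) {
        ByteBuffer buf = ByteBuffer.wrap(raw).order(ByteOrder.LITTLE_ENDIAN);
        width = buf.getInt();
        height = buf.getInt();
        pixelFormat = buf.getInt(); // 1 == PixelFormat.RGBA_8888
    }
}
```

With the header parsed, the remaining width * height * 4 bytes can be wrapped in a ByteBuffer and loaded via Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888) plus copyPixelsFromBuffer, avoiding the PNG round-trip entirely.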
I am using the Compressor third-party library to compress captured images. It works fine and the sizes show in KBs, but when I convert those images to Base64 the result becomes 6 MB or more. My code is below; can someone please help me resolve this issue?
code:
File file = new Compressor(this).compressToFile(f);
String base64File = getBase64StringFile(file);

// Converting a File to a Base64-encoded String
public static String getBase64StringFile(File f) {
    InputStream inputStream = null;
    String encodedFile = "", lastVal;
    try {
        inputStream = new FileInputStream(f.getAbsolutePath());
        byte[] buffer = new byte[10240]; // read in 10 KB chunks
        int bytesRead;
        ByteArrayOutputStream output = new ByteArrayOutputStream();
        Base64OutputStream output64 = new Base64OutputStream(output, Base64.DEFAULT);
        while ((bytesRead = inputStream.read(buffer)) != -1) {
            output64.write(buffer, 0, bytesRead);
        }
        output64.close();
        inputStream.close(); // was missing: close the input stream as well
        encodedFile = output.toString();
    } catch (IOException e) {
        e.printStackTrace();
    }
    lastVal = encodedFile;
    return lastVal;
}
You can try other compression tools such as FFmpeg to shrink the input further, but Base64 always increases your file size.
Base64 is often used for binary data that needs to be transmitted across a system that isn't really designed for binary. Depending on what you're doing, you may not need to encode it at all. And per Wikipedia, a file is expected to grow about 37% on average when you Base64-encode it, which is almost exactly what your numbers show.
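The growth is inherent to the encoding and easy to confirm with java.util.Base64 (used here as a plain-JVM stand-in for android.util.Base64):

```java
import java.util.Base64;

public class Base64Growth {
    public static void main(String[] args) {
        byte[] input = new byte[300_000]; // pretend this is a 300 KB compressed image
        String encoded = Base64.getEncoder().encodeToString(input);
        // Base64 emits 4 output chars for every 3 input bytes: ~33% larger.
        // Android's Base64.DEFAULT additionally inserts line breaks,
        // pushing the overhead toward the ~37% figure.
        System.out.println(input.length + " bytes -> " + encoded.length() + " chars");
        // prints "300000 bytes -> 400000 chars"
    }
}
```

If the consumer can accept raw bytes (e.g. a multipart upload), skipping Base64 entirely avoids the overhead.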
I've got a rather odd problem. I'm writing an Android application using the Xamarin framework, and I also have an iOS version of the same app also written in Xamarin. In the app the user can send photos and videos to their friends, and their friends may be on either iOS or Android. This all works fine, and videos taken on an iPhone can be played on an Android device and vice versa.
The problem I am having is that when I try to programmatically save a video to the Android gallery, the video cannot be played from the gallery. The video data itself does appear to be copied, but the result is somehow not playable.
My videos are encoded to the mp4 format using the H.264 codec. I believe this is fully supported in Android, and like I said the videos play just fine when played via a VideoView in the app.
The code I am using to copy the videos to the gallery is below. Does anyone have any idea what I am doing wrong here?
public static void SaveVideoToGallery(Activity activity, String filePath) {
    // get filename from path
    int idx = filePath.LastIndexOf("/") + 1;
    String name = filePath.Substring(idx, filePath.Length - idx);
    // set in/out files
    File inFile = new File(filePath);
    File outDir = Android.OS.Environment.GetExternalStoragePublicDirectory(Android.OS.Environment.DirectoryMovies);
    File outFile = new File(outDir, name);
    // Make sure the Movies directory exists.
    outDir.Mkdirs();
    // save the file to disc
    InputStream iStream = new FileInputStream(inFile);
    OutputStream oStream = new FileOutputStream(outFile);
    byte[] data = new byte[iStream.Available()];
    iStream.Read();
    oStream.Write(data);
    iStream.Close();
    oStream.Close();
    // Tell the media scanner about the new file so that it is
    // immediately available to the user.
    MediaScannerConnection.ScanFile(
        activity.ApplicationContext,
        new String[] { outFile.ToString() },
        null,
        null);
}
NOTE: I know this is all in C#, but keep in mind that all the Xamarin framework does is provide an API to the native Android methods. Everything I am using is either Java or Android backed classes/functions.
Thanks!
Your issue is in this code snippet:
byte[]data = new byte[iStream.Available()];
iStream.Read();
oStream.Write(data);
There are a few issues here:
You never read the file's contents into the data buffer; iStream.Read() reads only a single byte and returns it as an integer.
new byte[iStream.Available()] only allocates as many bytes as can currently be read without blocking, which is not necessarily the full file. See the docs on the available method.
oStream.Write(data) writes out a garbage block of data, as nothing was ever read into it.
The end result is that the output video file is just a block of empty data, which is why the gallery cannot play it.
Fix it by reading the data from the input stream in a loop and writing each chunk to the output file:
int bytes = 0;
byte[] data = new byte[1024];
while ((bytes = iStream.Read(data)) != -1)
{
    oStream.Write(data, 0, bytes);
}
Full sample:
public static void SaveVideoToGallery(Activity activity, String filePath) {
    // get filename from path
    int idx = filePath.LastIndexOf("/") + 1;
    String name = filePath.Substring(idx, filePath.Length - idx);
    // set in/out files
    File inFile = new File(filePath);
    File outDir = Android.OS.Environment.GetExternalStoragePublicDirectory(Android.OS.Environment.DirectoryMovies);
    File outFile = new File(outDir, name);
    // Make sure the Movies directory exists.
    outDir.Mkdirs();
    // save the file to disc
    InputStream iStream = new FileInputStream(inFile);
    OutputStream oStream = new FileOutputStream(outFile);
    int bytes = 0;
    byte[] data = new byte[1024];
    while ((bytes = iStream.Read(data)) != -1)
    {
        oStream.Write(data, 0, bytes);
    }
    iStream.Close();
    oStream.Close();
    // Tell the media scanner about the new file so that it is
    // immediately available to the user.
    MediaScannerConnection.ScanFile(
        activity.ApplicationContext,
        new String[] { outFile.ToString() },
        null,
        null);
}
Is it possible to play two sound (mp3) files at the same time? I have tried using two different MediaPlayer objects-
MediaPlayer mediaPlayer;
MediaPlayer mediaPlayer2;
to play the sounds, but that does not work. I cannot use SoundPool either as the sound files in use are around 10MB each (since SoundPool doesn't work well with sound files > 3MB).
Here is some code to get familiar with my situation-
@Override
public void onResume() {
    super.onResume();
    if (mediaPlayer == null) {
        mediaPlayer = MediaPlayer.create(getActivity().getApplicationContext(), R.raw.song1);
    }
    if (mediaPlayer2 == null) {
        mediaPlayer2 = MediaPlayer.create(getActivity().getApplicationContext(), R.raw.song2);
    }
}

private void startPlaying() {
    mediaPlayer.setLooping(true);
    mediaPlayer.start();
    mediaPlayer2.start();
}
Any suggestions? Is there some way to make this two-MediaPlayer approach work? If not, what other options are there? Code would be helpful!
Since playing two audio files simultaneously was proving to be an issue, I figured another way to look at the problem would be to combine the two audio files programmatically into one and play that. That turned out to be simple to implement with WAV files (MP3 and other compressed formats would need to be decompressed first). Anyway, here's how I did it:
InputStream is = getResources().openRawResource(R.raw.emokylotheme); // file 1
byte[] bytesTemp2 = fullyReadFileToBytes(new File(
        Environment.getExternalStorageDirectory().getAbsolutePath() +
        "/Kylo Ren/" + filename + "_morphed.wav")); // file 2
byte[] sample2 = convertInputStreamToByteArray(is);
byte[] temp2 = bytesTemp2.clone();
RandomAccessFile randomAccessFile2 = new RandomAccessFile(new File(
        Environment.getExternalStorageDirectory().getAbsolutePath() +
        "/Kylo Ren/" + filename + "_morphed.wav"), "rw");
// seek past the 44-byte WAV header
randomAccessFile2.seek(44);
for (int n = 0; n < bytesTemp2.length; n++) {
    bytesTemp2[n] = (byte) (temp2[n] + sample2[n]);
}
randomAccessFile2.write(bytesTemp2);
randomAccessFile2.close();
And here are the support functions:
public byte[] convertInputStreamToByteArray(InputStream inputStream) {
    byte[] bytes = null;
    try {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        byte[] data = new byte[1024];
        int count;
        while ((count = inputStream.read(data)) != -1) {
            bos.write(data, 0, count);
        }
        bos.flush();
        bos.close();
        inputStream.close();
        bytes = bos.toByteArray();
    } catch (IOException e) {
        e.printStackTrace();
    }
    return bytes;
}
byte[] fullyReadFileToBytes(File f) throws IOException {
    int size = (int) f.length();
    byte[] bytes = new byte[size];
    byte[] tmpBuff = new byte[size];
    FileInputStream fis = new FileInputStream(f);
    try {
        int read = fis.read(bytes, 0, size);
        if (read < size) {
            int remain = size - read;
            while (remain > 0) {
                read = fis.read(tmpBuff, 0, remain);
                System.arraycopy(tmpBuff, 0, bytes, size - remain, read);
                remain -= read;
            }
        }
    } finally {
        fis.close();
    }
    return bytes;
}
In essence, the code gets the bytes of the two audio files (one bundled in the app under R.raw, the other in the external storage directory), sums them, and writes the resulting bytes to another file.
The issue with this code is that it produces some background noise. It isn't a lot, but I believe the byte-wise summing (overflow at the extremes?) causes it. If someone knows how to fix this, please edit the answer and let me know.
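The noise very likely comes from summing individual bytes: 16-bit PCM samples occupy two bytes each, so byte-wise addition corrupts sample values and lets overflow wrap around instead of clipping. A minimal sketch of mixing 16-bit little-endian PCM correctly, with clamping (assuming both inputs share the same sample rate and channel count; the class name is mine):

```java
public class PcmMixer {
    // Mix two 16-bit little-endian PCM byte arrays sample by sample,
    // clamping to the short range instead of letting overflow wrap.
    static byte[] mix16BitLE(byte[] a, byte[] b) {
        int len = Math.min(a.length, b.length) & ~1; // whole samples only
        byte[] out = new byte[len];
        for (int i = 0; i < len; i += 2) {
            // decode each two-byte little-endian sample to a signed value
            int sa = (short) ((a[i] & 0xFF) | (a[i + 1] << 8));
            int sb = (short) ((b[i] & 0xFF) | (b[i + 1] << 8));
            int sum = sa + sb;
            // clamp to the 16-bit range to avoid wraparound noise
            sum = Math.max(Short.MIN_VALUE, Math.min(Short.MAX_VALUE, sum));
            out[i] = (byte) (sum & 0xFF);
            out[i + 1] = (byte) ((sum >> 8) & 0xFF);
        }
        return out;
    }
}
```

Applied to the answer above, you would mix everything after the 44-byte WAV header this way instead of adding raw bytes; attenuating each input by half before summing avoids clipping altogether.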
P.S. this was for an open source voice changer app with dramatic background noise effects called Kylo Ren Voice Changer (https://github.com/advaitsaravade/Kylo-Ren-Voice-Changer)
I have used two instances of MediaPlayer in a Service. My code is similar to yours. I had some problems but finally solved them; please check my answer here: Unable to play two MediaPlayer at same time in Nexus 5
If you still have problems, please post your full code and the logcat error.
You can also try creating two fragments that each play one sound.
I had another problem where, after playing an MP3 file, pressing the back button and starting the activity again would play the same file a second time at the same time. See Android Mediaplayer multiple instances when activity resumes play sound in the same time
I am able to get the path of the picture I want to copy and the path of where I want it copied to, but I still can't find a way to copy it.
Any suggestions?
private void copyPictureToFolder(String picturePath, String folderName)
        throws IOException {
    Log.d("debug", folderName);
    Log.d("debug", picturePath);
    try {
        FileInputStream fileInputStream = new FileInputStream(picturePath);
        FileOutputStream fileOutputStream = new FileOutputStream(folderName + "/");
        int bufferSize;
        byte[] buffer = new byte[512];
        while ((bufferSize = fileInputStream.read(buffer)) > 0) {
            fileOutputStream.write(buffer, 0, bufferSize);
        }
        fileInputStream.close();
        fileOutputStream.close();
    } catch (Exception e) {
        Log.d("disaster", "didnt work");
    }
}
thanks.
You should use Commons-IO to copy a file, we are in 2013! No one wants to do that manually. If you really want to do it by hand, then you should consider a few things:
your destination, folderName + "/", is a directory path, not a file; open the output stream on a full file path, e.g. folderName + "/" + fileName, or the FileOutputStream will fail.
your copy loop must only write as many bytes as each read actually returned, including the last partial buffer (your while loop handles this correctly).
your try/catch structure is not correct: you should add a finally clause to always close your source and destination streams. Look here for an example: what is the exact order of execution for try, catch and finally?
With IOUtils, it will give something like
try {
    IOUtils.copy(source, dest);
} finally {
    IOUtils.closeQuietly(source);
    IOUtils.closeQuietly(dest);
}
and don't catch anything, it will be forwarded to the caller.
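As an alternative to Commons-IO, java.nio.file.Files (Java 7+, available on Android from API 26) does the copy, directory creation, and cleanup in a few lines; a sketch, where the method name mirrors the question's and the destination is built as folder plus file name to avoid the directory-path pitfall:

```java
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;

public class CopyPicture {
    // Copy a file into a destination folder, keeping the original name.
    // Note the destination passed to Files.copy must include the file
    // name -- not just the folder, as in the question's FileOutputStream.
    static void copyPictureToFolder(String picturePath, String folderName) throws Exception {
        Path source = Paths.get(picturePath);
        Path target = Paths.get(folderName).resolve(source.getFileName());
        Files.createDirectories(target.getParent()); // ensure the folder exists
        Files.copy(source, target, StandardCopyOption.REPLACE_EXISTING);
    }
}
```

Files.copy propagates IOException to the caller, which matches the advice above about not swallowing exceptions.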