Android - Compressing and Decompressing Video

I am trying to compress a video in Android before uploading it, following code I found on Stack Overflow. When I run it, a compressed file is created and its size is smaller than the original, but I cannot open it because its content is gibberish (which I suppose is expected for deflated data). To confirm that the compression actually worked, I then try to decompress the video again.
My problem is that the original file size and the decompressed file size are the SAME, yet the decompressed file does not open; the player says "Sorry, the video cannot be played".
CODE :
Compression :
public static void compressData(byte[] data) throws Exception {
    File compressedFile = new File("/storage/emulated/0/DCIM/Camera/compressed_video.mp4");
    OutputStream out = new FileOutputStream(compressedFile);
    Log.e("Original byte length: ", String.valueOf(data.length));
    Deflater d = new Deflater();
    DeflaterOutputStream dout = new DeflaterOutputStream(out, d);
    dout.write(data);
    dout.close();
    Log.i("The Compressed Byte array is ", "" + data.length);
    // dout.toString() only describes the stream object; log the file length for the compressed size
    Log.e("Compressed byte length: ", String.valueOf(compressedFile.length()));
}
Decompression :
public static void decompress() throws Exception {
    InputStream in = new FileInputStream("/storage/emulated/0/DCIM/Camera/compressed_video.mp4");
    InflaterInputStream ini = new InflaterInputStream(in);
    ByteArrayOutputStream bout = new ByteArrayOutputStream(1024);
    int b;
    while ((b = ini.read()) != -1) {
        bout.write(b);
    }
    ini.close();
    bout.close();
    String s = new String(bout.toByteArray());
    System.out.println(s);
    File decompressed_file = new File("/storage/emulated/0/DCIM/Camera/decompressed_video.mp4");
    FileOutputStream out_file = new FileOutputStream(decompressed_file);
    out_file.write(bout.toByteArray());
    out_file.close();
    Log.i("The Decompressed Byte array is ", "" + bout.toByteArray().length);
    Log.e("De-compressed byte length: ", String.valueOf(bout.toByteArray().length));
}
From the above code, the original byte length and the decompressed byte length are the same, but I am not sure why the byte array does not get written to the file correctly. I can see that both compressed_video.mp4 and decompressed_video.mp4 are created, but I can't play either. Not being able to play compressed_video.mp4 is expected, but I should be able to play decompressed_video.mp4, and it won't play. I have been stuck on this for more than two days, so any help would be greatly appreciated. Thanks in advance.
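For reference, a minimal check of whether the round trip is byte-identical (just a sketch reusing the same paths as above; it compares the original bytes with what decompress() wrote back):
public static boolean roundTripMatches(byte[] original) throws Exception {
    File decompressed = new File("/storage/emulated/0/DCIM/Camera/decompressed_video.mp4");
    byte[] restored = new byte[(int) decompressed.length()];
    DataInputStream in = new DataInputStream(new FileInputStream(decompressed));
    try {
        in.readFully(restored); // read the whole decompressed file back into memory
    } finally {
        in.close();
    }
    return Arrays.equals(original, restored); // true only if every byte survived the round trip
}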

Related

Audio file to byte array in Android not getting converted correctly

I am trying to convert an audio file to a byte array, but it does not seem to be converted correctly. I am recording sound with the mic, then converting that file to a byte array using the file's path on the device.
The desired output is the actual byte values, something like 0x12323.
But what gets logged is the string [B@14746f6.
Below is the code that converts the audio to a byte array.
file is the path of the file on the device; the file type is .amr.
FileInputStream fis = new FileInputStream(file);
ByteArrayOutputStream out = new ByteArrayOutputStream();
int read = 0;
byte[] buffer = new byte[1024];
while (read != -1) {
    read = fis.read(buffer);
    if (read != -1)
        out.write(buffer, 0, read);
}
fis.close();
out.close();
byte[] bytes = out.toByteArray();
Log.e("byte array", bytes.toString()); // toString() on a byte[] prints its identity ([B@14746f6), not its contents
Another snippet logs the actual contents with Arrays.toString (readByte here stands in for a helper that reads the whole stream into a byte[], like the loop above):
String path = ""; // audio file path
InputStream is = new FileInputStream(path);
byte[] arr = readByte(is); // helper that fully reads the stream into a byte array
Log.e("byte: ", "" + Arrays.toString(arr));
I solved this issue after talking to the API developer: I converted the byte array to a Base64 string and passed that instead, which resolved the issue.
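For reference, the fix roughly looks like this (a sketch using android.util.Base64; what request field the API expects is an assumption left out here):
byte[] audioBytes = out.toByteArray(); // bytes read from the .amr file above
String encoded = Base64.encodeToString(audioBytes, Base64.NO_WRAP); // android.util.Base64
Log.e("base64 audio", encoded.substring(0, Math.min(32, encoded.length())) + "..."); // log a short prefix only
// and on the receiving side:
byte[] decoded = Base64.decode(encoded, Base64.NO_WRAP);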

How to upload big video files to a server quickly in Android

Hi, I am uploading large video files to a server using the Volley multipart API, but the upload takes a long time.
Is it better to split my video files and send them to the server in parts? If so, please show me how to do that; if not, what is the best way to upload big video files quickly?
To split file into parts (chunks):
public static List<File> splitFile(File f) throws IOException {
int partCounter = 1;
List<File> result = new ArrayList<>();
int sizeOfFiles = 1024 * 1024;// 1MB
byte[] buffer = new byte[sizeOfFiles]; // create a buffer of bytes sized as the one chunk size
BufferedInputStream bis = new BufferedInputStream(new FileInputStream(f));
String name = f.getName();
int tmp = 0;
while ((tmp = bis.read(buffer)) > 0) {
File newFile = new File(f.getParent(), name + "." + String.format("%03d", partCounter++)); // naming files as <inputFileName>.001, <inputFileName>.002, ...
FileOutputStream out = new FileOutputStream(newFile);
out.write(buffer, 0, tmp);//tmp is chunk size. Need it for the last chunk, which could be less then 1 mb.
result.add(newFile);
}
return result;
}
This method splits your file into chunks of 1 MB (except possibly the last one). Afterwards you can send all of these chunks to the server; a rough upload sketch is given at the end of this answer.
Also, if you need to merge the files back together:
public static void mergeFiles(List<File> files, File into) throws IOException {
    BufferedOutputStream mergingStream = new BufferedOutputStream(new FileOutputStream(into));
    for (File f : files) {
        // Files.copy(Path, OutputStream) streams each part into the merged file
        Files.copy(f.toPath(), mergingStream);
    }
    mergingStream.close();
}
(Just in case your server side is also written in Java.)
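As mentioned above, here is a rough sketch of posting each chunk with HttpURLConnection (the URL and the X-Chunk-Name header are placeholders only, and on Android this must run off the main thread):
public static void uploadChunks(List<File> chunks) throws IOException {
    for (File chunk : chunks) {
        HttpURLConnection conn = (HttpURLConnection) new URL("https://example.com/upload").openConnection(); // placeholder URL
        conn.setDoOutput(true);
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "application/octet-stream");
        conn.setRequestProperty("X-Chunk-Name", chunk.getName()); // e.g. video.mp4.001; header name is up to your server
        OutputStream os = new BufferedOutputStream(conn.getOutputStream());
        FileInputStream fis = new FileInputStream(chunk);
        byte[] buf = new byte[8192];
        int n;
        while ((n = fis.read(buf)) > 0) {
            os.write(buf, 0, n); // stream the chunk so it is never fully loaded into memory
        }
        fis.close();
        os.close();
        if (conn.getResponseCode() != HttpURLConnection.HTTP_OK) {
            throw new IOException("Upload failed for " + chunk.getName());
        }
        conn.disconnect();
    }
}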

Sending a video file in an API

I need to make an API in JSON to send a video. I don't want to send the path of the video. What is the best way to send the video in JSON so that it can be used by the Android and iPhone developers? If I use Base64 or a byte[], I get an out-of-memory error.
File file = new File("video.mp4");
FileInputStream fis = new FileInputStream(file);
ByteArrayOutputStream bos = new ByteArrayOutputStream();
byte[] buf = new byte[1024];
try {
    for (int readNum; (readNum = fis.read(buf)) != -1;) {
        bos.write(buf, 0, readNum); // no doubt here is 0
        System.out.println("read " + readNum + " bytes,");
    }
} catch (IOException ex) {
    Logger.getLogger(genJpeg.class.getName()).log(Level.SEVERE, null, ex);
}
byte[] bytes = bos.toByteArray();
This is how you read a video byte by byte into a byte array. You then send the byte array as a JSON object as follows...
byte[] data; // array holding the video
String base64Encoded = DatatypeConverter.printBase64Binary(data); // the byte array encoded into a String
Now send that to the server (I am assuming you know how to).
This is how you decode it back into a byte array:
byte[] base64Decoded = DatatypeConverter.parseBase64Binary(base64Encoded);
The typical way to send binary data in JSON is to Base64-encode it. Java provides several ways to Base64-encode and decode a byte[]; DatatypeConverter is one of them.
I hope it helps.
Cheers!
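For example, the request body could be built with org.json (a sketch; the "video" field name is just an assumption, not a fixed contract):
try {
    JSONObject payload = new JSONObject();
    payload.put("video", base64Encoded); // "video" is only an example field name
    String json = payload.toString();    // send this string as the request body
} catch (JSONException e) {
    e.printStackTrace();
}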
Edit:
You are getting the OutOfMemoryError because the heap available to your app is limited relative to the size of the video: loading the whole file into a byte array and then Base64-encoding it into a String multiplies the memory required, so a large video exhausts the heap. Putting it into another in-memory object does not help; you would either have to increase the heap (for example with android:largeHeap) or, better, avoid holding the whole encoded payload in memory at once.

Android: video saved to gallery won't play

I've got a rather odd problem. I'm writing an Android application using the Xamarin framework, and I also have an iOS version of the same app also written in Xamarin. In the app the user can send photos and videos to their friends, and their friends may be on either iOS or Android. This all works fine, and videos taken on an iPhone can be played on an Android device and vice versa.
The problem I am having is that when I try to programmatically save a video to the Android gallery, that video cannot be played from the gallery. The video data itself does appear to be copied, but the video is somehow not playable.
My videos are encoded to the mp4 format using the H.264 codec. I believe this is fully supported in Android, and like I said the videos play just fine when played via a VideoView in the app.
The code I am using to copy the videos to the gallery is below. Does anyone have any idea what I am doing wrong here?
public static void SaveVideoToGallery(Activity activity, String filePath) {
    // get filename from path
    int idx = filePath.LastIndexOf("/") + 1;
    String name = filePath.Substring(idx, filePath.Length - idx);
    // set in/out files
    File inFile = new File(filePath);
    File outDir = Android.OS.Environment.GetExternalStoragePublicDirectory(Android.OS.Environment.DirectoryMovies);
    File outFile = new File(outDir, name);
    // Make sure the Pictures directory exists.
    outDir.Mkdirs();
    // save the file to disc
    InputStream iStream = new FileInputStream(inFile);
    OutputStream oStream = new FileOutputStream(outFile);
    byte[] data = new byte[iStream.Available()];
    iStream.Read();
    oStream.Write(data);
    iStream.Close();
    oStream.Close();
    // Tell the media scanner about the new file so that it is
    // immediately available to the user.
    MediaScannerConnection.ScanFile(
        activity.ApplicationContext,
        new String[] { outFile.ToString() },
        null,
        null);
}
NOTE: I know this is all in C#, but keep in mind that all the Xamarin framework does is provide an API to the native Android methods. Everything I am using is either Java or Android backed classes/functions.
Thanks!
Your issue is in this code snippet:
byte[]data = new byte[iStream.Available()];
iStream.Read();
oStream.Write(data);
There are a few issues here:
You never read the file's contents into the data buffer; iStream.Read() only reads a single byte and returns it as an integer.
new byte[iStream.Available()] only allocates as many bytes as can currently be read without blocking, which is not necessarily the whole file. See the docs on the Available method.
oStream.Write(data) therefore writes out a block of untouched (zero-filled) data, since nothing is ever read into it.
The end result is that the output video file is just a block of empty data, which is why the gallery cannot play it.
Fix it by reading the data from the input stream in a loop and writing each block into the output file:
int bytes = 0;
byte[] data = new byte[1024];
while ((bytes = iStream.Read(data)) != -1)
{
    oStream.Write(data, 0, bytes);
}
Full sample:
public static void SaveVideoToGallery(Activity activity, String filePath) {
    // get filename from path
    int idx = filePath.LastIndexOf("/") + 1;
    String name = filePath.Substring(idx, filePath.Length - idx);
    // set in/out files
    File inFile = new File(filePath);
    File outDir = Android.OS.Environment.GetExternalStoragePublicDirectory(Android.OS.Environment.DirectoryMovies);
    File outFile = new File(outDir, name);
    // Make sure the Pictures directory exists.
    outDir.Mkdirs();
    // save the file to disc
    InputStream iStream = new FileInputStream(inFile);
    OutputStream oStream = new FileOutputStream(outFile);
    int bytes = 0;
    byte[] data = new byte[1024];
    while ((bytes = iStream.Read(data)) != -1)
    {
        oStream.Write(data, 0, bytes);
    }
    iStream.Close();
    oStream.Close();
    // Tell the media scanner about the new file so that it is
    // immediately available to the user.
    MediaScannerConnection.ScanFile(
        activity.ApplicationContext,
        new String[] { outFile.ToString() },
        null,
        null);
}

Android picture has red tint after decoding and pulling the .png file from the emulator to pc

I am transferring some data from a server (a Java app) to a client (an Android app).
The data gets Base64-encoded, sent, received correctly, decoded (correctly?) and stored on the device (correctly?).
I am using Android Studio and an AVD to simulate it. I pull the pictures via DDMS from the virtual device's folder to my computer's hard disk in order to look at them. Could the problem be there?
In the following code sections the picture files get decoded and stored on the device.
I can't figure out where the mistake is.
I would be glad about any hint.
byte[] imageBackToByt = Base64.decode(parts[9], Base64.DEFAULT);
Bitmap bitmapImage = BitmapFactory.decodeByteArray(imageBackToByt, 0, imageBackToByt.length);
File mediaStorageDir = new File(Environment.getExternalStorageDirectory()
        + "/Android/data/"
        + ctx.getApplicationContext().getPackageName()
        + "/Files");
File imageFile = new File(mediaStorageDir.getPath() + File.separator + voReceived.name + ".png");
try {
    FileOutputStream fos = new FileOutputStream(imageFile);
    bitmapImage.compress(Bitmap.CompressFormat.PNG, 90, fos);
    fos.close();
} catch (FileNotFoundException e) {
    Log.d(ctx.getString(R.string.SLDMP), "File not found: " + e.getMessage());
} catch (IOException e) {
    Log.d(ctx.getString(R.string.SLDMP), "Error accessing file: " + e.getMessage());
}
This is how I encode it on the server in Java:
BufferedImage originalPicture = null;
ByteArrayOutputStream byteArrayOS = new ByteArrayOutputStream();
byte[] pictureInByte = null;
String pictureEncoded = null;
try {
    // load the original picture from the file path
    originalPicture = ImageIO.read(picturFile);
    // re-encode the picture (as "jpg") into a byte array
    ImageIO.write(originalPicture, "jpg", byteArrayOS);
    pictureInByte = byteArrayOS.toByteArray();
    // encode the byte array pictureInByte to a String using Base64
    pictureEncoded = Base64.getEncoder().encodeToString(pictureInByte);
} catch (IOException e) {
    e.printStackTrace();
    // if the picture failed to load / encode, store the string "PICTUREERROR" as an error code
    pictureEncoded = "PICTUREERROR";
}
The server puts the bytes of the image file in a buffer and sends the Base64-encoded contents of that buffer to the client. On the client side you should simply Base64-decode all of those bytes and write the resulting bytes straight to a file. That way you end up with exactly the same file: every byte is identical and the file size is equal too.
Instead, you use BitmapFactory to construct a Bitmap and then compress it to PNG again. That makes no sense here.
If you want to transfer a file, do not use BitmapFactory and Bitmap.
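Concretely, a minimal sketch of that approach (reusing the variables from the question's snippet):
// decode the Base64 string and write the raw bytes straight to disk,
// without going through BitmapFactory / Bitmap at all
byte[] imageBytes = Base64.decode(parts[9], Base64.DEFAULT);
File rawFile = new File(mediaStorageDir.getPath() + File.separator + voReceived.name + ".png");
FileOutputStream fos = new FileOutputStream(rawFile);
fos.write(imageBytes); // the saved file is byte-for-byte what the server sent
fos.close();
(Note that the server snippet above re-encodes the image with ImageIO.write(..., "jpg", ...), so the transferred bytes are actually JPEG data even if the file is saved with a .png extension.)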
Having said that... mmm, nice filter! The result is wonderful!
