Replace bytes in a file at specific index with specific length - android

In my app, I receive some files. At the beginning I just have the size of the file, so I create an empty file (filled with zeros). After creating this file, I receive 1024 bytes per second. Those byte chunks correspond to parts of the file.
So I need to replace the current content of the file with the bytes I'm receiving.
This means I have to read/write the file every second. For small files it's not a problem, but sometimes I'm dealing with big files (>2 MB).
I searched, but I couldn't find a way to replace part of a file at a given index without reading and rewriting the whole file every time. Is there a simple and performance-friendly solution?

After trying so many things with OutputStream, FileChannel, etc., and posting this question, I finally found the RandomAccessFile class, which solves my problem.
https://docs.oracle.com/javase/7/docs/api/java/io/RandomAccessFile.html
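For anyone landing here, a minimal sketch of the seek-and-write approach (the ChunkWriter class and writeChunkAt method name are illustrative, not from the original post):

```java
import java.io.File;
import java.io.IOException;
import java.io.RandomAccessFile;

public class ChunkWriter {
    /** Overwrites chunk.length bytes at the given offset without touching the rest of the file. */
    static void writeChunkAt(File file, long offset, byte[] chunk) throws IOException {
        try (RandomAccessFile raf = new RandomAccessFile(file, "rw")) {
            raf.seek(offset);   // jump straight to the target index
            raf.write(chunk);   // overwrite exactly chunk.length bytes in place
        }
    }
}
```

You can also pre-allocate the zero-filled file with RandomAccessFile.setLength(size) instead of writing zeros yourself.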

Related

How to add some data to .jpg or .mp4 file in android

Business purpose:
1) I want to add a large string (data) of length 1200 to a .jpg/.mp4 file on an Android mobile
2) Later the file can be uploaded to the server from the mobile
3) On the server we retrieve the added data from the file
What I have tried with a .jpg file:
I used the code below to add the data:
ExifInterface exif = new ExifInterface(photoPath);
exif.setAttribute("UserComment", "String having length of 1000");
exif.saveAttributes();
This code works. After I set the attribute, I can read it back with
String userComment = exif.getAttribute("UserComment");
On a low-end mobile it showed the error "stack corruption detected: aborted" while saving the attribute. Later I found it accepted only up to 663 characters.
On a high-end mobile, a string of up to 1999 characters was saved after saveAttributes().
Is there any other way to add a tag/metadata/string to .jpg, .mp4 and .mp3 files, so that the added data can be retrieved later?
Please share your views. Is it possible?
It sounds as if it certainly is possible using your approach, but you're running into various implementation limits on how long attribute values can be.
One solution to at least investigate is of course to split your 1200-byte string into multiple shorter strings, say four 300-byte ones, and add those as UserComment0, UserComment1 and so on. That should be trivial to extract and concatenate to get back your original longer string, and might work around the limitations.
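A hedged sketch of the splitting idea (CommentSplitter, split, and join are illustrative names; the ExifInterface calls themselves are left out since they need an Android device):

```java
public class CommentSplitter {
    /** Splits a long value into fixed-size pieces, e.g. to store under UserComment0, UserComment1, ... */
    static String[] split(String value, int pieceLength) {
        int n = (value.length() + pieceLength - 1) / pieceLength; // ceiling division
        String[] pieces = new String[n];
        for (int i = 0; i < n; i++) {
            int start = i * pieceLength;
            pieces[i] = value.substring(start, Math.min(start + pieceLength, value.length()));
        }
        return pieces;
    }

    /** Concatenates the pieces back into the original string. */
    static String join(String[] pieces) {
        StringBuilder sb = new StringBuilder();
        for (String p : pieces) sb.append(p);
        return sb.toString();
    }
}
```

On the writing side you would then call exif.setAttribute("UserComment" + i, pieces[i]) for each piece, and reverse the process on the server.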
Praveen,
take a look at Steganography project
https://github.com/johnkil/Steganography
Thanks,
Jey.

How to random read raw resource file in Android

In my Android project, I have a 2 MB raw data file. Since my application is a long-lived app, I don't want it to hold 2 MB of memory the whole time. The data file is formatted so that whenever I need some data from it, I just need to seek to some position and read a few bytes.
The Resources class can only return an InputStream for a raw file, and an InputStream cannot do random reads.
Is there a way on Android to randomly read some bytes from a raw data file, or do I have to read the entire file into memory when I only need a few bytes?
An InputStream can skip bytes with skip(); it can also mark an offset with mark() and, with reset(), go back to the marked position. All of that can be used to do random I/O.
You can store byte offsets in a separate lookup file as well.
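A minimal sketch of offset-based reading over a plain InputStream (RawReader and readAt are illustrative names). Note that skip() and read() may process fewer bytes than requested, so both have to be called in a loop:

```java
import java.io.IOException;
import java.io.InputStream;

public class RawReader {
    /** Reads `length` bytes starting at `offset` from a freshly opened stream. */
    static byte[] readAt(InputStream in, long offset, int length) throws IOException {
        long remaining = offset;
        while (remaining > 0) {
            long skipped = in.skip(remaining); // skip() may skip fewer bytes than asked
            if (skipped <= 0) throw new IOException("Unexpected end of stream while skipping");
            remaining -= skipped;
        }
        byte[] buf = new byte[length];
        int read = 0;
        while (read < length) {
            int n = in.read(buf, read, length - read); // read() may return a partial count
            if (n < 0) throw new IOException("Unexpected end of stream while reading");
            read += n;
        }
        return buf;
    }
}
```

On Android you would pass the stream from getResources().openRawResource(...), re-opening it for each read if you need to seek backwards.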
Android is built upon Java so take a look at this tutorial:
http://docs.oracle.com/javase/tutorial/essential/io/rafs.html

How to get Page/Sheet Count of Word/Excel documents?

In my project I have a requirement to show the number of pages in Word documents (.doc, .docx) and the number of sheets in Excel documents (.xls, .xlsx). I tried to read the .docx file using Docx4j, but the performance is very poor, and since I just need the count I also tried Apache POI. I am getting an error, something like:
"trouble writing output: Too many methods: 94086; max is 65536. By package:"
I want to know whether there is any paid or open-source library available for Android.
There is just no way to show the exact number of pages in an MS Word file, because it will be different for different users. The exact number depends on printer settings, paper settings, available fonts, images, etc.
Still, you can do the following for binary files:
open the file using POIFSFileSystem or NPOIFSFileSystem
extract only the FileInformationBlock, as is done in the constructor of HWPFDocumentCore
create DocumentProperties using information from the FileInformationBlock, as is done in the constructor of HWPFDocument
get the value of the cPg property of the DOP: DocumentProperties::getCPg()
The description of this field is: "A signed integer value that specifies the last calculated or estimated count of pages in the main document, depending on the values of fExactCWords and fIncludeSubdocsInStats."
For DOCX/XLSX documents you will need to access the same (I assume) property, but using SAX or StAX methods.

Write bitmap to file in chunks android

This question is related to my previous question, but you don't need to read that in order to understand it.
I am trying to split a bitmap into smaller parts and then save those smaller parts.
The issue I get is that only the first part gets saved in the file, whose size is way bigger than the full image. Below is the code I am using:
for (int i = 0; i < Image.getHeight(); i++)
{
    fout = new FileOutputStream(file, true);
    Bitmap temp = Bitmap.createBitmap(Image, 0, i, Image.getWidth(), 1);
    temp.compress(Bitmap.CompressFormat.PNG, 100, fout);
    fout.flush();
    fout.close();
}
The code is pretty simple, but I don't understand why only the first row gets written to the file.
UPDATE:
Merlin and Deepak are right. I tried giving the parts different names, and all of them were successfully written to different files. Now that you know the problem, should I go for removing the header from the second chunk and the EOF marker from the first chunk, or what?
I'm going to resist the urge to ask why on earth you are doing this, as it is highly inefficient, so let's have a look.
You are writing one line of pixels at a time, and you are writing them to the same file repeatedly with the append flag set to true, which is correct.
What you have missed is the fact that each bitmap you write is self-contained. So a program reading the first line will expect that to be the entire bitmap.
This is the equivalent of having an EOF marker in a text file. All the lines are being written, but when reading, the reader gives up after the first EOF.
You would need to research the structure of a PNG file to understand more fully what is happening.
Since you are appending compressed (.png) streams one after the other, opening the resulting file will just show the first chunk of encoded data, which is your first row. This is logical, since the encoded image header states how many bytes make up the encoded content, and decoders will not bother with the rest of the data in the file after the end marker.
I just tried copying one .png file onto the end of another; when I open the file, I see the unchanged first image!
Your logic is wrong because you cannot append each row as a PNG to one file. Each compressed row carries its own header, so a new header is appended with every write.

Delete oldest File(s) in a directory until it is under a certain file size

What I'm trying to accomplish is the following: Suppose I have a function that writes an image to a File directory (either SD or internal cache). After writing out the File, I perform a check to see if my image directory is within a certain total file size (right now, I'm using this function to recursively calculate the directory's file size). If the file that I just added makes that directory too big, then what I want to do is keep deleting older files until we are just below that max file size.
I was thinking of first sorting the files in the directory oldest-first (via a comparator, in ascending order, using this example), then converting the array into an ArrayList to get its Iterator; then, while the directory size is still above the max file size and I still have files to iterate over, I delete the older files until I break out of that while loop. Is there a more efficient way of accomplishing this?
Your bottleneck is most likely going to be the file system operations (i.e., reading the directory contents and deleting the files), not the in-memory manipulation, so you probably shouldn't worry too much about the efficiency of the latter as long as you don't do something grossly inefficient.
The rough algorithm you describe sounds fine. You can avoid the ArrayList conversion by iterating over the sorted array directly, something like:
// files sorted oldest-first, e.g. by comparing File.lastModified()
for (File f : sortedFiles) {
    if (totalSize <= maxSize) {
        break;
    }
    totalSize -= f.length();
    f.delete();
}
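Putting the pieces together, a self-contained sketch of the whole check-and-trim step (CacheTrimmer, directorySize, and trimToSize are illustrative names, and it assumes a flat directory with no subdirectories):

```java
import java.io.File;
import java.util.Arrays;
import java.util.Comparator;

public class CacheTrimmer {
    /** Sums the sizes of all regular files directly inside dir. */
    static long directorySize(File dir) {
        long total = 0;
        File[] files = dir.listFiles();
        if (files == null) return 0;
        for (File f : files) {
            if (f.isFile()) total += f.length();
        }
        return total;
    }

    /** Deletes oldest files first until the directory is at or below maxBytes. */
    static void trimToSize(File dir, long maxBytes) {
        File[] files = dir.listFiles();
        if (files == null) return;
        Arrays.sort(files, Comparator.comparingLong(File::lastModified)); // oldest first
        long total = directorySize(dir);
        for (File f : files) {
            if (total <= maxBytes) break;
            long size = f.length();
            if (f.delete()) total -= size; // only subtract if the delete succeeded
        }
    }
}
```

Calling trimToSize right after each write keeps the directory bounded without ever rescanning more than once per write.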
