How to write zero bytes to a file in Android?

I am creating an app that helps the user hide secret files such as text files or photos. When the user deletes a secret file, I want to make sure the deleted file is unrecoverable. I am trying to zero-write the file (overwrite it with zero bytes) before deleting it. The problem is that it doesn't zero-write the file; it doesn't do anything. This is what I currently have:
public void zeroWriteDelete(File file) throws IOException {
    FileInputStream fileInputStream = new FileInputStream(file);
    FileOutputStream fileOutputStream1 = new FileOutputStream(file);
    byte[] buffer = new byte[4 * 1024];
    int read;
    while ((read = fileInputStream.read(buffer)) > 0) {
        Arrays.fill(buffer, (byte) 0);
        fileOutputStream1.write(buffer, 0, read);
    }
    fileOutputStream1.flush();
    fileOutputStream1.close();
    fileInputStream.close();
}
So, how do I overwrite the file with zero bytes? Or is there another way to make sure the deleted file is unrecoverable?

Your version does nothing because opening a FileOutputStream on the file truncates it to zero length, so the read loop never sees any data. You don't need to read from the file, or zero the buffer:
public void zeroWriteDelete(File file) throws IOException {
    long length = file.length();
    RandomAccessFile raf = new RandomAccessFile(file, "rw");
    byte[] buffer = new byte[4 * 1024];
    for (long i = 0; i < length; i += buffer.length) {
        raf.write(buffer, 0, (int) Math.min(buffer.length, length - i));
    }
    raf.close();
    file.delete(); // you forgot this rather vital part
}
E&OE
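If you want more assurance that the zeros actually reach storage (rather than sitting in the OS write cache) before the delete, you could sync the file descriptor first. A minimal variant sketch along the same lines, using the standard RandomAccessFile/FileDescriptor APIs; the method name is mine:
public void zeroWriteDeleteSynced(File file) throws IOException {
    long length = file.length();
    RandomAccessFile raf = new RandomAccessFile(file, "rw");
    try {
        byte[] zeros = new byte[4 * 1024]; // a freshly allocated array is already all zeros
        for (long i = 0; i < length; i += zeros.length) {
            raf.write(zeros, 0, (int) Math.min(zeros.length, length - i));
        }
        raf.getFD().sync(); // push the zeroed blocks out of the OS cache onto storage
    } finally {
        raf.close();
    }
    file.delete();
}
Even then, flash storage with wear levelling gives no hard guarantee that the old physical blocks are overwritten, so encrypting the files is the more reliable way to make deleted data unrecoverable.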

Related

How to upload big video files to a server quickly in Android

Hi, I am uploading large video files to a server using the Volley multipart API, but the upload takes a long time.
Is it better to split my video files and send them to the server? If so, please show me how I can do that; if not, what is the best way to upload big video files to a server quickly?
To split a file into parts (chunks):
public static List<File> splitFile(File f) throws IOException {
    int partCounter = 1;
    List<File> result = new ArrayList<>();
    int sizeOfFiles = 1024 * 1024; // 1 MB per chunk
    byte[] buffer = new byte[sizeOfFiles]; // buffer sized to one chunk
    BufferedInputStream bis = new BufferedInputStream(new FileInputStream(f));
    String name = f.getName();
    int tmp;
    while ((tmp = bis.read(buffer)) > 0) {
        // name the parts <inputFileName>.001, <inputFileName>.002, ...
        File newFile = new File(f.getParent(), name + "." + String.format("%03d", partCounter++));
        FileOutputStream out = new FileOutputStream(newFile);
        out.write(buffer, 0, tmp); // tmp is the number of bytes read; matters for the last chunk, which can be smaller than 1 MB
        out.close();
        result.add(newFile);
    }
    bis.close();
    return result;
}
This method splits your file into 1 MB chunks (the last chunk may be smaller). Afterwards you can send all of these chunks to the server, for example as in the sketch below.
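A minimal usage sketch; uploadPart() here is a hypothetical placeholder for whatever single-file upload you already have (a Volley multipart request, OkHttp, etc.):
// Split the recorded video and upload each part separately.
File video = new File(getFilesDir(), "recording.mp4"); // hypothetical source file
List<File> parts = splitFile(video);
for (File part : parts) {
    uploadPart(part); // hypothetical helper: sends one chunk to the server
}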
Also, if you need to merge these files back together:
public static void mergeFiles(List<File> files, File into) throws IOException {
    BufferedOutputStream mergingStream = new BufferedOutputStream(new FileOutputStream(into));
    for (File f : files) {
        Files.copy(f.toPath(), mergingStream); // java.nio.file.Files (Java 7+)
    }
    mergingStream.close();
}
Just in case your server side is also in Java.
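On the server, reassembly might look roughly like this; the directory and file names are illustrative, the only real assumption is that the parts keep the .001, .002, ... suffixes produced by splitFile:
// Collect the received parts in order, then merge them back into one file.
File dir = new File("/srv/uploads/job42"); // illustrative upload directory
List<File> parts = new ArrayList<>();
for (int i = 1; ; i++) {
    File part = new File(dir, "video.mp4." + String.format("%03d", i));
    if (!part.exists()) break; // stop at the first missing part
    parts.add(part);
}
mergeFiles(parts, new File(dir, "video.mp4")); // method from the answer above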

Reading InputStream after writing

I am trying to read an InputStream after writing the output stream to the SD card. I downloaded the file via HttpURLConnection and it is successfully written to the SD card. But when I try to read an InputStream from the same file, the contents are not read properly. On the emulator some contents are shown, but on an actual device nothing is shown. Can you please help with what the issue could be? I am posting the downloading, writing and reading code.
fileUrl = new URL(filename);
HttpURLConnection connection = (HttpURLConnection) fileUrl.openConnection();
InputStream is = connection.getInputStream();
/**
 * Create file with input stream
 */
File downloadFile = new File("/sdcard/", "myFile3.pdf");
downloadFile.createNewFile();
final FileOutputStream outputStream = new FileOutputStream(downloadFile);
int availbleLength = is.available();
byte[] bytes = new byte[availbleLength];
int len1 = 0;
while ((len1 = is.read(bytes)) > 0) {
    outputStream.write(bytes, 0, len1);
}
outputStream.flush();
outputStream.close();
File myFile = new File("/sdcard/myFile3.pdf");
InputStream inputStream = new FileInputStream(myFile);
byte[] buffer = new byte[inputStream.available()];
inputStream.read(buffer);
System.out.println("Byte Lenght: " + buffer.length);
inputStream.available() is only an estimate, not the actual length of the complete input data; for a network stream it typically reports just the bytes already buffered, so sizing your byte[] from it truncates the download. See FileInputStream.available(). Copy with a fixed-size buffer loop instead, as sketched below.
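A minimal sketch of the same download and read-back without available(); the filename variable and paths are taken from the question:
// Download with a fixed-size buffer; never size buffers from available().
HttpURLConnection connection = (HttpURLConnection) new URL(filename).openConnection();
InputStream is = connection.getInputStream();
FileOutputStream outputStream = new FileOutputStream(new File("/sdcard/", "myFile3.pdf"));
byte[] bytes = new byte[8 * 1024];
int len;
while ((len = is.read(bytes)) > 0) {
    outputStream.write(bytes, 0, len);
}
outputStream.close();
is.close();

// Read the file back the same way: loop until read() signals end of stream.
FileInputStream in = new FileInputStream("/sdcard/myFile3.pdf");
ByteArrayOutputStream contents = new ByteArrayOutputStream();
byte[] buffer = new byte[8 * 1024];
int read;
while ((read = in.read(buffer)) > 0) {
    contents.write(buffer, 0, read);
}
in.close();
System.out.println("Byte length: " + contents.size());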

Sending bmp using sockets

I'm trying to send a bmp image over a socket. I have this code on Android:
ByteArrayOutputStream stream = new ByteArrayOutputStream();
MainActivity.bmp.compress(Bitmap.CompressFormat.JPEG, 20, stream);
byte[] byteArray = stream.toByteArray();
OutputStream os = echoSocket.getOutputStream();
os.write(byteArray, 0, byteArray.length);
os.flush();
and on PC:
String q = SockIn.readLine();
File file = new File("filename.bmp");
FileWriter fw = new FileWriter(file.getAbsoluteFile());
BufferedWriter bw = new BufferedWriter(fw);
bw.write(q);
In the bmp file I only get up to 401 bytes, which of course is a corrupt bmp image. What am I doing wrong?
MODIFIED
I modified the PC side; now the code is:
InputStream in_ = clientSocket.getInputStream();
OutputStream out_ = new FileOutputStream("filename.bmp");
final byte[] buffer = new byte[1024];
int read = -1;
int i = 0;
while ((read = in_.read(buffer)) != -1) {
    out_.write(buffer, 0, read);
    System.out.println(i);
    i++;
}
in_.close();
out_.close();
System.out.println("Done");
It never gets to the last line (println("Done")). When I close the Android program, it gets to the last line and the bmp opens successfully.
Your reading logic is completely off. You only use a readLine() once and then write that to file. The data that was written to the socket on the device side was binary. That means that trying to read it as if it were textual (as readLine() does) will return meaningless junk. The reason it's usually 401 bytes long is that readLine() will look for the first newline character combination and return everything up to that as a String. This is not what you want.
What you need is a loop that will read from the socket and write into the file as long as there is data in the socket. A standard copy loop should suffice here.
InputStream in = socket.getInputStream();
OutputStream out = new FileOutputStream(...);
final byte[] buffer = new byte[BUFFER_SIZE];
int read = -1;
while ((read = in.read(buffer)) != -1)
    out.write(buffer, 0, read);
in.close();
out.close();
Note that the above code isn't tested but something to that effect should do the trick.
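One more detail: read() on the PC only returns -1 once the Android side closes or shuts down its output, which is why the modified loop in the question never reaches the final println until the app exits. A minimal way to signal completion from the device, assuming echoSocket from the question's code:
// After the existing os.write(...) and os.flush() calls:
echoSocket.shutdownOutput(); // half-close: the receiver's read() returns -1, but the socket can still read replies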
Why are you reading a String if you are sending bytes?
Try these steps one by one, moving on only if the previous one did not work.
1. read() instead of readLine(): read the same kind of data you write. If you write bytes, read bytes:
int b = SockIn.read();
2. Or encode your array as text before sending (Base64.encodeBase64String is Apache Commons Codec), as in the sketch below:
String encoded = Base64.encodeBase64String(byteArray);
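A hedged sketch of the Base64 route: android.util.Base64 is used on the device (Commons Codec would work the same way) and java.util.Base64 (Java 8+) on the PC; the trailing newline is what makes readLine() see a complete message. Note that the question's compress() call produces JPEG data, not bmp:
// Android side: encode the image bytes as one Base64 line and send it.
String line = android.util.Base64.encodeToString(byteArray, android.util.Base64.NO_WRAP);
OutputStream os = echoSocket.getOutputStream();
os.write((line + "\n").getBytes(java.nio.charset.StandardCharsets.US_ASCII));
os.flush();

// PC side: readLine() now receives the whole image as text; decode and save it.
String q = SockIn.readLine();
byte[] imageBytes = java.util.Base64.getDecoder().decode(q);
FileOutputStream out = new FileOutputStream("filename.jpg");
out.write(imageBytes);
out.close();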

Copying Xml File From Res/Xml Folder to Device Storage

I'm trying to copy an xml file from the res/xml folder to device storage, but I'm really struggling with how to do this.
I know that the starting point is to get an InputStream to read the xml file. This is achieved by using this:
InputStream is = getResources().openRawResource(R.xml.xmlfile);
Eventually the output stream will be:
file = new File("xmlfile.xml");
FileOutputStream fileOutputStream = new FileOutputStream(file);
But I'm really struggling with how to read and copy all the information from the initial xml file correctly and accurately.
So far, I've tried using various InputStream and OutputStream classes to read and write (DataInputStream, DataOutputStream, OutputStreamWriter, etc.), but I still haven't managed to get it right. There are some unknown characters (an encoding issue?) in the produced xml file. Can anyone help me with this? Thanks!
You can't do it from res/xml; XML resources under res/xml are compiled into a binary format at build time, which is why you see strange characters in the copy. Put the files in your assets folder instead, then use the code below:
Resources r = getResources();
AssetManager assetManager = r.getAssets();
File f = new File(Environment.getExternalStorageDirectory(), "dummy.xml");
try
{
    InputStream is = assetManager.open("fileinAssestFolder.xml");
    OutputStream os = new FileOutputStream(f, true);
    final int buffer_size = 1024 * 1024;
    byte[] bytes = new byte[buffer_size];
    for (;;)
    {
        int count = is.read(bytes, 0, buffer_size);
        if (count == -1)
            break;
        os.write(bytes, 0, count);
    }
    is.close();
    os.close();
}
catch (Exception ex)
{
    ex.printStackTrace();
}
I think you should use the raw folder instead. Have a look at http://developer.android.com/guide/topics/resources/providing-resources.html.
You can also use this code:
try {
    InputStream input = getResources().openRawResource(R.raw.XZY);
    OutputStream output = getApplicationContext().openFileOutput("xyz.mp3", Context.MODE_PRIVATE);
    byte[] data = new byte[1024];
    long total = 0;
    int count;
    while ((count = input.read(data)) != -1) {
        total += count;
        output.write(data, 0, count);
    }
    output.flush();
    output.close();
    input.close();
} catch (Exception e) {
    e.printStackTrace(); // don't swallow the failure silently
}
And when you need the file, use this code:
File k = getApplicationContext().getFileStreamPath("xyz.mp3");
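If you want to read the copied file back as a stream rather than as a File, a minimal sketch using the matching Context API (the name matches the one written above):
// Files written with openFileOutput() live in the app's internal storage
// and can be reopened by name with openFileInput().
InputStream in = getApplicationContext().openFileInput("xyz.mp3");
byte[] chunk = new byte[1024];
int n;
while ((n = in.read(chunk)) != -1) {
    // process the bytes, e.g. copy them elsewhere or feed them to a player/parser
}
in.close();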

Android: Faster way to read/write a ZipInputStream?

I am currently writing an application that reads a zip file in my assets folder containing a bunch of images. I am using the ZipInputStream API to read the contents and then write each file to my Environment.getExternalStorageDirectory() directory. I have everything working, but the first time the application is run, writing the images to the storage directory is INCREDIBLY slow. It takes about 5 minutes to write my images to disk. My code looks like this:
ZipEntry ze = null;
ZipInputStream zin = new ZipInputStream(getAssets().open("myFile.zip"));
String location = getExternalStorageDirectory().getAbsolutePath() + "/test/images/";
int c;
// Loop through the zip file
while ((ze = zin.getNextEntry()) != null) {
    File f = new File(location + ze.getName());
    // Doesn't exist? Create to avoid FileNotFoundException
    if (!f.exists()) {
        f.createNewFile();
    }
    FileOutputStream fout = new FileOutputStream(f);
    // Read contents and then write to file
    for (c = zin.read(); c != -1; c = zin.read()) {
        fout.write(c);
    }
    fout.close();
}
zin.close();
The process of reading the contents of a particular entry and then writing them out is VERY slow. I am assuming it has more to do with reading than writing. I've read that you can use a byte[] buffer to speed up the process, but this does not seem to work! I tried this, but it only read part of the file...
FileOutputStream fout = new FileOutputStream(f);
byte[] buffer = new byte[(int) ze.getSize()];
// Read contents and then write to file
for (c = zin.read(buffer); c != -1; c = zin.read(buffer)) {
    fout.write(c);
}
When I do that, I only get about 600-800 bytes written. Is there a way to speed this up? Have I implemented the buffer incorrectly?
I found a much better solution using the BufferedOutputStream API. My solution looks like this:
byte[] buffer = new byte[2048];
FileOutputStream fout = new FileOutputStream(f);
BufferedOutputStream bos = new BufferedOutputStream(fout, buffer.length);
int size;
while ((size = zin.read(buffer, 0, buffer.length)) != -1) {
    bos.write(buffer, 0, size);
}
// Close up shop..
bos.flush();
bos.close();
fout.flush();
fout.close();
zin.closeEntry();
I managed to cut my load time from an average of about 5 minutes down to about 5 seconds (depending on how many images are in the package). Hope this helps!
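For completeness, the buffered attempt in the question fell over for a simpler reason than speed: fout.write(c) writes the return value of read() (a byte count) as a single byte instead of writing the buffer contents. Assuming the same variables as the question, the loop only needed:
FileOutputStream fout = new FileOutputStream(f);
byte[] buffer = new byte[8 * 1024]; // a fixed size is safer than ze.getSize(), which can be -1 when unknown
int c;
while ((c = zin.read(buffer)) != -1) {
    fout.write(buffer, 0, c); // write the bytes that were read, not the count itself
}
fout.close();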
Try using Apache Commons IO (http://commons.apache.org/io/), for example:
InputStream in = new URL("http://jakarta.apache.org").openStream();
try {
    System.out.println(IOUtils.toString(in));
} finally {
    IOUtils.closeQuietly(in);
}
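For this particular zip-extraction case, the Commons IO call that replaces the hand-written loop is IOUtils.copy; a minimal sketch, assuming zin and f from the question and commons-io on the classpath:
// Copy the current zip entry straight to the output file; IOUtils.copy
// uses its own internal buffer, so no manual read/write loop is needed.
FileOutputStream fout = new FileOutputStream(f);
try {
    IOUtils.copy(zin, fout); // copies until the end of the current entry
} finally {
    IOUtils.closeQuietly(fout);
}
zin.closeEntry();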
