Android: video saved to gallery won't play

I've got a rather odd problem. I'm writing an Android application using the Xamarin framework, and I also have an iOS version of the same app also written in Xamarin. In the app the user can send photos and videos to their friends, and their friends may be on either iOS or Android. This all works fine, and videos taken on an iPhone can be played on an Android device and vice versa.
The problem I am having is that when I programmatically save a video to the Android gallery, that video cannot be played in the gallery. The video data itself does appear to be copied, but the result is somehow not playable.
My videos are encoded in the MP4 container using the H.264 codec. I believe this is fully supported on Android, and as I said, the videos play just fine via a VideoView inside the app.
The code I am using to copy the videos to the gallery is below. Does anyone have any idea what I am doing wrong here?
public static void SaveVideoToGallery(Activity activity, String filePath) {
// get filename from path
int idx = filePath.LastIndexOf("/") + 1;
String name = filePath.Substring(idx, filePath.Length - idx);
// set in/out files
File inFile = new File(filePath);
File outDir = Android.OS.Environment.GetExternalStoragePublicDirectory(Android.OS.Environment.DirectoryMovies);
File outFile = new File(outDir, name);
// Make sure the Movies directory exists.
outDir.Mkdirs();
// save the file to disc
InputStream iStream = new FileInputStream(inFile);
OutputStream oStream = new FileOutputStream(outFile);
byte[]data = new byte[iStream.Available()];
iStream.Read();
oStream.Write(data);
iStream.Close();
oStream.Close();
// Tell the media scanner about the new file so that it is
// immediately available to the user.
MediaScannerConnection.ScanFile(
activity.ApplicationContext,
new String[] { outFile.ToString() },
null,
null);
}
NOTE: I know this is all in C#, but keep in mind that all the Xamarin framework does is provide an API over the native Android methods. Everything I am using is a Java- or Android-backed class/function.
Thanks!

Your issue is in this code snippet:
byte[]data = new byte[iStream.Available()];
iStream.Read();
oStream.Write(data);
There are a few issues here:
You never read the file's contents into the data buffer; iStream.Read() with no arguments only reads a single byte and returns it as an integer.
new byte[iStream.Available()] only allocates as many bytes as can currently be read without blocking, which is not necessarily the full file. See the docs on the available method.
oStream.Write(data) writes out a garbage block of data as nothing is ever read into it.
The end result is that the output video file is just a block of empty (zeroed) data, which is why the gallery cannot play it.
Fix it by reading the data from the input stream in chunks and writing those chunks to the output file:
int bytes = 0;
byte[] data = new byte[1024];
while ((bytes = iStream.Read(data)) != -1)
{
oStream.Write (data, 0, bytes);
}
Full sample:
public static void SaveVideoToGallery(Activity activity, String filePath) {
// get filename from path
int idx = filePath.LastIndexOf("/") + 1;
String name = filePath.Substring(idx, filePath.Length - idx);
// set in/out files
File inFile = new File(filePath);
File outDir = Android.OS.Environment.GetExternalStoragePublicDirectory(Android.OS.Environment.DirectoryMovies);
File outFile = new File(outDir, name);
// Make sure the Movies directory exists.
outDir.Mkdirs();
// save the file to disc
InputStream iStream = new FileInputStream(inFile);
OutputStream oStream = new FileOutputStream(outFile);
int bytes = 0;
byte[] data = new byte[1024];
while ((bytes = iStream.Read(data)) != -1)
{
oStream.Write (data, 0, bytes);
}
iStream.Close();
oStream.Close();
// Tell the media scanner about the new file so that it is
// immediately available to the user.
MediaScannerConnection.ScanFile(
activity.ApplicationContext,
new String[] { outFile.ToString() },
null,
null);
}
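For reference, since the Xamarin calls above map directly onto the Android Java classes (as the question notes), a plain-Java version of the same fixed copy-and-scan routine could look roughly like this. This is just a sketch under that assumption; the buffer size is arbitrary and error handling is omitted:
// Uses java.io streams, android.os.Environment and android.media.MediaScannerConnection.
public static void saveVideoToGallery(Activity activity, String filePath) throws IOException {
    File inFile = new File(filePath);
    File outDir = Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_MOVIES);
    outDir.mkdirs();
    File outFile = new File(outDir, inFile.getName());
    // Copy the source file in chunks, honouring the byte count returned by read().
    InputStream in = new FileInputStream(inFile);
    OutputStream out = new FileOutputStream(outFile);
    byte[] buffer = new byte[1024];
    int bytesRead;
    while ((bytesRead = in.read(buffer)) != -1) {
        out.write(buffer, 0, bytesRead);
    }
    in.close();
    out.close();
    // Tell the media scanner about the new file so the gallery sees it immediately.
    MediaScannerConnection.scanFile(activity.getApplicationContext(),
            new String[] { outFile.toString() }, null, null);
}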

Related

External File reading and writing in Android API 30, in JAVA with MediaStore API

I have a problem with the new Google policy. I'm trying to find a way to read or write a txt file without the MANAGE_EXTERNAL_STORAGE permission. The problem is:
The MediaStore API can be used to write or read files, but afterwards I can only read files that were created by the application itself. In my case, external software uploads the data to the phone with adb.exe over a USB connection, and the app then works with this data.
Because the file was not created by the app, I get a “FileNotFoundException: Permission denied” error for the sample below. What kind of flag does a file created by the application carry? Can I grant an external file this right?
if(Build.VERSION.SDK_INT>=Build.VERSION_CODES.R){
ContentValues contentValues= new ContentValues();
String datfile = null;
contentValues.put(MediaStore.Files.FileColumns.DISPLAY_NAME,"ac.dat");
contentValues.put(MediaStore.Files.FileColumns.MIME_TYPE,"");
contentValues.put(MediaStore.Files.FileColumns.RELATIVE_PATH, Environment.DIRECTORY_DOCUMENTS+dir);
Uri uri = context.getContentResolver().insert(MediaStore.Files.getContentUri("external"),contentValues);
InputStream inputStream = context.getContentResolver().openInputStream(uri);
int read = 0;
int bufferSize = 1024;
final byte[] buffers = new byte[bufferSize];
while ((read = inputStream.read(buffers)) != -1) {
datfile = new String(buffers);
}
inputStream.close();
}
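For reference, the usual MediaStore pattern for reading an existing item is to query for its ID and then open the item Uri through the ContentResolver. Below is only a sketch (the method name and the hard-coded file name are mine); note that under scoped storage this still works only for items your own app contributed, or for media files with READ_EXTERNAL_STORAGE, so it does not by itself lift the restriction described above:
// Sketch only: locate "ac.dat" in MediaStore and read it via the ContentResolver.
// Uses android.content.ContentUris, android.database.Cursor, android.provider.MediaStore,
// java.io.InputStream, java.io.ByteArrayOutputStream.
public static String readDatFile(Context context) throws IOException {
    Uri collection = MediaStore.Files.getContentUri("external");
    String[] projection = { MediaStore.Files.FileColumns._ID };
    String selection = MediaStore.Files.FileColumns.DISPLAY_NAME + "=?";
    try (Cursor c = context.getContentResolver().query(
            collection, projection, selection, new String[] { "ac.dat" }, null)) {
        if (c == null || !c.moveToFirst()) {
            return null; // item not visible to this app
        }
        long id = c.getLong(c.getColumnIndexOrThrow(MediaStore.Files.FileColumns._ID));
        Uri itemUri = ContentUris.withAppendedId(collection, id);
        try (InputStream in = context.getContentResolver().openInputStream(itemUri);
             ByteArrayOutputStream out = new ByteArrayOutputStream()) {
            byte[] buffer = new byte[1024];
            int read;
            while ((read = in.read(buffer)) != -1) {
                out.write(buffer, 0, read); // accumulate instead of overwriting each chunk
            }
            return out.toString("UTF-8");
        }
    }
}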

Unable to convert URI string to Image file back in Android

I am developing an Android app. In my app, I upload multiple images to the server using the Retrofit network library. Before uploading, I create temporary files from the bitmaps, then delete them after the upload.
photoFiles = new ArrayList<File>();
MultipartBody.Builder requestBodyBuilder = new MultipartBody.Builder().setType(MultipartBody.FORM);
int index = 0;
for(Bitmap bitmap: previewBitmaps)
{
File file = null;
try{
String fileName = String.valueOf(System.currentTimeMillis())+".jpeg";
file = new File(Environment.getExternalStorageDirectory(), fileName); // create temporary file start from here
if(file.exists())
{
file.delete();
}
OutputStream os = new BufferedOutputStream(new FileOutputStream(file));
bitmap.compress(Bitmap.CompressFormat.JPEG, 100, os);
os.close();
photoFiles.add(file);
requestBodyBuilder.addFormDataPart("files",file.getName(), RequestBody.create(MediaType.parse("image/png"),file));
}
catch (Exception e)
{
Toast.makeText(getBaseContext(),e.getMessage(),Toast.LENGTH_SHORT).show();
}
index++;
}
//Upload process goes here and delete files back after upload
With the above code, everything works fine. But the problem is that I have to create temporary files, which I do not want to do. What I want instead is to build an array list of Uri strings when I pick the files, and then, at upload time, convert them back to files and run the upload.
photoFiles = new ArrayList<File>();
MultipartBody.Builder requestBodyBuilder = new MultipartBody.Builder().setType(MultipartBody.FORM);
int index = 0;
for(Bitmap bitmap: previewBitmaps)
{
File file = null;
try{
Uri uri = Uri.parse(photosUriStrings.get(index));
file = new File(getPathFromUri(uri));
Toast.makeText(getBaseContext(),getPathFromUri(uri),Toast.LENGTH_SHORT).show();
photoFiles.add(file);
requestBodyBuilder.addFormDataPart("files",file.getName(), RequestBody.create(MediaType.parse("image/png"),file));
}
catch (Exception e)
{
Toast.makeText(getBaseContext(),e.getMessage(),Toast.LENGTH_SHORT).show();
}
index++;
}
As you can see above, I convert the URI string back to a file and then upload it. But this time Retrofit is unable to upload the file, even though the file is not null. So I am pretty sure the error is in converting the URI string back to an image file, because my old code above works fine. Why can I not do that? How can I successfully convert a URI back to an image file?
I found this:
Convert file: Uri to File in Android
and
Create File from Uri type android
but neither is working.
I am not clear about your question, but I think this may help you. This single line of code will load the image from the URI and display it in your view:
Picasso.with(getContext()).load("URI path").into(holder.imgID);
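If an actual File is needed for the Retrofit upload rather than just displaying the image, another common approach is to copy the content resolver's stream into a cache file instead of trying to resolve a filesystem path from the Uri. A rough sketch, assuming a Context and a content Uri (the method name and file naming here are mine):
// Sketch: materialize the Uri's contents as a temporary file in the app cache directory.
// Uses android.content.Context, android.net.Uri and java.io streams.
public static File copyUriToCacheFile(Context context, Uri uri) throws IOException {
    File outFile = new File(context.getCacheDir(), System.currentTimeMillis() + ".jpeg");
    try (InputStream in = context.getContentResolver().openInputStream(uri);
         OutputStream out = new FileOutputStream(outFile)) {
        byte[] buffer = new byte[8192];
        int read;
        while ((read = in.read(buffer)) != -1) {
            out.write(buffer, 0, read);
        }
    }
    return outFile; // safe to hand to RequestBody.create(...) like the temporary files above
}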

Android - Compressing and Decompressing Video

I am trying to compress a video on Android before uploading it. I am following code from Stack Overflow; when I try it, the file size is reduced and a compressed file is created as expected, but I cannot open that file because its content is gibberish. So, to confirm that the compression actually worked, I then try to decompress the video again.
My problem is that the original file size and the decompressed file size are the SAME, but the file does not open and the player says "Sorry, the video cannot be played".
Code:
Compression:
public static void compressData(byte[] data) throws Exception {
OutputStream out = new FileOutputStream(new
File("/storage/emulated/0/DCIM/Camera/compressed_video.mp4"));
Log.e("Original byte length: ", String.valueOf(data.length));
Deflater d = new Deflater();
DeflaterOutputStream dout = new DeflaterOutputStream(out, d);
dout.write(data);
dout.close();
Log.i("The Compressed Byte array is ", ""+data.length);
Log.e("Compressed byte length: ",
String.valueOf(dout.toString().getBytes().length));
}
Decompression:
public static void decompress() throws Exception {
InputStream in = new FileInputStream("/storage/emulated/0/DCIM/Camera/compressed_video.mp4");
InflaterInputStream ini = new InflaterInputStream(in);
ByteArrayOutputStream bout = new ByteArrayOutputStream(1024);
int b;
while ((b = ini.read()) != -1) {
bout.write(b);
}
ini.close();
bout.close();
String s = new String(bout.toByteArray());
System.out.println(s);
File decompressed_file = new File("/storage/emulated/0/DCIM/Camera/decompressed_video.mp4");
FileOutputStream out_file = new FileOutputStream(decompressed_file);
out_file.write(bout.toByteArray());
out_file.close();
Log.i("The Decompressed Byte array is ", ""+bout.toByteArray().length);
Log.e("De-compressed byte length: ",
String.valueOf(bout.toByteArray().length));
}
With the above code, the original byte length and the decompressed byte length are the same, but I am not sure why the byte array does not get written to the file correctly. I can see that both compressed_video and decompressed_video are created, but I can't play either. Not being able to play compressed_video.mp4 is expected, but I should be able to play decompressed_video.mp4, which also refuses to play. I have been sitting on this for more than 2 days, so any help would be insanely appreciated. Thanks in advance, guys.
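For what it's worth, a stream-based version of the same round trip (a sketch using java.util.zip, with hypothetical path parameters instead of the hard-coded paths above) avoids loading the whole video into memory and makes it easier to confirm that the decompressed bytes really match the original:
// Sketch: compress a file with DeflaterOutputStream, then restore it with InflaterInputStream,
// copying in chunks rather than holding the full video in a byte array.
public static void roundTrip(String inPath, String compressedPath, String restoredPath) throws Exception {
    byte[] buffer = new byte[8192];
    int read;
    // Compress inPath -> compressedPath
    try (InputStream in = new FileInputStream(inPath);
         DeflaterOutputStream deflated = new DeflaterOutputStream(new FileOutputStream(compressedPath))) {
        while ((read = in.read(buffer)) != -1) {
            deflated.write(buffer, 0, read);
        }
    }
    // Decompress compressedPath -> restoredPath
    try (InflaterInputStream inflated = new InflaterInputStream(new FileInputStream(compressedPath));
         OutputStream out = new FileOutputStream(restoredPath)) {
        while ((read = inflated.read(buffer)) != -1) {
            out.write(buffer, 0, read);
        }
    }
    // If the restored file still will not play, compare its length and checksum
    // against the original to rule out the compression step itself.
}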

How can I properly send a Synced External Application Image to a RESTful WCF

I'm a newbie to Android, and I'm trying to send some images from Android to a RESTful WCF service.
So far I am able to select the images from the Gallery and send them to the server.
The WCF service expects the image as a Stream.
But I'm having problems with the synced images that get stored on the tablet, like the Facebook or G+ photos. (I don't know if they are cached or something.)
I'm using this function to get the path of the image:
public static String getRealPathFromURI(Context context, Uri contentUri) {
String path = null;
if (contentUri.getScheme().toString().compareTo("content")==0)
{
String[] proj = { MediaStore.Images.Media.DATA };
Cursor cursor = context.getContentResolver().query(contentUri, proj, null, null, null);
int column_index = cursor.getColumnIndexOrThrow(MediaStore.Images.Media.DATA);
cursor.moveToFirst();
path = cursor.getString(column_index);
}
else
{
path = contentUri.getPath();
}
Log.i(TAG, path);
return path;
}
With that kind of image I get an Internet path like:
https://fbcdn-sphotos-g-a.akamaihd.net/hphotos-ak-ash4/s2048x2048/432098_10151223392360790_398885469_n.jpg
Just to clarify: I get a "content" scheme, so the path comes from the "if" branch and looks something like:
content://com.sec.android.gallery3d.provider/picasa/item/5703464571893262194
To send it to the server I'm using MultipartEntity, because I saw other posts here on SO recommending it, like this:
((MultipartEntity) oInputEntity).addPart(
"fileContents",
new FileBody(new File(Utilities.getRealPathFromURI(context,
imageUri)),
"image/jpeg"));
With that kind of image I was getting a FileNotFoundException. I think it's because the image path is an Internet path, so the MultipartEntity doesn't know how to retrieve it.
So I changed my method to download the image, and now it works with this code:
public static File getFileFromURI(Context context, Uri contentUri) {
String path = IntUtilities.getRealPathFromURI(context, contentUri);
Log.i(TAG, path);
final File file;
if (path.startsWith("http") || path.startsWith("/http") )
{
// If it's an image from the Internet, download it, save it in our directory and return that file.
// Build the local file to save the downloaded image into.
final String fname = "BIR" + UUID.randomUUID() + ".jpg";
final File root = new File(Environment.getExternalStorageDirectory().getAbsolutePath() + File.separator + "BIR");
root.mkdirs();
file = new File(root, fname);
try {
final URL url = new URL(path);
final HttpURLConnection urlConnection = (HttpURLConnection) url
.openConnection();
urlConnection.setRequestMethod("GET");
urlConnection.setDoOutput(false);
urlConnection.connect();
final FileOutputStream fileOutput = new FileOutputStream(file);
final InputStream inputStream = urlConnection.getInputStream();
@SuppressWarnings("unused")
int downloadedSize = 0;
byte[] buffer = new byte[1024];
int bufferLength = 0;
while ((bufferLength = inputStream.read(buffer)) > 0) {
fileOutput.write(buffer, 0, bufferLength);
downloadedSize += bufferLength;
}
// close the output stream when done
fileOutput.close();
} catch (MalformedURLException e1) {
// TODO Auto-generated catch block
e1.printStackTrace();
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
else
{
file = new File(path);
}
return file;
}
((MultipartEntity) oInputEntity).addPart(
"fileContents",
new FileBody(Utilities.getFileFromURI(context,
imageUri),
"image/jpeg"));
But I'm not comfortable with this solution; it seems like double effort. I turned off Wi-Fi and 3G on the tablet, and even turned the tablet off and on again, and I still see those images, so I'm guessing they were copied locally or cached on the tablet when they were synced for the first time. I looked for them with the tablet attached to my computer (in Windows Explorer) to see if they were there, but I don't see them; maybe I'm doing something wrong or don't know the storage folder.
The main reason I don't like this solution is that if you don't have Internet access at the moment, the image obviously will not be downloaded, and the app I'm making is supposed to work offline. And, well, the image is there; there shouldn't be a round trip to the Internet to fetch a local image.
That being said, is there a way to find the real/physical path of these synced photos that resolve to an http or https URL, so that I can send them using the MultipartEntity?
Or is there another proper way to send these images to the server?
I really appreciate your help.
From the chooser dialog you can always get a bitmap:
chooser --> $Result->getData() = imageUri
From the imageUri, get a Bitmap by running the following code:
Bitmap bitmap = MediaStore.Images.Media.getBitmap(this.getContentResolver(), imageUri);
Once you have the bitmap, you can write it to a file sink and use a map to cache it in memory:
// mCR is a ContentResolver, options a BitmapFactory.Options, f the destination File, minVal the JPEG quality
mBitMap = BitmapFactory.decodeStream(
        mCR.openInputStream(imageUri), null, options);
OutputStream os = new FileOutputStream(f);
mBitMap.compress(Bitmap.CompressFormat.JPEG, minVal, os);
mload.memoryCache.putbig(f.toURI().toURL().toString(), mBitMap);
And you can HTTP POST the Bitmap directly by loading its byte array into the entity:
case POST:
HttpPost httpPost = new HttpPost(url);
ByteArrayOutputStream stream = new ByteArrayOutputStream();
float tmp = (float) 1024 * 1024 / bmp.getByteCount();
int minVal = (Math.round(tmp * 100) < 101) ? Math.round(tmp * 100): 100;
if (bmp.compress(Bitmap.CompressFormat.JPEG, minVal, stream)){
httpPost.setEntity(new ByteArrayEntity(stream.toByteArray()));
}else{ //TODO need to format actual message
handler.sendMessage(Message.obtain(handler,
HttpConnection.DID_ERROR, new Exception("ERR bitmap NG")));
}
Background on the POST method is here; the lazyloader project is a good template to use.
So why use MultipartEntity with a file when you can operate directly on the bytes of the bitmap? As soon as you have a Uri, get a Bitmap and use the bitmap/Uri pair as the basis of your interface to the memory cache, to HTTP POST, and to the file sink used to retrieve the local file on a cache miss. This helps minimize doing everything on the basis of a file.
Think about using a map keyed on Uri.toString() with the Bitmap as the value to store the images that you process locally. Then when you want to POST an image, you can just retrieve it from the map and POST its bytes directly in a ByteArrayEntity.
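A sketch of the Uri-keyed in-memory cache being described: the answer suggests a HashMap, but this sketch substitutes android.util.LruCache so cached bitmaps are evicted by size rather than growing without bound (the class and method names here are mine):
// Sketch: cache decoded bitmaps keyed by Uri.toString(); fall back to the file sink on a miss.
public class BitmapMemCache {
    private final LruCache<String, Bitmap> cache =
            new LruCache<String, Bitmap>(4 * 1024 * 1024) { // roughly 4 MB of pixel data
                @Override
                protected int sizeOf(String key, Bitmap value) {
                    return value.getByteCount();
                }
            };

    public void put(Uri uri, Bitmap bitmap) {
        cache.put(uri.toString(), bitmap);
    }

    public Bitmap get(Uri uri) {
        return cache.get(uri.toString()); // null means a cache miss
    }
}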

How can I play video from bytes in Android

I have video in my project, and for security I encrypt the video files, which is working quite well.
The problem is that with
videoView.setVideoPath("/mnt/sdcard/intro_video.3gp");
I have to pass a (decrypted) file to this method,
so I am creating the decrypted file on the SD card just to have a path for it. Is it possible to pass the decrypted bytes directly to the VideoView? I am using Cipher for the encryption.
Here is my code for decryption:
private void decryption()throws Exception {
// TODO Auto-generated method stub
String filePath2 = path + "en/encVideo";
String filePath3 = path + "de/decVideo";
File decfile = new File(filePath3);
if(!decfile.exists())
decfile.createNewFile();
File outfile = new File(filePath2);
int read;
FileInputStream encfis = new FileInputStream(outfile);
Cipher decipher = Cipher.getInstance("AES");
decipher.init(Cipher.DECRYPT_MODE, skey);
FileOutputStream decfos = new FileOutputStream(decfile);
CipherOutputStream cos = new CipherOutputStream(decfos,decipher);
while((read=encfis.read()) != -1)
{
cos.write(read);
cos.flush();
}
cos.close();
}
If streaming the video to a VideoView without an intermediate file to store the decrypted version is what you are looking for, then the answer is yes, you can do it. You need two main components: a streaming server, such as a local HTTP instance, and a CipherInputStream.
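A minimal sketch of the CipherInputStream half of that approach, assuming the same AES key (skey) used in the encryption code above; the resulting stream would then be served to the VideoView by the local HTTP server instead of being written out to a decrypted file:
// Sketch: decrypt on the fly while reading, instead of writing a decrypted copy to the SD card.
// Uses javax.crypto.Cipher, javax.crypto.CipherInputStream and java.io.FileInputStream.
Cipher decipher = Cipher.getInstance("AES");
decipher.init(Cipher.DECRYPT_MODE, skey); // skey: the SecretKey used for encryption
InputStream decrypted = new CipherInputStream(new FileInputStream(path + "en/encVideo"), decipher);
// 'decrypted' now yields plain video bytes as it is read; a small local HTTP server can
// stream those bytes and the VideoView can be pointed at a http://127.0.0.1 URL it serves.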
I doubt you can do it. Since you are using VideoView, it would require specific headers and trailers indicating the format and how it is encoded, etc. Even if you can figure that out, I still doubt it can take a raw file. Your best bet would be to create random file names while saving and pass those to the player.
