I have an Android application. I want to apply more effects to video using Photoshop .acv curve files and preview the result, but I am having problems with the preview. Does anyone have a solution for applying *.acv filters to video? Thanks.
I used this library, and the following is the command:
String[] complexCommand = {"ffmpeg", "-y", "-i", "file path of input video", "-strict", "experimental", "-vf", "curves=psfile=filter acv file path", "-b", "2097k", "-vcodec", "mpeg4", "-ab", "48000", "-ac", "2", "-ar", "22050", "file path of the output video"};
For example:
String[] complexCommand = {"ffmpeg", "-y", "-i", "/storage/emulated/0/vk2/in.mp4", "-strict", "experimental", "-vf", "curves=psfile=/storage/emulated/0/videokit/sepia.acv", "-b", "2097k", "-vcodec", "mpeg4", "-ab", "48000", "-ac", "2", "-ar", "22050", "/storage/emulated/0/videokit/out.mp4"};
The given link is a forum where you may ask your questions regarding FFmpeg.
My project includes two modules: one main module with all the XML, Kotlin files, and images, and another with all the video files. I had to do this because my app has about 20 videos and exceeds the 150 MB cap placed on app bundles. Anyway, I am trying to play a video through a VideoView, and it worked before by just using the file path. Now I'm not sure how to access the video in the other module, and when I try to get the file path it can't be found. I've already set up the dependencies, so I don't know what the issue is.
Here is an example of what I had before I moved the videos into the new module, and this worked:
val intent = Intent(requireActivity(), WatchActivity::class.java)
intent.putExtra("filePath", "android.resource://${requireActivity().packageName}/${R.raw.addeventvideo}")
requireActivity().startActivity(intent)
If you're wondering, that code is inside an onClickListener for a button, so it opens a new activity with a VideoView that plays the video at the file path I pass into the intent.
Thanks for any help with this.
I have searched for a few hours now and can't seem to find an answer to my question(s). I have written the following lines of code using the Android NDK (C++), and I am using the required OpenCV libraries to accomplish the task.
void opening_images() {
    Mat image;
    sillyString = "I have changed";
    String imagePath = "//drawable//ring.png";
    image = imread(imagePath, IMREAD_COLOR);
    if (image.empty()) {
        sillyString = "Image not loaded";
    } else {
        sillyString = "Image loaded";
    }
}
I have tested this code in Qt with OpenCV and it works fine. At the moment the program in Android Studio always returns the "Image not loaded" string. I think the main problem is that I don't completely understand how to work with the file paths. In Android Studio I have included a picture under res/drawable/ring.png, and I am able to view this image from the Java side of the app.
Question 1: Is the specified imagePath = "//drawable//ring.png" correct for accessing the ring.png file?
Question 2: Are any permissions needed to allow the NDK to access the res folders?
Question 3: Are there any similar methods for assigning an image to a Mat object?
Any help will be appreciated.
Edit:
If you take a look at how BitmapFactory.decodeResource works, you will see that getting a bitmap from a drawable still requires unpacking a compressed image.
So, the answer to your Question 1: no, that is not the correct way to access ring.png. You will either have to copy the resource onto the device's storage, or unpack it into a byte array and use imdecode instead of imread.
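A minimal sketch of that second route (hedged; it assumes a valid Context, an initialized OpenCV Android Java binding, and that R.drawable.ring matches the resource from the question):
// Sketch only: unpack res/drawable/ring.png into a byte array and decode it
// with OpenCV instead of imread(). Assumes org.opencv.core.* and
// org.opencv.imgcodecs.Imgcodecs are available and initialized.
Mat decodeDrawable(Context context) throws IOException {
    InputStream in = context.getResources().openRawResource(R.drawable.ring);
    ByteArrayOutputStream bytes = new ByteArrayOutputStream();
    byte[] chunk = new byte[4096];
    int n;
    while ((n = in.read(chunk)) != -1) {
        bytes.write(chunk, 0, n);
    }
    in.close();
    // imdecode() reads the compressed PNG bytes directly from memory.
    MatOfByte encoded = new MatOfByte(bytes.toByteArray());
    return Imgcodecs.imdecode(encoded, Imgcodecs.IMREAD_COLOR);
}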
Since this was my first time using Android Studio, there was much to learn. It took me a while, but here are the answers to the questions I posted.
Question 1: Is the specified imagePath = "//drawable//ring.png" correct for accessing the ring.png file?
This is most definitely not the correct path to use when accessing images for image processing purposes. The drawable folder can still be used to update, for example, an ImageView by setting the ImageView's src to the image:
imageView.setImageResource(R.drawable.ring);
When working with images and Mat objects, I found it best to use the Android Debug Bridge to copy the files to the SD card of the device. This link provides the necessary steps to install adb: https://www.howtogeek.com/125769/how-to-install-and-use-abd-the-android-debug-bridge-utility/
When the images have finished copying to the SD card, the file path can be found using the built-in Java method Environment.getExternalStorageDirectory(). The path looks something like /storage/emulated/0/YOUR_file, depending on the location the files were copied to.
Useful tip: Download ES File Explorer for the device to help navigate through the external or internal storage.
Question 2: Are any permissions needed to allow the NDK to access the res folders?
The method I used didn't need any permissions. However, the NDK side cannot directly read an image from the SD card at the moment; the image must be passed from the Java side, either by making use of assets or by passing the address of the Mat object into which the image was read (on the Java side).
Read and write permission is needed in order to access the SD card. This must be declared in the AndroidManifest.xml and correctly handled in code; there are many great tutorials on YouTube.
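For the "pass the Mat's address from the Java side" route mentioned above, here is a minimal sketch of the Java half (the library name "native-lib" and the method name nativeProcess are hypothetical, not from the original answer):
// Sketch only: read the image with the Java bindings, then hand the
// underlying cv::Mat pointer to native code.
public class NativeBridge {
    static {
        System.loadLibrary("native-lib"); // hypothetical .so name
    }

    // Implemented in C++; receives the address of the cv::Mat created in Java.
    private static native void nativeProcess(long matAddr);

    public static void process(String imagePath) {
        Mat image = Imgcodecs.imread(imagePath, Imgcodecs.IMREAD_COLOR);
        if (!image.empty()) {
            nativeProcess(image.getNativeObjAddr());
        }
    }
}
On the C++ side, the received argument can be cast back with cv::Mat& img = *(cv::Mat*) matAddr;.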
Question 3: Are there any similar methods for assigning an image to a Mat object?
This question seems redundant now; there are many ways to skin a cat.
In short, I think it is easier to stick to the Java side when using OpenCV4Android and some form of image processing is needed. To get you started in Java, here is a small snippet from my code:
Mat image;
String imageInSD = Environment.getExternalStorageDirectory() + "/Pictures/Data/Birds/" + ImageFolders[i] + "/" + String.valueOf(id) + ".png";
image = Imgcodecs.imread(imageInSD, Imgcodecs.IMREAD_COLOR);
Good luck!!
Another way to use a saved image in the NDK is as follows.
Instead of the drawable folder, you can save it in the assets folder; this also makes it easy to access multiple images.
Then BitmapFactory.decodeStream reads it in as a Bitmap, and Utils.bitmapToMat converts the Bitmap into a Mat.
This Mat can then be passed to the NDK and processed with OpenCV C++.
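A minimal sketch of that flow (assumptions: an asset file named ring.png, borrowing the name from the question, a valid context, and an initialized OpenCV Java binding):
// Sketch only: load assets/ring.png as a Bitmap, then convert it to a Mat.
InputStream is = context.getAssets().open("ring.png");
Bitmap bitmap = BitmapFactory.decodeStream(is);
is.close();

Mat mat = new Mat();
Utils.bitmapToMat(bitmap, mat); // org.opencv.android.Utils; produces an RGBA Mat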
Thanks
I am trying to read an image in my C++ code
LOGD("Loading image '%s' ...\n", (*inFile).c_str());;
Mat img = imread(*inFile, CV_LOAD_IMAGE_GRAYSCALE);
CV_Assert(img.data != 0);
and get the following output:
09-25 17:08:24.798: D/IRISREC(12120): Loading image '/data/data/com.example.irisrec/files/input/osoba1.jpg' ...
09-25 17:08:24.798: E/cv::error()(12120): OpenCV Error: Assertion failed (img.data != 0) in int wahet_main(int, char**), file jni/wahet.cpp, line 4208
The file exists. But strangely, if I try to preview the image using Root File Browser, it is just black. I copied the files there manually.
EDIT:
The code works fine under Windows with the .png and .jpg formats. I am just trying to port an existing C++ project for iris recognition to Android.
imread() determines the type of file based on its content, not the file extension. If the header of the file is corrupted, it makes sense that the method fails.
Here are a few things you could try:
Copy those images back to the computer and see if they can be opened by other apps. There's a chance they were corrupted on the device;
Make sure there is a file at that location and that your user has permission to read it;
Test with other types of images (jpg, png, tiff, bmp, ...);
For testing purposes it's always better to be more direct. Get rid of inFile:
Example:
Mat img = imread("/data/data/com.example.irisrec/files/input/osoba1.jpg", CV_LOAD_IMAGE_GRAYSCALE);
if (!img.data) {
// Print error message and quit
}
When debugging, first try to get more data on the problem.
It's an unfortunate design that imread() doesn't provide any error info. The docs just say that it'll fail "because of missing file, improper permissions, unsupported or invalid format".
Use the debugger to step into the code if you can. Can you tell where it fails?
Search for known problems, stackoverflow.com/search?q=imread, e.g. imread not working in OpenCV.
Then generate as many hypotheses as you can. For each one, think of a way to test it. E.g.
The image file is malformed (as @karlphillip suggested). -- See if other software can open the file.
The image file is not a supported format. -- Verify the file format on your desktop. Test that desktop OpenCV can read it. Check the docs to verify the image formats that OpenCV on Android can read.
The image file is not at the expected path. -- Write code to test whether there's a file at that path, and verify its length (see the sketch after this list).
The image file does not have read permission. -- Write code to open the file for reading.
A problem with the imread() arguments. -- Try defaulting the second argument.
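For the path and permission hypotheses, a quick check from the Java side of the app might look like this (a sketch; the path and log tag are taken from the question):
// Sketch only: confirm the file exists, is non-empty, and is readable
// before handing the path to the native imread() call.
File f = new File("/data/data/com.example.irisrec/files/input/osoba1.jpg");
Log.d("IRISREC", "exists=" + f.exists()
        + " length=" + f.length()
        + " canRead=" + f.canRead());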
I was able to solve this issue only by copying the image files in code. I stored them in my assets folder first and copied them to internal storage, following this example.
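A copy step along those lines looks roughly like this (a sketch, not the exact code from the linked example; the file name osoba1.jpg is taken from the question):
// Sketch only: copy an asset to internal storage so that native code can
// read it by plain file path.
File outFile = new File(context.getFilesDir(), "osoba1.jpg");
InputStream in = context.getAssets().open("osoba1.jpg");
OutputStream out = new FileOutputStream(outFile);
byte[] buf = new byte[4096];
int len;
while ((len = in.read(buf)) != -1) {
    out.write(buf, 0, len);
}
in.close();
out.close();
// outFile.getAbsolutePath() can now be passed to the native imread() call.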
If someone can explain why this copy step is necessary, please do.
It could be a permission issue. On Android 6.0 or above you have to request the permission from Java code in your Activity class, like this. Also make sure that your AndroidManifest.xml contains the following line:
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
In your activity file add this:
if (PermissionUtils.requestPermission(
        this,
        HOME_SCREEN_ACTIVITY,
        Manifest.permission.READ_EXTERNAL_STORAGE)) {
    Mat image = Imgcodecs.imread(filePath, Imgcodecs.IMREAD_COLOR);
}
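Note that PermissionUtils above appears to be a helper class from the answer author's project rather than a framework API. With the standard support-library/AndroidX calls, an equivalent check would be roughly (a sketch; the request code 1 is arbitrary):
// Sketch only: the same check using ContextCompat/ActivityCompat.
if (ContextCompat.checkSelfPermission(this, Manifest.permission.READ_EXTERNAL_STORAGE)
        == PackageManager.PERMISSION_GRANTED) {
    Mat image = Imgcodecs.imread(filePath, Imgcodecs.IMREAD_COLOR);
} else {
    ActivityCompat.requestPermissions(this,
            new String[]{Manifest.permission.READ_EXTERNAL_STORAGE}, 1);
}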
I struggled a long time to find this; before that, I was getting a null Mat object every time.
I'm using this code to save an image to the Android camera roll, unfortunately so far without any luck.
put mergStoragePath("pictures") into a
put a & "/myphoto.jpg" into pathFile
put the long id of image "myPhoto" into longIDofImage
export longIDofImage to file pathFile as JPEG
All suggestions will be appreciated.
I think the problem here is that the media scanner isn't scanning the file. I have a command coming to do that, mergStorageScanFile. However, to do what you want, there's actually a built-in LiveCode command, mobileExportImageToAlbum.
I need to capture frames one by one from a video stored on the SD card of my Android device (in this case, my emulator). I am using Android and OpenCV through the NDK. I manually pushed the file "SinglePerson.avi" onto the SD card through the File Explorer of DDMS (Eclipse) and used the code below to read the file:
JNIEXPORT void JNICALL Java_org_opencv_samples_tutorial4_Sample4Mixed_VideoProcessing(JNIEnv*, jobject)
{
    LOGI("INSIDE VideoProcessing ");
    CvCapture* capture = cvCaptureFromAVI("/mnt/sdcard/SinglePerson.avi");
    IplImage* img = 0;
    if (!cvGrabFrame(capture)) {  // capture a frame
        LOGI("Inside the if");
        printf("Could not grab a frame\n\7");
        exit(0);
    }
    img = cvRetrieveFrame(capture);  // retrieve the captured frame
    cvReleaseCapture(&capture);
}
The problem is that cvGrabFrame(capture) always returns false.
Any suggestion to correctly open the video and grab the frames?
Thanks in advance
Some versions of OpenCV (in the opencv2 package) are built without video support. If that is your case, you have to enable "-D WITH_FFMPEG=ON" in the package's Makefile and recompile.
Look at the "Displaying AVI Video using OpenCV" tutorial:
"You may need to ensure that ffmpeg has been successfully installed in order to allow video encoding and video decoding in different formats. Not having the ffmpeg functionality may cause problems when trying to run this simple example and produce a compilation errors".
Also check that the path passed to cvCaptureFromAVI is correct.
Hope this will help!
The behavior you are observing is probably due to cvCaptureFromAVI() failing. You need to start coding defensively and check the return values of the calls you make:
CvCapture* capture = cvCaptureFromAVI("/mnt/sdcard/SinglePerson.avi");
if (!capture)
{
printf("!!! Failed to open video\n\7");
exit(0);
}
This function usually fails for 2 reasons:
When it's unable to access the file (due to wrong filesystem permissions);
Missing codecs on the system (or the video format is not supported by OpenCV).
If you are new to OpenCV, I suggest you test your OpenCV code on a desktop (PC) first.