I am working on an Android app that builds a video file from an initial video clip followed by a set of images, and saves the result.
Is there any way to accomplish that?
I tried JCodec, but its libraries are broken, the code floating around the web is untrusted, and there is little documentation for the library.
I tried FFmpeg, but it is poorly supported on Android and involves working with the NDK.
I tried to create an animation with AnimationDrawable and save that animation as a video, but I can't find a way to do so except with the screen-recording feature of KitKat 4.4, which requires connecting to a computer and having root access.
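(For context, the KitKat 4.4 feature referred to is the `screenrecord` shell utility, driven from a connected computer over adb; the basic invocation looks like this, with the output path as a placeholder:)

```shell
# Record the device screen for up to 30 seconds, then pull
# the resulting MP4 from the device to the computer.
adb shell screenrecord --time-limit 30 /sdcard/animation.mp4
adb pull /sdcard/animation.mp4
```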
Are there any other solutions, or a trusted, well-explained way to do this using one of the approaches above?
Thanks in advance.
I would vote for FFmpeg. You don't need the NDK or other sorcery if you can afford a prebuilt solution, like FFmpeg 4 Android.
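Assuming a working ffmpeg binary (whether from FFmpeg 4 Android or another prebuilt package), the original task reduces to two invocations; the file names here are placeholders:

```shell
# 1. Build a clip from numbered images (img001.png, img002.png, ...),
#    showing each image for one second.
ffmpeg -framerate 1 -i img%03d.png -c:v libx264 -pix_fmt yuv420p slideshow.mp4

# 2. Concatenate the intro video with the slideshow. For the concat
#    demuxer with stream copy, both inputs must share codec,
#    resolution, and frame rate; otherwise re-encode instead of -c copy.
printf "file 'intro.mp4'\nfile 'slideshow.mp4'\n" > list.txt
ffmpeg -f concat -safe 0 -i list.txt -c copy output.mp4
```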
Is it possible to edit a video using Qt/QML for Android and iOS?
I would like to be able to trim a video from timeX to timeY and add (if possible) a watermark.
I did some research but I didn't find anything good.
All Qt has to offer regarding video is contained in the Qt Multimedia library.
This library is not designed to do video editing, so you will have nothing out of the box.
However, it might be possible to combine QMediaRecorder and QMediaPlayer to trim a video. And you also have access to video frames: https://doc.qt.io/qt-5/videooverview.html#working-with-low-level-video-frames
I am not sure you will be able to do what you want using only Qt Multimedia; you might be better off using a dedicated video-editing library. Maybe you can take a look at OpenShot, an open-source video editor: its user interface is built with Qt, and the video-editing functions live in a separate library, libopenshot.
For days I have been trying to find a working library that can decode the video stream of the Parrot AR.Drone 2.0. The problem is that FFmpeg isn't working in Xamarin.Android, and Xuggle-Xuggler is Java-only, which makes this really difficult.
Furthermore, I tried to use FFmpeg, but every time I got errors like this: DllImport error loading 'libavcodec-55': 'dlopen failed: "libavcodec-55" not found'. I have seen a lot of possible solutions, but nothing works. I also tried to compile some .dll files containing the FFmpeg source code, but unfortunately got the same errors as before.
I just want to create a TCP video stream to "192.168.1.1:5555". After that I want to use a decoding class/library that can turn the bytes into frames (or something like that) and put them on the view using a VideoView, so the frames are shown on the smartphone.
Has anyone experience with this? Or does someone know a working library for decoding the TCP video stream of the drone?
Thanks.
Good news, because I just solved the problem.
There is a possibility to use FFmpeg, but you need to compile it specifically for your platform, which is a little harder on Windows than on Ubuntu/Linux. I tried to use a pre-compiled library in Xamarin.Android, but got errors like DllImport error loading 'libavcodec-55': 'dlopen failed: "libavcodec-55" not found', so that didn't work. Xuggle-Xuggler is a video decoder as well, but it is made for Java only and I am working in Xamarin.Android, so I had to find something else.
After several weeks I saw a project that uses OpenCV, which could decode the video stream of the drone. Its author, https://github.com/AJRdev/ARDrone-Android-GEII, implemented the video stream in two different ways: via OpenCV and via a library called "Vitamio".
What I did was use the Vitamio library, which Xamarin.Android supports. There is a known Xamarin.Android binding at https://components.xamarin.com/gettingstarted/vitamiobinding, but that's an old version, so I decided to use the Vitamio binding found at https://github.com/shaxxx/Xamarin.Vitamio instead. I am using this library because it ships as an .AAR containing the same files as the Vitamio library in the project mentioned above, and, most importantly, no errors appeared :)
Unfortunately there is no information on the internet about using the Parrot AR.Drone 2.0 with Xamarin.Android. So if someone else hits this problem, you could also use the source code of the official "Freeflight 2.4" app, since that one is made specifically for Android. However, there is a lot of code in Freeflight 2.4, and extracting just the video-stream part takes a long time, which I did not have, so I chose the easier way explained above.
After the implementation you should be able to see the video on your smartphone!
Good luck!
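One practical note for anyone decoding the raw TCP stream themselves: the AR.Drone 2.0 wraps every frame in a PaVE (Parrot Video Encapsulation) header that plain H.264 decoders choke on, so the payload has to be unwrapped first. Below is a minimal sketch in Python (chosen just for brevity; the same fixed-offset unpacking translates directly to C# or Java), assuming the little-endian layout documented in the SDK's video_encapsulation.h, where header_size is a uint16 at offset 6 and payload_size a uint32 at offset 8:

```python
import struct

PAVE_SIGNATURE = b"PaVE"

def parse_pave_header(buf):
    """Parse the fixed front of a PaVE header and return
    (header_size, payload_size), or None if buf does not start
    with a PaVE block."""
    if len(buf) < 12 or buf[:4] != PAVE_SIGNATURE:
        return None
    # Bytes 4-5 hold version and video_codec (unused here);
    # bytes 6-7: header_size (uint16 LE); bytes 8-11: payload_size (uint32 LE).
    header_size, = struct.unpack_from("<H", buf, 6)
    payload_size, = struct.unpack_from("<I", buf, 8)
    return header_size, payload_size

def split_frames(stream):
    """Yield raw codec payloads (H.264 NAL data) from a byte string of
    concatenated PaVE blocks."""
    offset = 0
    while True:
        parsed = parse_pave_header(stream[offset:])
        if parsed is None:
            break
        header_size, payload_size = parsed
        start = offset + header_size
        yield stream[start:start + payload_size]
        offset = start + payload_size
```

The payloads produced this way can then be handed to whatever H.264 decoder the platform provides.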
I am working on video editing in Android. After a lot of research and development, the only workable solution I have found for real-time video editing is FFmpeg (other libraries like Vitamio just apply changes to the video while it plays instead of changing the file itself). I want to find a solution where FFmpeg can easily be integrated into an Android Studio project, to do cropping, trimming, concatenation, and other processing on video.
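For reference, the operations listed map directly onto standard ffmpeg invocations once a binary is bundled with the app; the file names and timestamps below are placeholders:

```shell
# Trim: keep the segment from 00:00:05 to 00:00:15
# (re-encoding gives frame-accurate cuts).
ffmpeg -i in.mp4 -ss 00:00:05 -to 00:00:15 -c:v libx264 -c:a aac trimmed.mp4

# Crop: cut a 640x480 window whose top-left corner is at (100,50).
ffmpeg -i in.mp4 -filter:v "crop=640:480:100:50" cropped.mp4

# Concatenate two clips that share codec, resolution, and frame rate.
printf "file 'a.mp4'\nfile 'b.mp4'\n" > list.txt
ffmpeg -f concat -safe 0 -i list.txt -c copy joined.mp4
```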
Any suggestion on how to encode a video clip from a set of images in Android? I've tried JCodec, but it is not documented and I was unable to use it. Any other alternative?
I think JCodec is properly documented. I used it successfully in a project a few days back. Download the sample app from the link below and you will get an idea.
JCodec android sample project
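For readers who just want the shape of the API: in that sample, the core image-sequence loop looks roughly like the untested sketch below. Class and method names have shifted between JCodec releases, so treat this as illustrative rather than exact.

```java
// Untested sketch of the JCodec image-sequence loop (Android variant);
// exact signatures vary between JCodec releases.
SequenceEncoder encoder = new SequenceEncoder(new File("/sdcard/out.mp4"));
for (File imageFile : imageFiles) {
    Bitmap frame = BitmapFactory.decodeFile(imageFile.getPath());
    encoder.encodeImage(frame);   // each image becomes one video frame
}
encoder.finish();                 // finalizes and writes the MP4 trailer
```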
I'm looking to write an application that combines images to form a video. I'm working with the PhoneGap framework so it can be used on Android and iOS.
My question is: what sort of process is involved to achieve this?
At this stage I've tried to read about ffmpeg; most of the existing questions on Stack Overflow talk about getting the source and compiling it into a series of libraries. Do those libraries then need to be tied in with the Android/iOS projects? (I notice there is an 'android.jar' in the Eclipse project; would they live in there?) After that, my confusion lies with how this is implemented in PhoneGap. Do I develop a plugin?
Just to add: according to the wiki, libav has hardware-accelerated H.264 decoding while using x264 for encoding on Android. How does that work? Is this something accessed from the libav libraries that then has to be compiled in alongside the android.jar?
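To make the split concrete: x264 is an external encoder that gets linked in when ffmpeg/libav is configured for the Android cross-compile, while hardware-accelerated decoding comes from the platform; neither lives inside android.jar. A typical configure invocation looks roughly like the following, where the NDK paths are placeholders and flag spellings vary by NDK and ffmpeg/libav version:

```shell
# Cross-compile for Android, linking the external x264 encoder (GPL).
# Older builds used --target-os=linux instead of --target-os=android.
./configure \
    --target-os=android --arch=arm --enable-cross-compile \
    --cross-prefix=/path/to/ndk/bin/arm-linux-androideabi- \
    --enable-gpl --enable-libx264 --enable-shared --disable-doc
make && make install
```

The resulting .so files are then packaged with the app (e.g. under libs/armeabi) and called through JNI; android.jar itself is only the SDK's compile-time stub and never contains these libraries.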
I may have confused terms in trying to describe what I do not know.
Any help would be appreciated.