How to play video using the FFmpeg library in Android?

I was able to build the FFmpeg library using the RockPlayer build script.
Now that I have the .so file, how do I play video? I want to display the video inside a small LinearLayout in my Activity.
Is it possible?
Update:
I know that it's easy to play video using VideoView or MediaPlayer + SurfaceView. I just wanted to understand more about the FFmpeg library and how to display the frames inside an Android Activity.

Have a look at this player: https://github.com/bbcallen/ijkplayer
Basically, what you need to do is build a JNI interface through to the MediaPlayer class (or possibly ExoPlayer on newer Android, though I haven't done this yet).
If you look at the repo you will see that this needs to be done on top of ffplay more than ffmpeg, since the former is the player and the latter is the decode/encode/packaging tool.
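A minimal sketch of the Java side of such a JNI bridge, under the assumption of a hand-written native library; the class, method, and library names below are placeholders, not ijkplayer's actual API:

// Java side of a hypothetical JNI bridge to a native FFmpeg/ffplay-based player.
// The class name, method names, and library name "nativeplayer" are illustrative
// placeholders, not ijkplayer's real API.
import android.view.Surface;

public class NativePlayer {

    static {
        // Loads libnativeplayer.so, built with the NDK and packaged in the APK.
        System.loadLibrary("nativeplayer");
    }

    // Implemented in C/C++; on the native side these drive the
    // ffplay-style open/decode/render loop.
    public native void nativeSetDataSource(String path);

    public native void nativeSetSurface(Surface surface);

    public native void nativeStart();

    public native void nativeStop();
}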

There are a few ways of doing this. You should have a look at Dolphin Player, an open-source media player for Android. It's actually rather complex. You could also look at the VLC source code, which makes use of FFmpeg, but VLC is an extensive, very complete wrapper for playing videos.

See, for example, this Android app on GitHub: https://github.com/havlenapetr/FFMpeg. The project may be somewhat outdated, but the part responsible for video display is quite understandable. You can look for more recent forks and contributions on GitHub.
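To address the "inside a small LinearLayout" part of the original question: the usual pattern in projects like the one above is to put a SurfaceView into the layout and hand its Surface to the native decoder once the surface exists. A rough sketch, reusing the hypothetical NativePlayer wrapper from the earlier sketch; the layout and resource ids are placeholders as well:

import android.app.Activity;
import android.os.Bundle;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import android.widget.LinearLayout;

public class PlayerActivity extends Activity {

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_player); // hypothetical layout containing the LinearLayout below

        // The small LinearLayout from the question; a SurfaceView is added to it
        // with a fixed, small size.
        LinearLayout container = (LinearLayout) findViewById(R.id.video_container);
        SurfaceView surfaceView = new SurfaceView(this);
        container.addView(surfaceView, new LinearLayout.LayoutParams(640, 360));

        final NativePlayer player = new NativePlayer(); // hypothetical JNI wrapper sketched earlier
        surfaceView.getHolder().addCallback(new SurfaceHolder.Callback() {
            @Override
            public void surfaceCreated(SurfaceHolder holder) {
                // Hand the Surface to native code; the C side typically wraps it in an
                // ANativeWindow and blits decoded frames into it.
                player.nativeSetSurface(holder.getSurface());
                player.nativeSetDataSource("/sdcard/sample.mp4");
                player.nativeStart();
            }

            @Override
            public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
            }

            @Override
            public void surfaceDestroyed(SurfaceHolder holder) {
                player.nativeStop();
            }
        });
    }
}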

Related

Edit video using Qt and QML on Android and iOS

Is it possible to edit a video using Qt/QML on Android and iOS?
I would like to be able to trim a video from timeX to timeY and, if possible, add a watermark.
I did some research but didn't find anything useful.
All Qt has to offer regarding video is contained in the Qt Multimedia library.
This library is not designed for video editing, so you will get nothing out of the box.
However, it might be possible to combine QMediaRecorder and QMediaPlayer to trim a video, and you also have access to low-level video frames: https://doc.qt.io/qt-5/videooverview.html#working-with-low-level-video-frames
I am not sure you will be able to do what you want using only Qt Multimedia; you might be better off using a dedicated video-editing library. You could take a look at OpenShot, an open-source video editor: its user interface is built with Qt, and the video-editing functions live in a separate library, libopenshot.

How to use FFmpeg in Android the easiest possible way? With Gradle in Android Studio, for video concatenation

I am working on video editing in Android. After a lot of research and development, the only viable solution I have found for real-time video editing is FFmpeg (other libraries such as Vitamio just apply changes to the video while it plays instead of actually changing the file). I want to find a solution where FFmpeg can be easily integrated into an Android Studio project, so that I can crop, trim, concatenate, and otherwise process videos.
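For the concatenation step specifically, most Android FFmpeg integrations come down to handing the library an ordinary ffmpeg command line. Below is a hedged sketch in Java that prepares a concat-demuxer list file and builds the command string; the flags and list format are standard ffmpeg, while actually executing the string depends entirely on whichever wrapper or bundled binary the project ends up using:

// Sketch: prepare an ffmpeg concat-demuxer command for joining clips.
// The resulting string would be handed to whatever execute() entry point
// the chosen Android FFmpeg wrapper exposes; that call is not shown here.
import java.io.File;
import java.io.FileWriter;
import java.io.IOException;
import java.util.List;

public class ConcatHelper {

    public static String buildConcatCommand(List<File> clips, File listFile, File output)
            throws IOException {
        // The concat demuxer reads a text file with one "file '<path>'" line per clip.
        FileWriter writer = new FileWriter(listFile);
        try {
            for (File clip : clips) {
                writer.write("file '" + clip.getAbsolutePath() + "'\n");
            }
        } finally {
            writer.close();
        }
        // -f concat -safe 0 reads the list; -c copy joins without re-encoding,
        // which requires all clips to share the same codecs and resolution.
        return "-f concat -safe 0 -i " + listFile.getAbsolutePath()
                + " -c copy " + output.getAbsolutePath();
    }
}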

Merge Image with Video in Android

Does anyone know of a solution or library that helps merge an image with a video file? I researched some libraries such as OpenCV and FFmpeg, but they don't seem to help in my case.
I have successfully integrated FFmpeg in one of my apps and added a watermark image to my video.
These links can be used to compile ffmpeg for Android.
Link 1
Link 2 (better and clearer approach)
Or, if you want to avoid the hassle, you can buy a paid wrapper from indie developers (although I wouldn't recommend it):
one such example is ffmpeg4android.
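For reference, the watermark step itself comes down to a single ffmpeg invocation using the overlay filter. A minimal sketch of the command string with placeholder paths; running it is left to whichever wrapper or binary the app bundles:

// Sketch: ffmpeg command that overlays watermark.png onto input.mp4,
// 10 px in from the top-left corner. Paths are placeholders; the string
// would be passed to the FFmpeg wrapper's execute call (not shown).
String watermarkCommand =
        "-i /sdcard/input.mp4 -i /sdcard/watermark.png"
        + " -filter_complex overlay=10:10"
        + " -codec:a copy /sdcard/output.mp4";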

Creating a video from set of images Android

I am working on an app for Android that creates a video file from a video at the start followed by a set of images, and saves it.
Is there any way to accomplish that?
I tried JCodec, but it has broken libraries, untrusted code scattered around the web, and there is little knowledge about the library available.
I tried FFmpeg, but it is poorly supported on Android and involves working with the NDK.
I tried creating an animation with AnimationDrawable and saving that animation as a video, but I can't find a way to save an animation as a video except via the KitKat 4.4 screen-recording feature, which requires connecting to a computer and having root.
Are there any other solutions, or a trusted and well-explained way to do this using the approaches above?
Thanks in advance.
I would vote for FFmpeg. You don't need the NDK or any other sorcery if you can afford a prebuilt solution like FFmpeg 4 Android.
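For completeness, the underlying ffmpeg command that turns a numbered image sequence into a video is short. A sketch with placeholder paths and frame rate; as above, executing the string is up to the wrapper or binary being used:

// Sketch: ffmpeg command that encodes frame001.png, frame002.png, ... into an
// H.264 MP4 at 2 frames per second. Paths and rate are placeholders; the
// string would be handed to the FFmpeg wrapper's execute call (not shown).
String imagesToVideo =
        "-framerate 2 -i /sdcard/frames/frame%03d.png"
        + " -c:v libx264 -pix_fmt yuv420p /sdcard/slideshow.mp4";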

How do I use Android OpenCORE codecs using JNI?

I want to use the codecs in Android from my application. For now I just want to use the H.264 codec for testing, unless the MP3 or AAC codecs provide functions for sending audio to the device's speaker, in which case I would prefer one of those.
I have the NDK installed along with Cygwin, GNU Make, and GNU Awk, but I can't figure out what I need to do from here. I'm downloading the entire OpenCORE tree right now, but I don't even know how to build it or how to make the Eclipse plugin include the files.
An example or a tutorial would be much appreciated.
EDIT:
It looks like I can use JNI much like P/Invoke, which would mean I don't have to build the OpenCORE libraries myself. However, I can't find any documentation on the names of the libraries I need to load.
I'm also confused about how to do it. I'm looking at http://www.koushikdutta.com/2009/01/jni-in-android-and-foreword-of-why-jni.html and I don't understand the purpose of writing a library just to access another library. Couldn't you just use something like System.loadLibrary("opencore.so")?
You cannot build OpenCORE separately; it has to be built with the whole source tree. What are you trying to achieve? If you just want to play video or audio, use a VideoView or a MediaPlayer object.
Build the Android source and use the headers and the static library from it. This will propel you straight to the zone of unsupported APIs.
