How to trim a video using the FFmpeg library in Android?

In my project I upload a video (maximum duration 10 seconds) to Amazon. So when I choose a 30-second video from the library, I want to trim it down to 10 seconds or less. I am using mp4parser for the trimming, but when I trim a portrait video it gets converted to landscape and the video rotation changes as well. I did a lot of research on Google with no luck; the last solution I found was the FFmpeg library, but FFmpeg does not build for me on Windows. So what can I do now? I am new to the FFmpeg library. Please give suggestions on how to build the FFmpeg library for Android. Thanks in advance.
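For reference: once an ffmpeg build is available (see the related threads below), the trim itself is a single command. A sketch with placeholder file names, using stream copy so the video is not re-encoded and the rotation metadata should be preserved:

ffmpeg -i input.mp4 -t 10 -c copy output_trimmed.mp4

If the clip should start somewhere other than the beginning, add -ss before -i; with stream copy the start point then roughly snaps to the nearest keyframe.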

Related

ffmpeg: Video duration confuses some players on Android

I am using ffmpeg to scale down and compress videos to be used in an Android app.
The original files play with no problem and have no metadata issues. When I re-encode them, however, the Android player is able to play them but displays wrong durations. My app is in production and uses official player implementations, so I wonder if there is a way to fix the corruption using ffmpeg or by adding some metadata to the generated files. Hopefully that's possible, but if not - any other possible fixes will be highly appreciated :)
My ffmpeg command is complex, including scaling and encoding, but I can confirm this issue is reproducible on my end with the simplest ffmpeg -i video.mp4 -c:v libx264 videogen.mp4 command.
I'm leaving links to the two files if that's helpful for reference.
video.mp4
videogen.mp4
Any ideas what could be causing this and how to fix it?
Player issues showcase: video.mp4 (original) vs. videogen.mp4 (re-encoded)
After some testing I found out that the player was handling files with the following metadata just fine:
major_brand=mp42
encoder=Lavf58.24.101
So I used ffmpeg's -movflags use_metadata_tags option to set the working metadata :)
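Combined with the simple command from the question, that is (a sketch; the real command keeps whatever scaling and encoding options you already use):

ffmpeg -i video.mp4 -c:v libx264 -movflags use_metadata_tags videogen.mp4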

Set a cover image while recording audio in Android

Hello, I am recording audio using MediaRecorder, saving it in MP4 format and uploading it to the server. I also record video as MP4 using the camera and upload that to the server.
I then receive my uploaded items from the server as a list of URLs and play them all in a VideoView. Video and audio both play fine. The thing is, while a video is playing its frames are shown in the VideoView, but while audio is playing the view stays black. Since both the audio and the video files have the MP4 extension, I can't tell them apart when adding them to the VideoView; otherwise I would add an ImageView showing some default image and hide the VideoView.
So is there any way to record audio with a cover image, so that when I play it in the VideoView that cover image is displayed?
I know this can be done with ffmpeg, but I got lots of errors while compiling it on my Windows PC. So is there any way to add a cover image while recording audio with the pure Android API? Sorry for my bad English.
I googled everywhere but couldn't find any solution for converting an audio file to a video file using an image, so I had to go with FFmpeg.
I added the WritingMinds Java FFmpeg library to my project as a Gradle dependency, then converted the MP4 audio plus a cover image into an MP4 video.
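The conversion boils down to a command along these lines (a sketch with placeholder file names; -loop 1 repeats the still image and -shortest stops the video when the audio ends):

ffmpeg -loop 1 -i cover.jpg -i audio.mp4 -c:v libx264 -tune stillimage -pix_fmt yuv420p -c:a copy -shortest video_with_cover.mp4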
There are some limitations in this precompiled library: it does not work on Android API 24, and some commands like -speed, cpu-used and -deadline, which are important for certain operations, are missing (but other operations can still be performed). There are other disadvantages too, like slow speed, and some features required for specific operations are not included in the compiled library. (So my advice is to compile your own ffmpeg library for your Android project on a Linux or Mac PC; Windows has lots of issues.)
I hope this helps someone with this type of scenario :)

How to trim video in Android (without ffmpeg)?

Hi, I am trying to trim video in Android, but all of the source code I have found uses ffmpeg. Is there a smaller library I can use?
The ffmpeg library is about 8-9 MB and my application is about 6 MB, so adding ffmpeg would more than double the app's size.
You can do this with the mp4parser library. Have a look at the ShortenExample; it does exactly what the name suggests. Since the library cannot re-encode the video, it can only cut at I-frames, so the points in time where you can make a cut are quite coarse.
On Android 4.1+ you can access the hardware codecs via the MediaCodec API, which could be an option (but I haven't seen any example of that yet).
Or, you can use this class: TrimVideoUtils.java

How to extract frames from a video, process them, and combine them into a video file in Android

How can I extract frames from a video, process them, and combine them back into a video file?
JCodec - very slow.
JavaCV - unsuitable, because it is under the GNU GPL.
OpenCV - unsuitable, because it uses OpenCV Manager.
P.S. The video comes from the SD card, not the camera.
This isn't really a question for StackOverflow, but I'll try to point you in the right direction.
I asked a similar question here: Recording Live OpenCV Processing on Android
That's specific to OpenCV, but the end result was using JavaCV and FFmpeg to create video files. That's agnostic of the input.
Here is the FFmpeg part. You'd just need to swap out the camera input and extract the frames from wherever you're getting them instead. FFmpeg can probably do that as well.
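For reference, the extraction and re-assembly steps map to two ffmpeg commands (a sketch; the frame rate, file pattern and codec are placeholders, and the frames directory must already exist):

ffmpeg -i input.mp4 frames/frame_%04d.png
ffmpeg -framerate 30 -i frames/frame_%04d.png -c:v libx264 -pix_fmt yuv420p output.mp4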

Slow avcodec_decode_video2, ffmpeg on Android

I am developing a player on Android using ffmpeg. However, I found out that avcodec_decode_video2 is very slow. Sometimes it takes about 0.1, or even 0.2, seconds to decode one frame of a 1920 × 1080 video.
How can I improve the speed of avcodec_decode_video2()?
If your device has the necessary hardware and firmware capabilities, you could use ffmpeg with libstagefright support.
Update: here is an easy procedure to decide whether it is worthwhile to switch to libstagefright on your device for a given class of videos: use ffmpeg on your PC to convert a representative video stream into MP4:
ffmpeg -i your_video -an -vcodec copy test.mp4
and try to open the resulting file with the stock video player on your device. If the video plays with reasonable quality, you can use libstagefright with ffmpeg to improve your player app. If you see "Cannot Play Video", your device's hardware and firmware do not support the video.
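If that test passes, an on-device way to exercise the decoder itself is to force it by name and discard the output (a sketch; this assumes an ffmpeg binary built with libstagefright support, and the decoder name may differ between versions):

ffmpeg -c:v libstagefright_h264 -i test.mp4 -f null -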
That sounds about right. HD video takes a lot of CPU. Some codecs support multithreaded decoding if your device has multiple cores, but that will consume massive amounts of battery and heat up the device. This is why most mobile devices use specialized hardware decoders instead of the CPU. On Android, using the MediaCodec API instead of libavcodec should invoke the hardware decoder.
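To gauge how much multithreaded software decoding would help before changing any code, you can benchmark decode speed with the ffmpeg command line (a sketch; -threads before -i sets the decoder thread count and -f null - discards the decoded frames):

ffmpeg -benchmark -threads 4 -i test_1080p.mp4 -f null -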
