I want to develop a video merge application in Android. I am able to merge two videos, but if the videos are large the device hangs. So I want to get the frames of input video2 (around 15 sec.) and merge them into video1. Does anyone have an idea how to get the frames from a video? Is there a direct native method available for getting frames?
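There is no single call that returns every frame, but the framework's MediaMetadataRetriever can decode individual frames at chosen timestamps. A minimal sketch, pulling one frame per second from the 15-second clip mentioned (the path and sampling rate are assumptions):

```java
import android.graphics.Bitmap;
import android.media.MediaMetadataRetriever;
import java.util.ArrayList;
import java.util.List;

// Sketch: grab one frame per second from the first 15 seconds of video2.
// getFrameAtTime takes microseconds; OPTION_CLOSEST returns the frame
// nearest the requested time instead of the nearest sync (I) frame.
public static List<Bitmap> extractFrames(String path) {
    MediaMetadataRetriever retriever = new MediaMetadataRetriever();
    retriever.setDataSource(path); // e.g. "/sdcard/video2.mp4" (assumed path)
    List<Bitmap> frames = new ArrayList<>();
    for (int sec = 0; sec < 15; sec++) {
        Bitmap frame = retriever.getFrameAtTime(
                sec * 1000000L, MediaMetadataRetriever.OPTION_CLOSEST);
        if (frame != null) {
            frames.add(frame);
        }
    }
    retriever.release();
    return frames;
}
```

Full-resolution bitmaps are large, so for anything longer than a short clip you would scale them down or write them out as you go rather than hold them all in memory - holding too many at once is likely what hangs the device.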
My Android app plays videos in ExoPlayer 2, and now I'd like to play a video backwards.
I searched around a lot and found only the idea of converting it to a GIF, and this from WeiChungChang.
Is there a more straightforward solution? Another player or a library that implements this for me is probably too much to ask, but converting it to a reversed GIF gave me a lot of memory problems, and I don't know what to do with the WeiChungChang idea. Playing only MP4 in reverse would be enough, though.
Videos are frequently encoded such that the encoding of a given frame depends on one or more frames before it, and sometimes also on one or more frames after it.
In other words, to decode a frame correctly you may need to refer to one or more previous frames and one or more subsequent frames.
This allows a video encoder to reduce file or transmission size by fully encoding the information for reference frames, sometimes called I-frames, and for the frames before and/or after them storing only the delta relative to the reference frames.
Playing a video backwards is not a common player function and the player would typically have to decode the video as usual (i.e. forwards) to get the frames and then play them in the reverse order.
You could extend ExoPlayer to do this yourself, but it may be easier to manipulate the video on the server side first, if that is possible - tools exist that will reverse a video, after which your players will be able to play it as normal, for example https://www.videoreverser.com, https://www.kapwing.com/tools/reverse-video etc.
If you need to reverse it on the device for your use case, then you could use ffmpeg on the device to achieve this - see an example ffmpeg command to do this here:
https://video.stackexchange.com/a/17739
If you are using ffmpeg, it is generally easiest to use it via a wrapper on Android such as this one, which will also allow you to test the command before you add it to your app:
https://github.com/WritingMinds/ffmpeg-android-java
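For example, with that wrapper the reverse command from the linked answer might look roughly like the sketch below. The file paths are assumptions, and the wrapper's String[] execute API and binary loading should be verified against the version you use:

```java
import com.github.hiteshsondhi88.libffmpeg.ExecuteBinaryResponseHandler;
import com.github.hiteshsondhi88.libffmpeg.FFmpeg;
import com.github.hiteshsondhi88.libffmpeg.exceptions.FFmpegCommandAlreadyRunningException;

// Sketch: reverse both video and audio using the ffmpeg-android-java wrapper.
// Note: -vf reverse / -af areverse buffer the whole stream in memory,
// so this is only practical for short clips.
void reverseVideo(android.content.Context context) throws FFmpegCommandAlreadyRunningException {
    FFmpeg ffmpeg = FFmpeg.getInstance(context); // loadBinary(...) must have succeeded first
    String[] cmd = {"-i", "/sdcard/in.mp4",
            "-vf", "reverse", "-af", "areverse",
            "/sdcard/out_reversed.mp4"};
    ffmpeg.execute(cmd, new ExecuteBinaryResponseHandler() {
        @Override public void onSuccess(String message) { /* out_reversed.mp4 is ready to play */ }
        @Override public void onFailure(String message) { /* inspect message, fall back */ }
    });
}
```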
Note that video manipulation is time and processor hungry so this may be slow and consume more battery than you want on your mobile device if the video is long.
I am developing an Android app for my IP camera, and the camera has some specific API commands that it can respond to. The problem I am stuck on is that I want to display a list of the videos available on the camera's memory card. I am getting the file list, but I also want to get the thumbnails of those files.
The problem in getting the thumbnails is that I don't have a direct URL to the video file; the camera only provides me two things for accessing a video:
1. RTSP URL of the video
2. Data stream of the video, so that i can download it in my code.
Can someone tell me how I can get the thumbnails of the videos with the above-mentioned options?
Note: there is also one API command available on the camera for providing the thumbnail of a video. When I send that command it should return one frame of the video, but currently it returns a corrupt frame and this method is not working. That's why I am focusing on getting the thumbnails from the other two available options.
Any help will be highly appreciated.
Thanks
You could open a socket and stream a few seconds from each video, saving the file locally on your Android device.
Once you have it there, so long as it is a recognisable video format, you should be able to create a thumbnail in the usual way (http://developer.android.com/reference/android/provider/MediaStore.Video.Thumbnails.html).
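For instance, once a short snippet has been saved locally, something like this sketch should work (the saved path is an assumption):

```java
import android.graphics.Bitmap;
import android.media.ThumbnailUtils;
import android.provider.MediaStore;

// Sketch: generate a thumbnail from the partially downloaded clip.
// This works as long as enough of the file was saved for the decoder
// to find the container header and at least one decodable frame.
Bitmap thumb = ThumbnailUtils.createVideoThumbnail(
        "/data/data/your.app/files/clip_partial.mp4",   // assumed local path
        MediaStore.Video.Thumbnails.MINI_KIND);
if (thumb == null) {
    // Too little data was saved to decode a frame; stream a bit more and retry.
}
```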
You would need to be careful to make sure that your app does not actually try to play these truncated videos, and instead goes to the proper stream URL if someone wants to view them. You could even delete the file after creating the thumbnail if you wanted to be sure.
Doing this may take a little time initially if your camera has a lot of videos, but after the first run you should be able to set it up to only create thumbnails for new videos, which should speed things up.
It is also possible to create thumbnails from streams directly using ffmpeg or VLC (e.g. https://superuser.com/q/592160) but I think you may find the above approach is simpler for your needs and it avoids you having to integrate ffmpeg etc with your app.
I am using an external camera with my application. The camera takes 9 pictures every second (9 fps). The pictures are 384x288 bitmaps. I need to create a video file from these pictures.
What I have tried:
Using Jcodec
The problem: JCodec is relatively slow, and for it to work properly I add the bitmaps to an ArrayList and convert the array to a video once recording has stopped. It takes too much time: for a 30-second video there is about 1 minute of rendering time.
Using native MediaCodec
The problem: I could only generate AVI files (video/avc) that are not readable in the stock Android player. I cannot use what is written here: http://bigflake.com/mediacodec/ because I am developing for API 16. I have tried using (video/mp4v-es), but the video is corrupted and not playable in any player.
Using FFmpeg
The problem: it is very complicated to integrate into Android, and I am not sure it would give me the result I need after spending the time to implement it. The result I need is to encode the video stream as I receive the bitmaps, without any delay.
What can you suggest?
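For reference, the JCodec route can be restructured to encode each bitmap as it arrives instead of buffering them all in an ArrayList, which at least caps memory use. A sketch assuming the jcodec-android AndroidSequenceEncoder API (the exact class and factory method vary between JCodec versions, and this does not remove the per-frame encoding cost):

```java
import android.graphics.Bitmap;
import org.jcodec.api.android.AndroidSequenceEncoder;
import java.io.File;
import java.io.IOException;

// Sketch: feed each camera bitmap straight into the encoder at 9 fps
// instead of collecting them all in an ArrayList first.
public class StreamingRecorder {
    private AndroidSequenceEncoder encoder;

    public void start(File output) throws IOException {
        encoder = AndroidSequenceEncoder.createSequenceEncoder(output, 9);
    }

    // Call this from the camera callback for every 384x288 frame.
    public void onFrame(Bitmap bitmap) throws IOException {
        encoder.encodeImage(bitmap);
    }

    public void stop() throws IOException {
        encoder.finish(); // writes the MP4 index and closes the file
    }
}
```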
I'm making an app that takes a video and does some computation on the video. I need to carry out this computation on individual frames of the video. So, I have two questions:
Are videos in Android stored as a sequence of pictures? (I've seen a lot of Android devices that advertise having 25-30 fps cameras.) If so, can I, as a developer, get access to the frames that make up a video, and how?
If not, is there any way for me to generate at least 15-20 distinct frames per second from a video taken on an android device? (and of course, do the computation on those frames generated)
Videos are stored as encoded video streams, not as sequences of still pictures. To manipulate individual frames you can use the FFmpeg library; there are FFmpeg ports to Android, such as the one in the Dolphin open-source player. This would require C/C++ programming with the NDK, though.
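If the NDK is more than you need, the framework's MediaMetadataRetriever can also decode individual frames at chosen timestamps without any native code. A sketch sampling roughly 15 frames per second (the path and rate are assumptions):

```java
import android.graphics.Bitmap;
import android.media.MediaMetadataRetriever;

// Sketch: sample ~15 frames per second from a recorded video and run a
// computation on each, without holding all frames in memory at once.
public static void processFrames(String path, int framesPerSecond) {
    MediaMetadataRetriever retriever = new MediaMetadataRetriever();
    retriever.setDataSource(path);
    long durationMs = Long.parseLong(retriever.extractMetadata(
            MediaMetadataRetriever.METADATA_KEY_DURATION));
    long stepUs = 1000000L / framesPerSecond;
    for (long t = 0; t < durationMs * 1000; t += stepUs) {
        Bitmap frame = retriever.getFrameAtTime(
                t, MediaMetadataRetriever.OPTION_CLOSEST);
        if (frame != null) {
            // run your per-frame computation here, then release the bitmap
            frame.recycle();
        }
    }
    retriever.release();
}
```

Note that OPTION_CLOSEST re-decodes from the previous sync frame on every call, so this is much slower than real-time playback; for heavy per-frame work an FFmpeg or MediaCodec pipeline is the faster route.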
I am developing a recording app that includes a pause/play option.
I tried with both MediaRecorder and AudioRecord.
In the case of AudioRecord, the recorded audio takes up a lot of space; for example, if I record 1 minute of audio it consumes 40 to 50 MB, and it is really painful to combine recordings by converting them to a .raw file and sending that to the PHP server.
So I tried MediaRecorder; it produces smaller files, but I am not able to combine them the way I handled it with AudioRecord.
Next I tried the Android NDK - even the setup process was really painful.
Now my question is: which is the best way to combine recorded audio files?
Using the Android NDK
Reading the byte data from the audio files and combining them - if I use this, there is a problem with the headers of the recording format (AMR, WAV, and so on).
Also, if I try this, the javax.sound package is not available on Android, so I tried plugins but had no luck.
Please suggest the best way to do this. I also tried all of the following links:
Audio Link 1
Audio Link 2
Audio Link 3
Audio Link 4
Please provide a good tutorial, samples, or links. Thanks.
For something like this your best bet would be to develop native C++ code using the NDK.
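That said, for the specific case of combining WAV recordings, the header problem mentioned in the question can be handled in plain Java without the NDK. A minimal sketch, assuming both files are PCM WAV with identical sample rate, channel count, and bit depth, and canonical 44-byte headers (file names are hypothetical; real files may carry extra chunks):

```java
import java.io.*;

// Sketch: concatenate two WAV files that share the same PCM format by
// appending the second file's audio data after the first and patching
// the RIFF and data chunk sizes in the header.
public class WavConcat {

    public static void concat(File a, File b, File out) throws IOException {
        byte[] bytesA = readAll(a);
        byte[] bytesB = readAll(b);
        int dataA = bytesA.length - 44; // audio payload sizes, header excluded
        int dataB = bytesB.length - 44;

        try (FileOutputStream fos = new FileOutputStream(out)) {
            byte[] header = new byte[44];
            System.arraycopy(bytesA, 0, header, 0, 44);
            writeIntLE(header, 4, 36 + dataA + dataB);  // RIFF chunk size
            writeIntLE(header, 40, dataA + dataB);      // data chunk size
            fos.write(header);
            fos.write(bytesA, 44, dataA);               // audio from file A
            fos.write(bytesB, 44, dataB);               // audio from file B
        }
    }

    private static void writeIntLE(byte[] buf, int offset, int value) {
        buf[offset] = (byte) (value & 0xff);
        buf[offset + 1] = (byte) ((value >> 8) & 0xff);
        buf[offset + 2] = (byte) ((value >> 16) & 0xff);
        buf[offset + 3] = (byte) ((value >> 24) & 0xff);
    }

    private static byte[] readAll(File f) throws IOException {
        byte[] data = new byte[(int) f.length()];
        try (DataInputStream in = new DataInputStream(new FileInputStream(f))) {
            in.readFully(data);
        }
        return data;
    }
}
```

Compressed formats like AMR or AAC-in-3GP cannot be joined this way, which is where an NDK or FFmpeg approach becomes the better fit.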