I am trying to create an app with the following features:
normal video playback
slower video playback
frame by frame
reverse video playback (normal, slower, frame by frame)
seekable to specific times
video scrubbing
no video sound needed
video is recorded via the device's camera
The closest comparable apps would be Ubersense Coach on iOS and Coach's Eye on Android (there are a few others), and they all have these features.
I have looked into several choices so far:
First, the built-in Android MediaPlayer, which can't really do anything I need.
Then the MediaExtractor/decoder route, looking through the code for Grafika (https://github.com/google/grafika), which can't play backwards.
Tried to pull out each frame as needed with MediaMetadataRetriever, which is too slow (~100 ms per frame), and the GC gets in the way.
Looked for a library that could potentially solve the issue, without luck so far.
With MediaExtractor I already have the ability to play back video forward, frame by frame or at full speed. But I do not have that luxury in reverse, and seeking does take some time, since I need it to be artifact-free.
When trying to go in reverse, even seeking to a previous sync frame and then advancing to the frame just before the current one is not doable without huge lag (as expected).
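To illustrate the scale of the reverse problem: stepping backwards effectively means decoding a whole group of pictures (sync frame to sync frame) forward and caching it, so the bookkeeping alone would look something like this sketch (videoTrack stands in for the selected video track index):

    import android.media.MediaExtractor;
    import java.io.IOException;
    import java.util.ArrayList;
    import java.util.List;

    // Index frame timestamps grouped by GOP (sync-to-sync). A reverse player
    // would decode one such group forward, cache the decoded frames, and then
    // display the cached frames in reverse order.
    static List<List<Long>> indexGops(String path, int videoTrack) throws IOException {
        MediaExtractor extractor = new MediaExtractor();
        extractor.setDataSource(path);
        extractor.selectTrack(videoTrack);
        List<List<Long>> gops = new ArrayList<>();
        List<Long> current = null;
        while (extractor.getSampleTime() >= 0) {
            if ((extractor.getSampleFlags() & MediaExtractor.SAMPLE_FLAG_SYNC) != 0) {
                current = new ArrayList<>(); // a new GOP starts at every sync frame
                gops.add(current);
            }
            if (current != null) {
                current.add(extractor.getSampleTime());
            }
            if (!extractor.advance()) {
                break;
            }
        }
        extractor.release();
        return gops;
    }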
I am wondering if there is a better codec I could use or a library I have yet to stumble upon, and I would rather avoid having to create something custom in native code if possible.
Thanks in advance
Michael
Related
My android app plays videos in Exoplayer 2, and now I'd like to play a video backwards.
I searched around a lot and found only the idea of converting it to a GIF, and this from WeiChungChang.
Is there any more straightforward solution? Another player or a library that implements this for me is probably too much to ask, but converting to a reversed GIF gave me a lot of memory problems, and I don't know what to do with the WeiChungChang idea. Playing only MP4 in reverse would be enough, though.
Videos are frequently encoded such that the encoding of a given frame depends on one or more frames before it, and sometimes on one or more frames after it as well.
In other words to create the frame correctly you may need to refer to one or more previous and one or more subsequent frames.
This allows a video encoder to reduce file or transmission size by fully encoding only the reference frames, sometimes called I-frames, and storing just the delta to those reference frames for the frames before and/or after them.
Playing a video backwards is not a common player function and the player would typically have to decode the video as usual (i.e. forwards) to get the frames and then play them in the reverse order.
You could extend ExoPlayer to do this yourself, but it may be easier to manipulate the video on the server side first, if possible: there are tools that will reverse a video, and your players will then be able to play it as normal, for example https://www.videoreverser.com or https://www.kapwing.com/tools/reverse-video.
If you need to reverse it on the device for your use case, you could use ffmpeg to achieve this; see an example ffmpeg command here:
https://video.stackexchange.com/a/17739
If you are using ffmpeg, it is generally easiest to use it via a wrapper on Android such as this one, which will also allow you to test the command before you add it to your app:
https://github.com/WritingMinds/ffmpeg-android-java
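For illustration, the reverse command wired into that wrapper could look roughly like this; the class and callback names follow the wrapper's README, so treat it as a sketch and verify them against the version you actually bundle (inputPath/outputPath are placeholders, and the wrapper's loadBinary() step is assumed to have already succeeded):

    import android.content.Context;
    import com.github.hiteshsondhi88.libffmpeg.ExecuteBinaryResponseHandler;
    import com.github.hiteshsondhi88.libffmpeg.FFmpeg;
    import com.github.hiteshsondhi88.libffmpeg.exceptions.FFmpegCommandAlreadyRunningException;

    // Sketch: reverse a clip on-device. "-vf reverse" is a standard ffmpeg
    // video filter; "-an" drops the audio track (no sound is needed here).
    static void reverseClip(Context context, String inputPath, String outputPath)
            throws FFmpegCommandAlreadyRunningException {
        FFmpeg ffmpeg = FFmpeg.getInstance(context);
        String[] cmd = {"-i", inputPath, "-vf", "reverse", "-an", outputPath};
        ffmpeg.execute(cmd, new ExecuteBinaryResponseHandler() {
            @Override public void onSuccess(String message) {
                // outputPath now plays backwards in any ordinary player
            }
            @Override public void onFailure(String message) {
                // ffmpeg's console output arrives here for debugging
            }
        });
    }

Be aware that ffmpeg's reverse filter buffers the entire clip in memory, so for long recordings the usual workaround is to split the video into segments, reverse each segment, and concatenate them in reverse order.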
Note that video manipulation is time- and processor-hungry, so this may be slow and consume more battery than you want on your mobile device if the video is long.
I am trying to achieve smooth video scrubbing with the Android VideoView. The seekTo method of MediaPlayer is not doing exactly what I want: it does not seek to the exact millisecond I pass in, but jumps to (and plays from) the nearest position instead. The frames also show with large gaps between them, not the exact frame for each millisecond.
Searching around, I found that SEEK_CLOSEST_SYNC can only seek to the nearest sync frame, not the exact position; it depends on the way the video was generated.
How can I achieve smooth scrubbing and seek to the exact position, whether the video is paused or playing? Is it possible with the Android VideoView or MediaPlayer classes, or should I change my approach?
This is not a straightforward question to answer. I spent a month researching and implementing it myself.
The only way this can be achieved is using MediaCodec.
You can look at this project that does exact scrubbing using MediaCodec. The only issue I had with that project is that it takes time to load the buffer and display it on the surface, about 0.5 seconds.
You can also have a look at Grafika, especially the MoviePlayer class. Another source is the Bigflake website.
What I ended up doing was creating my own implementation using MediaCodec, as none of them provided exactly what I was looking for: I was building a golf analyzer, so I needed real-time, frame-precise scrubbing of the video.
You can also look at a similar question I asked, where I added a lot of info on how I achieved this. Although that question is more about playing a video frame by frame, I think there is valuable info in it.
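To make that concrete, here is a minimal sketch of the seek technique those sources describe: jump to the sync frame at or before the target with MediaExtractor, then decode forward with MediaCodec, rendering only the frame whose timestamp reaches the target. videoTrack, targetUs and surface are placeholders, and error handling is trimmed (getInputBuffer needs API 21+):

    import android.media.MediaCodec;
    import android.media.MediaExtractor;
    import android.media.MediaFormat;
    import android.view.Surface;
    import java.io.IOException;
    import java.nio.ByteBuffer;

    // Decode and render exactly the frame at targetUs (microseconds).
    static void seekExact(String path, int videoTrack, long targetUs, Surface surface)
            throws IOException {
        MediaExtractor extractor = new MediaExtractor();
        extractor.setDataSource(path);
        extractor.selectTrack(videoTrack);
        MediaFormat format = extractor.getTrackFormat(videoTrack);
        MediaCodec decoder =
                MediaCodec.createDecoderByType(format.getString(MediaFormat.KEY_MIME));
        decoder.configure(format, surface, null, 0); // decode straight to the Surface
        decoder.start();

        // Start at the sync (key) frame at or before the target time.
        extractor.seekTo(targetUs, MediaExtractor.SEEK_TO_PREVIOUS_SYNC);
        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        boolean inputDone = false;
        boolean rendered = false;
        while (!rendered) {
            if (!inputDone) {
                int inIndex = decoder.dequeueInputBuffer(10000);
                if (inIndex >= 0) {
                    ByteBuffer inBuf = decoder.getInputBuffer(inIndex);
                    int size = extractor.readSampleData(inBuf, 0);
                    if (size >= 0) {
                        decoder.queueInputBuffer(inIndex, 0, size,
                                extractor.getSampleTime(), 0);
                        extractor.advance();
                    } else { // ran out of samples before reaching the target
                        decoder.queueInputBuffer(inIndex, 0, 0, 0,
                                MediaCodec.BUFFER_FLAG_END_OF_STREAM);
                        inputDone = true;
                    }
                }
            }
            int outIndex = decoder.dequeueOutputBuffer(info, 10000);
            if (outIndex >= 0) {
                boolean isTarget = info.presentationTimeUs >= targetUs;
                decoder.releaseOutputBuffer(outIndex, isTarget); // render only the target
                rendered = isTarget
                        || (info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0;
            }
        }
        decoder.stop();
        decoder.release();
        extractor.release();
    }

For scrubbing you would keep the extractor and decoder alive between seeks instead of tearing them down per frame; rebuilding the pipeline on every seek is exactly the kind of thing that produces delays like the 0.5 seconds mentioned above.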
I work on a project that requires exact seeking of a video, because the system needs to be synchronized with other devices. The OS used for video playback is Android. So far I have used the MediaPlayer class, but depending on the number of key frames, seeking is highly inaccurate.
So my next idea is to cache decoded images and wrap my own playback class around them. So far I understand how to use the MediaExtractor and MediaCodec classes to decode videos manually. The android.media.ImageReader class seems to be exactly what I want.
But what I do not understand is how to render such an android.media.Image manually once I've got it. I'd like to avoid doing the YUV-to-RGB conversion manually; a preferred method would be to put such an Image into a Surface or copy it to a SurfaceTexture somehow.
Please take a look here
In order to support use cases where videos played on several devices need to stay in sync, this player performs exact seeks.
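If that player is not an option, the usual way to sidestep the manual YUV-to-RGB question entirely is to skip ImageReader and give MediaCodec an output Surface at configure() time, so the decoder renders frames directly. A minimal sketch, assuming surfaceTexture comes from your TextureView:

    import android.graphics.SurfaceTexture;
    import android.media.MediaCodec;
    import android.media.MediaFormat;
    import android.view.Surface;
    import java.io.IOException;

    // The decoder renders into the Surface itself; no manual YUV->RGB needed.
    static MediaCodec createSurfaceDecoder(MediaFormat format, SurfaceTexture surfaceTexture)
            throws IOException {
        Surface surface = new Surface(surfaceTexture);
        MediaCodec decoder =
                MediaCodec.createDecoderByType(format.getString(MediaFormat.KEY_MIME));
        decoder.configure(format, surface, null, 0);
        decoder.start();
        // Feed it with MediaExtractor as usual; for each decoded frame,
        // releaseOutputBuffer(index, true) pushes the frame to the Surface.
        return decoder;
    }

If you do need to cache decoded frames for exact, repeatable seeking, one common pattern is to attach the SurfaceTexture to your own OpenGL context and copy each rendered frame into a texture; Grafika demonstrates variations of this.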
I'm working on an Android app that plays video (using VideoView). The video is meant to have both music (left and right channels) and narration, but I want to be able to selectively turn off the narration track in the MediaPlayer.
Is the correct way to do this to encode my MP4 video file with three audio tracks (right, left, and narration) and then turn off the narration track with deselectTrack()?
It is not clear to me from the documentation whether MediaPlayer can handle more than two audio tracks.
If the audio tracks are limited to two, would it make sense to run two MediaPlayers simultaneously (syncing them up with seekTo()) when I want the narration track to play?
Thanks.
Sorry to burst your bubble, but...
1) You have a misunderstanding about what a "track" denotes. A track can have multiple channels (e.g., a stereo track has left and right channels). As I understand it, stereo is the extent of the Android AudioTrack implementation at present. I haven't yet checked if the OpenSL implementation is more extensive than the Java API.
2) Only 1 audio track can be selected at a time, so you wouldn't be able to have background and narration simultaneously in the way you were thinking.
3) Audio tracks can only be selected in the prepared state (i.e., not after playback has started). The documentation mentions this limitation is not ideal, so it will probably change in the future. If not for this problem, your goal could be accomplished with two audio tracks encoded in the stream, one with both background & narration, the other just background.
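For reference, track selection in the prepared state looks roughly like this; getTrackInfo(), selectTrack() and deselectTrack() are the standard MediaPlayer calls (API 16+), and path is a placeholder:

    import android.media.MediaPlayer;
    import java.io.IOException;

    // Sketch: pick the first audio track while prepared, which is the only
    // window the limitation in point 3 above allows.
    static void selectFirstAudioTrack(MediaPlayer player, String path) throws IOException {
        player.setDataSource(path);
        player.prepare(); // must be in the prepared state for (de)selectTrack
        MediaPlayer.TrackInfo[] tracks = player.getTrackInfo();
        for (int i = 0; i < tracks.length; i++) {
            if (tracks[i].getTrackType()
                    == MediaPlayer.TrackInfo.MEDIA_TRACK_TYPE_AUDIO) {
                player.selectTrack(i); // deselectTrack(i) would disable it again
                break;
            }
        }
    }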
You will probably find it difficult to synchronize two MediaPlayers, but I haven't tried. Maybe this approach would be acceptable for your situation, although be forewarned the seekTo method isn't accurate. It depends on the encoding of the files.
Something I would try if I were you is to have two complete encoded videos, one with narration, the other without. Use two MediaPlayers and keep them both prepared. When you want to switch use seekTo to put the correct one at (or near) the desired location. That way you don't have to worry about synchronization. If the video is large, this method could use significantly more resources, though.
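A sketch of that switch, assuming both players were prepared up front (one file with narration mixed in, one without) and share a SurfaceView; as noted, seekTo is only near-exact:

    import android.media.MediaPlayer;
    import android.view.SurfaceHolder;

    // Swap playback from the visible player to the prepared standby one.
    static void swapPlayers(MediaPlayer from, MediaPlayer to, SurfaceHolder holder) {
        int positionMs = from.getCurrentPosition();
        from.pause();
        from.setDisplay(null);  // detach the surface before handing it over
        to.setDisplay(holder);
        to.seekTo(positionMs);  // lands on the nearest seekable position
        to.start();
    }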
What I am attempting to do is create an application that adds effects to videos while recording. Is there any way to have a callback method receive each frame, apply an effect to it, and have that recorded? There is currently an application on the Android Market (Videocam Illusion) that claims to be the only application that can do this. Does anyone know how Videocam Illusion does this, or have links to tutorials outlining video processing for Android?
This is a similar question that is unanswered:
Android preview processing while video recording
Unfortunately (unless I'm unaware of some other method provided by the API), the way this is done is by opening a direct stream to the camera and manipulating it with some sort of native code. I've done something similar before when I was working on an eye tracker, so I'll tell you basically how it works:
Open a stream using the NDK (or possibly the Java API, depending on the implementation).
Modify the bytes of the stream; each frame is sent as a separate packet. You have to grab each packet from the camera and modify it: you can replace colors or translate the image, and you can also use OpenGL to modify it entirely, adding things like glass effects.
Flatten the images back out.
Send the image over to the view to be displayed.
One thing you have to be mindful of is that grabbing and sending the packets and images has to happen in about 1/30th of a second (roughly 33 ms) per frame, so the code has to be extremely optimized.
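As a concrete illustration of the frame hook described in the steps above, here is a minimal sketch using the classic android.hardware.Camera preview callback (the API of this question's era, since deprecated); it assumes a preview display or texture has already been set on the camera:

    import android.hardware.Camera;

    // Sketch: intercept each preview frame for modification.
    static void startFilteredPreview(Camera camera) {
        Camera.Size size = camera.getParameters().getPreviewSize();
        // NV21 uses 12 bits per pixel, hence width * height * 3 / 2 bytes.
        byte[] buffer = new byte[size.width * size.height * 3 / 2];
        camera.addCallbackBuffer(buffer);
        camera.setPreviewCallbackWithBuffer(new Camera.PreviewCallback() {
            @Override
            public void onPreviewFrame(byte[] data, Camera cam) {
                // `data` is one NV21 frame: modify the pixels here (color
                // replacement, translation, ...) and hand the result to your
                // encoder or renderer. Everything here must stay well under
                // ~33 ms to hold 30 fps.
                cam.addCallbackBuffer(data); // recycle the buffer for the next frame
            }
        });
        camera.startPreview();
    }

Note that recording the modified frames rules out MediaRecorder on its own; you would feed the processed frames to a MediaCodec encoder (or apply the effect in OpenGL and record that surface), which is a large part of why so few apps pull this off.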