I am using the MediaMuxer on Android to create an mp4. Basically I am feeding bitmaps to the muxer and encoding each bitmap as 1 second of video. Everything is working fine except that the last frame of video (the last bitmap) flashes at the very end of the video instead of holding for a full second. This occurs when sharing to Instagram, Hangouts, etc., but if I pull the mp4 up on my Mac it plays the full final second. Does anyone know what may be causing this? I am using this implementation:
How to encode Bitmaps into a video using MediaCodec?
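A likely cause: in an mp4, a frame's display duration is the gap between its timestamp and the next sample's timestamp, so the final sample has no following timestamp and stricter players show it for roughly zero time (your Mac's player is apparently more forgiving). A common workaround is to submit the last bitmap a second time, one second later, before signaling end-of-stream. A minimal sketch, assuming a hypothetical queueBitmap(bitmap, ptsUs) helper that wraps the encoder feed from the linked answer:

```java
// Each bitmap is shown until the NEXT sample's timestamp, so the real
// last frame needs a trailing sample to give it a 1-second duration.
// queueBitmap(Bitmap, long) is a hypothetical helper that renders one
// bitmap into the encoder at the given presentation time (microseconds).
static final long FRAME_US = 1_000_000L; // 1 second per bitmap

long ptsUs = 0;
for (Bitmap bmp : bitmaps) {
    queueBitmap(bmp, ptsUs);
    ptsUs += FRAME_US;
}
// Re-submit the final bitmap one second later; this extra sample is what
// makes the previous (visible) frame hold for its full second.
queueBitmap(bitmaps.get(bitmaps.size() - 1), ptsUs);
// ...then signal end-of-stream, drain the encoder, and stop the muxer.
```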
I have an audio and video stream and would like to generate an mp4. However, all the samples I've seen tend to generate the mp4 only once they have the entire stream. What I want to accomplish is to encode the stream to mp4 while still receiving the audio/video stream from the web.
It isn't clear whether an mp4 can be created on the fly or whether the encoder needs to wait until the final input data has been received. I believe this has to be possible, because when you record a video on your device using the camera, the video is available as soon as you stop recording, which means it was being encoded on the fly. So it really shouldn't matter whether the frames come from the camera or from a video stream over the web. Or am I possibly missing something?
Without knowing how an mp4 is actually generated, I'm guessing that even if the encoder has only received a few seconds of frames and audio, it can still generate that portion of the mp4 and save it to disk. At any point in time, my app should be able to stop the streaming, and the encoder just saves the final portion of the stream. It would also be nice if, should the app crash during encoding, at least the part that was already encoded got saved to disk as a valid mp4. I'm not sure if that is even possible.
Ultimately I would like to do this using the MediaCodec, OpenGL ES and a Surface.
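For what it's worth, MediaMuxer does write samples as they arrive, so encoding on the fly works; the caveat for the crash scenario is that the MP4 index (the moov box) is only written when stop() is called, so a file interrupted mid-encode is generally not a valid mp4 (fragmented MP4 or MPEG-TS avoids that, but plain MediaMuxer output does not). A minimal sketch of the usual drain loop, assuming encoder and muxer are an already-configured MediaCodec and MediaMuxer:

```java
// Drain the encoder and hand each sample to the muxer as it arrives,
// so encoding genuinely happens on the fly.
MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
boolean muxerStarted = false;
boolean done = false;
int videoTrack = -1;

while (!done) {
    int index = encoder.dequeueOutputBuffer(info, 10_000 /* µs timeout */);
    if (index == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
        // The real output format (including csd) is known only now.
        videoTrack = muxer.addTrack(encoder.getOutputFormat());
        muxer.start();
        muxerStarted = true;
    } else if (index >= 0) {
        ByteBuffer encoded = encoder.getOutputBuffer(index); // API 21+
        if ((info.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {
            info.size = 0; // csd already travels in the track format
        }
        if (muxerStarted && info.size > 0) {
            muxer.writeSampleData(videoTrack, encoded, info);
        }
        encoder.releaseOutputBuffer(index, false);
        if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
            done = true;
        }
    }
}
muxer.stop();   // the moov index is written here; only now is the file valid
muxer.release();
```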
I have a working app that streams video to Chromecast (using NanoHTTPD), and everything is working fine. Now my problem is: videos recorded on new devices are too large to stream, so I want to re-encode them to a lower bitrate.
I tried ffmpeg, but the results are not satisfactory, and it increases the APK size by 14 MB.
Now I am trying the MediaCodec API. It is faster than ffmpeg, but it takes an input file and writes to an output file, whereas I want to re-encode the byte data that is to be served by NanoHTTPD.
One solution that comes to mind is to transcode the video and then stream the output file, but that has two drawbacks:
What if the file is very long and the user doesn't watch the whole video? A lot of CPU time and battery is wasted.
What if the user fast-forwards a long video to a point that has not been re-encoded yet?
1. MediaCodec does just one thing: decode and encode. It gives you the raw bytes of the newly encoded data, and it is up to the programmer to decide whether to dump them into a container (an .mp4 file) using a muxer or to serve them directly. So there is no need to write everything back into a file.
2. Seek to the proper chunk of data and restart the MediaCodec from there.
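For point 2, a minimal sketch, assuming an already-configured MediaExtractor feeding the decoder: jump to the nearest preceding sync frame and flush the codec's buffered state before resuming.

```java
// Jump the extractor to the key frame at or before the requested time,
// then drop whatever the codec still has buffered from the old position.
void seekAndRestart(MediaExtractor extractor, MediaCodec decoder, long targetUs) {
    extractor.seekTo(targetUs, MediaExtractor.SEEK_TO_PREVIOUS_SYNC);
    decoder.flush(); // codec must be in the Executing state when flushed
    // Resume the read/feed loop; frames between the sync point and
    // targetUs can be decoded and discarded if exact seeking is needed.
}
```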
MY GOALS
1. Decode an MP4 Video.
2. Process camera frames and edit them with RenderScript (apply effects).
3. Display the camera data inside an oval on top of the background video from Goal 1.
4. Encode the frames and use the MediaMuxer to save the video.
MY PROBLEM
I can successfully achieve goals 1-3, but I am stuck on goal 4. When I use a second MediaCodec for encoding frames (the first MediaCodec is used in Goal 1), my whole app freezes and has to be force-closed.
Can Android actually handle two MediaCodecs simultaneously?
Can anyone offer any help on this please?
Thanks
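To the question itself: Android can run several MediaCodec instances at once (hardware codec sessions are a finite, per-device resource, but two is normally fine). Freezes like this usually mean both codecs are being driven from one thread with blocking calls, so one codec waits forever for buffers the other can't release. A sketch of keeping them independent with the asynchronous callback API, where decoderCallback and encoderCallback are assumed to be your MediaCodec.Callback implementations:

```java
// Each codec gets its own HandlerThread so a blocking call on one can
// never stall the other (or the UI thread).
HandlerThread decoderThread = new HandlerThread("decoder");
HandlerThread encoderThread = new HandlerThread("encoder");
decoderThread.start();
encoderThread.start();

MediaCodec decoder = MediaCodec.createDecoderByType("video/avc");
MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");

// The (callback, handler) overload is API 23; on API 21-22, call the
// one-argument setCallback from the thread that owns the target looper.
decoder.setCallback(decoderCallback, new Handler(decoderThread.getLooper()));
encoder.setCallback(encoderCallback, new Handler(encoderThread.getLooper()));
// configure() and start() each codec afterwards; all buffer handling
// then happens in the callbacks on the two dedicated threads.
```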
I am using an external camera with my application. The camera takes 9 pictures every second (9 fps). The pictures are 384x288 bitmaps. I need to create a video file from these pictures.
What I have tried:
Using JCodec
The problem: JCodec is relatively slow. For it to work properly I add the bitmaps to an ArrayList, and when recording stops I convert the list to a video. That takes too much time: for a 30-second video there is about 1 minute of rendering time.
Using the native MediaCodec
The problem: I could only generate AVI files (video/avc) that are not readable in the stock Android player. I cannot use what is written here: http://bigflake.com/mediacodec/ because I am developing for API 16. I have tried using video/mp4v-es, but the video is corrupted and not playable in any player (see the color-format sketch below).
Using FFmpeg
The problem: Very complicated to implement on Android, and I am not sure it would give me the result I need after spending the time to implement it. The result I need is to record the video stream as I receive the bitmaps, without any delay.
What can you suggest?
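One note on the MediaCodec attempt: on API 16 you must feed the encoder raw YUV through ByteBuffer input (Surface input and MediaMuxer both require API 18), so each bitmap has to be converted to the exact planar layout the device's codec advertises, and the resulting H.264 stream then wrapped into an mp4 with something like the mp4parser library. A sketch of the conversion step, assuming the codec accepts I420 (many devices want NV12/semi-planar instead, which is a common cause of "corrupted" output):

```java
// Convert one ARGB_8888 bitmap (pixels via Bitmap.getPixels) to planar
// I420. BT.601 coefficients; chroma is subsampled 2x2. Real code must
// match the color format the encoder reports in its capabilities.
static byte[] argbToI420(int[] argb, int width, int height) {
    byte[] yuv = new byte[width * height * 3 / 2];
    int yIdx = 0;
    int uIdx = width * height;
    int vIdx = uIdx + width * height / 4;
    for (int j = 0; j < height; j++) {
        for (int i = 0; i < width; i++) {
            int p = argb[j * width + i];
            int r = (p >> 16) & 0xff, g = (p >> 8) & 0xff, b = p & 0xff;
            int y = ((66 * r + 129 * g + 25 * b + 128) >> 8) + 16;
            yuv[yIdx++] = (byte) Math.max(0, Math.min(255, y));
            if ((j & 1) == 0 && (i & 1) == 0) { // one chroma pair per 2x2 block
                int u = ((-38 * r - 74 * g + 112 * b + 128) >> 8) + 128;
                int v = ((112 * r - 94 * g - 18 * b + 128) >> 8) + 128;
                yuv[uIdx++] = (byte) Math.max(0, Math.min(255, u));
                yuv[vIdx++] = (byte) Math.max(0, Math.min(255, v));
            }
        }
    }
    return yuv;
}
```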
I know how to use ffmpeg to convert an image sequence to a video.
What I want to do is start converting images to video before I have all the images ready, i.e. as soon as I start to output images, ffmpeg starts the conversion and stops when the images stop coming. Is there any way to achieve this?
Edit: I'm trying to do this on Android.
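Since ffmpeg's image2pipe demuxer reads consecutive images from stdin, one approach is to start ffmpeg as a child process and write each image into its stdin as it is produced; closing the pipe is what tells ffmpeg the images have stopped coming. A sketch, assuming an ffmpeg executable you can launch on the device (on Android this means bundling a binary) and a frames source producing Bitmaps one by one:

```java
// Stream images into ffmpeg as they are produced instead of waiting
// for the full sequence. image2pipe decodes consecutive images from
// stdin; closing stdin ends the encode cleanly.
Process ffmpeg = new ProcessBuilder(
        "ffmpeg", "-y",
        "-f", "image2pipe", "-framerate", "9", "-i", "pipe:0",
        "-c:v", "libx264", "-pix_fmt", "yuv420p", "out.mp4")
        .redirectErrorStream(true)
        .start();

try (OutputStream stdin = ffmpeg.getOutputStream()) {
    for (Bitmap bitmap : frames) { // frames arrive one by one
        bitmap.compress(Bitmap.CompressFormat.PNG, 100, stdin);
    }
} // closing stdin signals that the images have stopped coming
ffmpeg.waitFor();
```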
If you want to store the video on the SD card, you should start with the FFmpegFrameRecorder class from JavaCV (commonly used alongside OpenCV for Android). You can google it easily. It will allow you to add single frames and build a video bit by bit.
If you need to keep your video in memory, you will have to write your own frame recorder, which is not trivial but is doable, and I can help you a bit.
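For reference, the bit-by-bit usage looks roughly like this (a sketch against JavaCV's org.bytedeco.javacv classes; check the exact API against the version you bundle):

```java
// FFmpegFrameRecorder appends one frame per record() call, so the file
// grows as bitmaps arrive; AndroidFrameConverter turns a Bitmap into
// the Frame type the recorder expects.
FFmpegFrameRecorder recorder =
        new FFmpegFrameRecorder("/sdcard/out.mp4", 384, 288);
recorder.setFormat("mp4");
recorder.setFrameRate(9);                 // matches the 9 fps camera above
recorder.start();

AndroidFrameConverter converter = new AndroidFrameConverter();
// Call record() for each bitmap as it arrives; each call appends one frame.
recorder.record(converter.convert(bitmap));

recorder.stop();
recorder.release();
```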