I am working on an Android app that needs to do the following:
- capture an (animated) view to video, including audio (from an mp3 file)
- encode the captured video (probably a bunch of raw image buffers) and audio to AVI.
After searching, FFMPEG seems the most suitable. Does anybody have sample code to accomplish what I need? I would really appreciate it.
Whyhow
It's not clear what you mean by 'a (animated) view' to capture, but be aware that Android apps running with normal permissions cannot access the raw framebuffer. The computation part of ffmpeg builds in the NDK without undue work and there's a lot you can read about on the web, but the output (or in your case input) drivers are a bit of a permissions problem. Also, you should expect encoding to be much slower than real time unless you can somehow manage to leverage the hardware acceleration features of your particular device's SoC.
If you are building your app for Android, you can use .avi writer code. You can get this code from the Koders website; search for "Koders site" on Google and you will get the link. I have tested the .avi file writer code and it's working fine.
Related
Is it possible, and if so how, to create a "fake" Camera in an Android application? By "fake" I mean an all-software creation that simply looks like a regular Camera to the OS but in actuality takes a Bitmap or byte array as its input data. I want to use such a device with a MediaRecorder to create H.264 videos.
Things this could be used for:
Image slideshow video creation
Screen capture to video file
Caveats: No rooting and no ROM modification
I think what you are looking for is a way to encode videos to H.264 in a way similar to what MediaRecorder does but not from the camera. You do not particularly care whether this is done with a "fake camera" or in some other way, correct? In that case...
You can use the MediaCodec API available in Android 4.1 and later. You can just give it a series of images and it will create video encoded with (where available) the hardware encoder. Some sample code: "Create video from screen grabs in android" and "Encoding H.264 from camera with Android MediaCodec".
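If it helps, here is a rough sketch of that approach, using the pre-Surface, buffer-based MediaCodec style from API 16. The color format constant and the conversion of your images to YUV are device-dependent, and end-of-stream/muxing handling is left out, so treat it as a starting point rather than working code:

```java
import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;

import java.io.FileOutputStream;
import java.io.IOException;
import java.nio.ByteBuffer;

// Sketch: feed raw YUV frames to the hardware H.264 encoder (API 16+ buffer style).
// Converting your Bitmaps to YUV is up to you, and the color format the encoder
// accepts varies per device; EOS signaling and format-change handling are omitted.
public class H264FrameEncoder {
    private static final String MIME = "video/avc";

    public static void encode(byte[][] yuvFrames, int width, int height,
                              FileOutputStream rawH264Out) throws IOException {
        MediaFormat format = MediaFormat.createVideoFormat(MIME, width, height);
        format.setInteger(MediaFormat.KEY_BIT_RATE, 2_000_000);
        format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar);

        MediaCodec codec = MediaCodec.createEncoderByType(MIME);
        codec.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        codec.start();

        ByteBuffer[] inputBuffers = codec.getInputBuffers();
        ByteBuffer[] outputBuffers = codec.getOutputBuffers();
        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        long frameDurationUs = 1_000_000L / 30;

        for (int i = 0; i < yuvFrames.length; i++) {
            int inIndex = codec.dequeueInputBuffer(10_000);
            if (inIndex >= 0) {
                ByteBuffer in = inputBuffers[inIndex];
                in.clear();
                in.put(yuvFrames[i]);
                codec.queueInputBuffer(inIndex, 0, yuvFrames[i].length,
                        i * frameDurationUs, 0);
            }
            // Drain whatever encoded output is ready and write it out.
            int outIndex = codec.dequeueOutputBuffer(info, 0);
            while (outIndex >= 0) {
                ByteBuffer out = outputBuffers[outIndex];
                byte[] chunk = new byte[info.size];
                out.position(info.offset);
                out.get(chunk);
                rawH264Out.write(chunk);
                codec.releaseOutputBuffer(outIndex, false);
                outIndex = codec.dequeueOutputBuffer(info, 0);
            }
        }
        codec.stop();
        codec.release();
    }
}
```

The output above is a raw H.264 elementary stream; to get a playable .mp4 you would hand the same encoded buffers to MediaMuxer (API 18+) instead of writing them straight to a file.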
If you are expecting to affect other apps with your "fake Camera", that is only possible by modifying the Android source code and rolling your own ROM mod.
Yes, you can!
With no rooting and no ROM modification, the best way to do this is to build a virtual app that runs the other app as a plugin, so that you can modify anything in the target app. There is a lot of work involved, but the good news is that there are several open source projects that do this.
After that, the next part is not so difficult: you only have to hook the few .so libs in /system/lib that affect camera recording.
In fact, I have done this on my device, but I modified the system lib directly, which of course requires root. It works well in almost all apps, except some apps that use the service to capture video.
For those you have to modify the service lib, which is a little more difficult.
I am creating an Android app for recording the Android screen (on a rooted phone), the way Fraps/Camtasia does on a PC. I now have a set of Bitmap images which I want to convert to a video format.
I have looked over the internet for ways to do so. There is a way to use FFMPEG, which involves building it for the Android phone, but it is too complicated with JNI code, etc. I even tried looking up the MPEG standards so that I could write an encoder myself, but I can't find any information anywhere.
How do I go about doing this?
Digvijay
I have a requirement where I need to transcode small video clips shot with the native camera app into a lower bitrate/resolution MP4 that is shareable via email etc.
What is the best way to transcode/convert the video on the device itself? FFMPEG or some other library?
P.S. I know this is overkill for the device, but the client leaves me with no option. He doesn't care about battery or the time it takes. I'm targeting quad-cores, where CPU is not a problem.
Your best bet would be to use something like ffmpeg, which has been ported to Android (see this SO post: "ffmpeg for a android (using tutorial: "ffmpeg and Android.mk")", and the ffmpeg port for Android here: http://bambuser.com/opensource). You'll have to use JNI etc., but that will save you the hassle of dealing with the byte stream yourself.
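The Java side of such a setup is mostly boilerplate; the real work is in the C wrapper you compile with the NDK. A minimal sketch, where the library name and the native method are entirely hypothetical and depend on how you wrap ffmpeg on the native side:

```java
// Hypothetical Java-side JNI wrapper around an ffmpeg build done with the NDK.
// The library name ("ffmpegwrapper") and the native method are placeholders; the
// real entry point depends on how you wrap ffmpeg in C (e.g. calling into
// libavformat/libavcodec, or invoking ffmpeg's main() with an argv array).
public class FfmpegTranscoder {
    static {
        System.loadLibrary("ffmpegwrapper"); // loads libffmpegwrapper.so from your APK
    }

    // Convention chosen here: returns 0 on success, a negative error code otherwise.
    public static native int transcode(String inputPath, String outputPath,
                                       int targetWidth, int targetHeight, int targetBitrate);
}
```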
Haven't tried it on Android myself, so YMMV:
Is there a Java API for mp4 files?
http://code.google.com/p/mp4parser/
If you're recording on-device, why not set the expected format from your code? It appears the API lets you set video size, frame rate, etc. in the MediaRecorder class.
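For example, something along these lines (the exact sizes/bitrates your device's encoder accepts will vary, and the output path is just a placeholder):

```java
import android.media.MediaRecorder;
import java.io.IOException;

// Sketch: constrain size/frame rate/bitrate at record time so no transcoding is
// needed afterwards. Which combinations work depends on the device's encoder.
public class SmallClipRecorder {
    public static MediaRecorder startRecording(String outputPath) throws IOException {
        MediaRecorder recorder = new MediaRecorder();
        recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
        recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
        recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
        recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
        recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
        recorder.setVideoSize(640, 480);              // modest, shareable resolution
        recorder.setVideoFrameRate(24);
        recorder.setVideoEncodingBitRate(1_000_000);  // ~1 Mbit/s keeps files email-sized
        recorder.setOutputFile(outputPath);
        recorder.prepare();
        recorder.start();
        return recorder;                              // call stop()/release() when done
    }
}
```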
I know how to capture video on an Android device, but I would like to capture video and overlay some other information on it, e.g. a funny time clock, and save it all to a file so the person watching the video sees the exact time of capture. I would also like to add a watermark.
Do you know how I can do this, or is it even possible on an Android device? I read the API but couldn't find anything that could help me.
I was asked this question a short time ago, and we came up with a fallback plan: send your footage to a server and let that (using ffmpeg?) apply the watermark, save the file, and send a link back to the phone. Maybe that's a route to take?
edit:
There seems to be an Android port possible for FFMPEG. See for instance this link: http://gitorious.org/~olvaffe/ffmpeg/ffmpeg-android
I haven't had the time to compile it myself, but it seems you can either use normal FFMPEG with the NDK, or use this version to compile for Android. It's a bit more work, but it looks doable.
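If you do end up doing it on the device, the overlay itself is the easy half: drawing a timestamp and a watermark onto a frame Bitmap with Canvas is straightforward. A minimal sketch, assuming you somehow have each frame as a Bitmap; getting the stamped frames back into an encoded video file is the hard part, as the next answer points out:

```java
import android.graphics.Bitmap;
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.Paint;

import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.Locale;

// Sketch of the overlay step only: stamp the current time (and optionally a logo)
// onto a single frame. Re-encoding the stamped frames is not covered here.
public class FrameStamper {
    private static final SimpleDateFormat FMT =
            new SimpleDateFormat("yyyy-MM-dd HH:mm:ss", Locale.US);

    public static Bitmap stamp(Bitmap frame, Bitmap watermark) {
        Bitmap out = frame.copy(Bitmap.Config.ARGB_8888, true); // mutable copy to draw on
        Canvas canvas = new Canvas(out);

        Paint text = new Paint(Paint.ANTI_ALIAS_FLAG);
        text.setColor(Color.WHITE);
        text.setTextSize(28f);
        canvas.drawText(FMT.format(new Date()), 16f, out.getHeight() - 16f, text);

        if (watermark != null) {
            canvas.drawBitmap(watermark, out.getWidth() - watermark.getWidth() - 16f, 16f, null);
        }
        return out;
    }
}
```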
I actually don't think that's possible. You can fetch video frames from a camera preview, but there's no good way to encode them to video. The standard video encoder (MediaRecorder) can only record the actual direct camera input into a video file.
I know Android doesn't support MJPEG natively but are there any jar files/drivers available that can be added to a project to make it possible?
There is a View available to display MJPEG streams:
Android and MJPEG Topic
Hardly, unless it's your Android platform (i.e. you are the integrator of special-purpose devices running Android).
A good place to start looking on how the Android framework handles video streams is here:
http://opencore.net/files/opencore_framework_capabilities.pdf
If you want to cook up something entirely incompatible, I guess you could do that with the NDK, jam ffmpeg in there, and with a bit of luck (and a nightmare supporting different Android devices) have it working.
What is the root problem you are trying to solve? Perhaps we could work something out.
You can of course write or port software to handle any documented video format; the problem is that you won't have the same degree of hardware-optimized code as the built-in video codecs, and won't have as efficient low-level access to the framebuffer. So your code is likely not to be able to play back at full speed. Sometimes that might be okay, if you just want to get a sense of something. Also, MJPEG compresses frames individually, so it should be trivial to write something that just skips a lot of frames and only decodes whatever fraction of them it can keep up with.
I think that some people have managed to build ffmpeg or mplayer using the optional features of the cpus in some phones and get to full frame rate for some videos, but it's tricky and device-specific.
I'm probably stating the obvious here, but MJPEG consists simply of multiple JPEGs. If you can grab the individual frames by cutting the stream at the JPEG boundaries, you can probably display that data like any other image.
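Something along these lines should work for pulling frames out of the stream: it scans for the JPEG start/end markers and hands each chunk to BitmapFactory. A minimal sketch; real MJPEG-over-HTTP streams also carry multipart boundaries and headers between frames, which this simply skips because it only starts collecting at a JPEG start marker:

```java
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;

import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

// Sketch: extract individual JPEG frames from an MJPEG byte stream by scanning for
// the JPEG start-of-image (FF D8) and end-of-image (FF D9) markers, then decode
// each one with BitmapFactory. Anything between frames (multipart headers) is skipped.
public class MjpegFrameReader {
    private final InputStream in;

    public MjpegFrameReader(InputStream in) {
        this.in = in;
    }

    /** Returns the next decoded frame, or null at end of stream. */
    public Bitmap nextFrame() throws IOException {
        ByteArrayOutputStream frame = new ByteArrayOutputStream();
        int prev = -1, cur;
        boolean inFrame = false;
        while ((cur = in.read()) != -1) {
            if (!inFrame) {
                if (prev == 0xFF && cur == 0xD8) {      // start of a JPEG
                    inFrame = true;
                    frame.write(0xFF);
                    frame.write(0xD8);
                }
            } else {
                frame.write(cur);
                if (prev == 0xFF && cur == 0xD9) {      // end of the JPEG
                    byte[] bytes = frame.toByteArray();
                    return BitmapFactory.decodeByteArray(bytes, 0, bytes.length);
                }
            }
            prev = cur;
        }
        return null;
    }
}
```

To keep up on a slow device you could decode only every Nth frame returned by nextFrame(), as suggested above.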
I couldn't find any information on when exactly this was implemented, but as of now (testing on Android 8) you can view an MJPEG stream just fine using a WebView.
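For instance, a bare-bones viewer; the stream URL below is a placeholder for your camera's address:

```java
import android.app.Activity;
import android.os.Bundle;
import android.webkit.WebView;

// Minimal sketch: point a WebView directly at the MJPEG URL and let it render the stream.
public class MjpegViewerActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        WebView webView = new WebView(this);
        setContentView(webView);
        webView.loadUrl("http://192.168.1.10:8080/video"); // placeholder stream URL
    }
}
```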