I am working on an Android application which is supposed to play videos over HTTP on Android devices. Before we set up a server to host the video files, I wanted a few things clarified:
As per the developer documentation, Android supports the .mp4 and .3gp container formats for video. If we use H.263 (video) and AAC LC (audio) as the codecs for our media files, will we be able to play the video by passing the URL to the MediaPlayer class?
I did a little experiment and passed the URL of one of the video files (.mp4) to the MediaPlayer class and got the following error:
Command PLAYER_INIT completed with an error or info PVMFErrContentInvalidForProgressivePlayback
From the docs, I learned that for progressive playback, the video's index (e.g. the moov atom) should be at the start of the file.
Questions:
1. How do we make our videos Android-ready?
2. What are the different considerations that we need to make?
Please help.
Thanks.
You can actually achieve this using JCodec ( http://jcodec.org ), a pure Java implementation of the ISO BMF (MP4) container. Use the following code:
// Parse the MP4 structure, then rewrite the file with the moov atom up front
MovieBox movie = MP4Util.createRefMovie(new File("bad.mp4"));
new Flattern().flattern(movie, new File("good.mp4"));
The side effect of 'Flattern' is a web-optimized movie file that has its header (the moov atom) BEFORE the media data.
You can also use similar functionality from command line:
java -cp jcodec-0.1.3-uberjar.jar org.jcodec.movtool.WebOptimize <movie>
The JCodec library can be downloaded from the project website.
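If you have ffmpeg available, the same moov relocation can also be done there; a minimal sketch, assuming the streams themselves are already Android-compatible so only the container needs rewriting:
ffmpeg -i bad.mp4 -c copy -movflags +faststart good.mp4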
I cross-posted this question on the android-developers Google group. Mark answered it there. Thanks Mark!
See this thread
I want to build an Android app that records a voice and plays a reversed version of it.
I searched everywhere, and these are the links I found:
First: describes an approach without any code, which left me confused!
Second: no answers so far!
Third: working code, but for Swift.
Fourth: a working approach in Java, but not Android!
Fifth: I'm not sure it is the solution.
Sixth: I compiled it and modified it, but it stops suddenly while recording.
Seventh: For Swift there is AVFoundation.
Eighth: Not working.
Help me!!
You must record the audio as raw PCM with AudioRecord instead of MediaRecorder, which uses an encoder that compresses and changes the output. Once you have the recorded PCM file, you can prepend a 44-byte header to convert it to WAV format so it is playable on devices.
If you want to reverse it, use a loop to read its bytes back to front (with 16-bit PCM you must move 2 bytes at a time), then add the header and play it; a sketch follows.
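Here is a minimal sketch of that idea in Java. The class and method names are mine, and it assumes a mono, 16-bit PCM capture (e.g. 44100 Hz); adjust the header parameters to match your AudioRecord configuration:
import java.io.DataInputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class ReversePcm {

    // Reverses a raw 16-bit PCM recording sample-by-sample and wraps it in a WAV header.
    public static void reverseToWav(File pcm, File wav, int sampleRate) throws IOException {
        byte[] data = new byte[(int) pcm.length()];
        try (DataInputStream in = new DataInputStream(new FileInputStream(pcm))) {
            in.readFully(data);
        }
        // Swap whole 2-byte samples, not single bytes, or the audio becomes noise.
        byte[] reversed = new byte[data.length];
        for (int i = 0; i < data.length; i += 2) {
            int j = data.length - 2 - i;
            reversed[i] = data[j];
            reversed[i + 1] = data[j + 1];
        }
        try (FileOutputStream out = new FileOutputStream(wav)) {
            out.write(wavHeader(reversed.length, sampleRate, 1, 16));
            out.write(reversed);
        }
    }

    // The standard 44-byte RIFF/WAVE header for uncompressed PCM.
    private static byte[] wavHeader(int dataLen, int sampleRate, int channels, int bits) {
        ByteBuffer b = ByteBuffer.allocate(44).order(ByteOrder.LITTLE_ENDIAN);
        b.put("RIFF".getBytes()).putInt(36 + dataLen).put("WAVE".getBytes());
        b.put("fmt ".getBytes()).putInt(16).putShort((short) 1)          // PCM = format tag 1
         .putShort((short) channels).putInt(sampleRate)
         .putInt(sampleRate * channels * bits / 8)                       // byte rate
         .putShort((short) (channels * bits / 8)).putShort((short) bits);
        b.put("data".getBytes()).putInt(dataLen);
        return b.array();
    }
}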
Good Luck.
I am trying to play a video on my TV using the Chromecast framework provided. The approach I followed for playing an mp4 works fine. But now I have a different source to play: a video file pointed to by an m3u8 file placed on my server.
So, for playing the m3u8 file on the TV, I am using the following MediaInfo object, with the variants for content-type mentioned here.
The MediaInfo object I am returning is:
return new MediaInfo.Builder(Uri.parse(path).toString())
.setStreamType(MediaInfo.STREAM_TYPE_LIVE)
.setContentType("videos/mp4") //need to know **what should be content-type here**
.setMetadata(movieMetadata)
.setStreamDuration(mSelectedMedia.getData().getDuration() * 1000)
.build();
Please guide me on playing an m3u8 file on my Chromecast or TV.
Thanks
First of all, I don't see what issue you are running into; always include that in your post so that you can get a better response. Based on the description you have provided, it seems you are casting a playlist pointing at some files; if that is the case, you shouldn't set the stream type to live stream. Instead, use the buffered type (as you would for a simple mp4). Secondly, what receiver are you using? Your receiver should be capable of handling an m3u8 playlist; if you use a Styled or the Default receiver (or the Reference receiver from our GitHub repo), you should be fine. Finally, make sure you are using https for the video streams (for playlists it is required) and that your server supports CORS headers.
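Putting that together, a sketch of the builder along those lines (the variables are the ones from the question; the HLS MIME type "application/x-mpegurl" is my assumption for the content type):
return new MediaInfo.Builder(Uri.parse(path).toString())
        .setStreamType(MediaInfo.STREAM_TYPE_BUFFERED)   // buffered, not live
        .setContentType("application/x-mpegurl")         // m3u8 playlist MIME type
        .setMetadata(movieMetadata)
        .setStreamDuration(mSelectedMedia.getData().getDuration() * 1000)
        .build();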
As suggested by #naddaf, the problem was with CORS. So I simply allowed the domains (gstatic.com and one more in my case) on the server from which I am getting requests for my media, and it all started working perfectly.
I have a mov file URL which I have to play using VideoView. But Android does not support that, as per http://developer.android.com/guide/appendix/media-formats.html
So is there any way to play a mov URL video using VideoView, or to remultiplex (or re-encode, depending on the source) it into something that Android plays nice with, e.g. an mp4 container?
You can use ffmpeg to convert the file on the device if necessary, but integrating ffmpeg is not trivial, and converting a video on the device is compute intensive, so it will take time and use up the battery. If it is possible to change the format server side, it is generally much easier.
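For example, if the source mov already contains Android-friendly codecs (say H.264 video and AAC audio), a server-side remux into mp4 avoids re-encoding entirely; a sketch, assuming such a source:
ffmpeg -i input.mov -c copy output.mp4
If the codecs are not supported, drop -c copy and let ffmpeg re-encode instead.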
If you do want to use ffmpeg, a wrapper approach may be useful; this project provides an example: https://github.com/jhotovy/android-ffmpeg. I have used a similar approach and it works fine.
In particular, make sure you note the comments about calling ffmpeg twice.
There are players on Google Play that will play mov files, but not everyone has had a good experience with them (for example: https://stackoverflow.com/a/27006587/334402). VLC in particular is a common choice, and projects exist to integrate it into Android apps (although maybe not via web views), e.g.: https://github.com/mrmaffen/vlc-android-sdk
I'm working on an image processing project for the Parrot AR.Drone, using OpenCV4Android. I'm new to the whole thing!
Does anyone have an idea how to read video streams from the AR.Drone using OpenCV? The samples only show how to get video input from a webcam.
The video is encoded in H.264 format, and the drone adds a proprietary header (called PaVE) to every video frame; apparently that's why Android fails to load the video stream.
Thanks
You need a PaVE parser that will strip the PaVE headers off the H.264 frames before you can decode them and feed them to OpenCV.
There are several PaVE parsers around; maybe you can use one as-is, or adapt it for your use (a sketch of the stripping step follows this list):
The official AR.Drone SDK (downloadable here: https://projects.ardrone.org/) includes C code for decoding PaVE; see the video_com_stage.c, video_stage_tcp.c, video_stage_decoder.c and video_stage_ffmpeg_decoder.c files in its ARDroneLib/Soft/lib/ardrone_tool/Video folder
Javascript (part of the node-ar-drone project): https://github.com/felixge/node-ar-drone/blob/master/lib/video/PaVEParser.js
C gstreamer module: https://projects.ardrone.org/boards/1/topics/show/4282
ROS drivers (by Willow Garage, who also created OpenCV): https://github.com/AutonomyLab/ardrone_autonomy
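For illustration, here is a minimal sketch in Java of the stripping step. The field offsets (header_size at byte 6, payload_size at byte 8, both little-endian) are taken from the parsers listed above; treat them as assumptions and verify against the SDK headers:
import java.io.DataInputStream;
import java.io.EOFException;
import java.io.IOException;

public class PaveStripper {

    // Reads one PaVE-wrapped frame from the drone's video stream and
    // returns the bare H.264 payload, or null at end of stream.
    public static byte[] nextFrame(DataInputStream in) throws IOException {
        byte[] header = new byte[12];
        try {
            in.readFully(header);
        } catch (EOFException eof) {
            return null;
        }
        if (header[0] != 'P' || header[1] != 'a' || header[2] != 'V' || header[3] != 'E') {
            throw new IOException("Lost sync: PaVE signature not found");
        }
        int headerSize = le16(header, 6);              // total header length
        int payloadSize = le32(header, 8);             // length of the H.264 frame
        in.readFully(new byte[headerSize - 12]);       // discard remaining header fields
        byte[] payload = new byte[payloadSize];
        in.readFully(payload);                         // the frame to hand to the decoder
        return payload;
    }

    private static int le16(byte[] b, int off) {
        return (b[off] & 0xFF) | ((b[off + 1] & 0xFF) << 8);
    }

    private static int le32(byte[] b, int off) {
        return (b[off] & 0xFF) | ((b[off + 1] & 0xFF) << 8)
                | ((b[off + 2] & 0xFF) << 16) | ((b[off + 3] & 0xFF) << 24);
    }
}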
I have a task which involves integrating a video decoder into Stagefright (Android's multimedia framework). I searched and found the following about creating a new plugin for Stagefright:
To add support for a new format, you need to:
Develop a new Extractor class, if the container is not supported yet.
Develop a new Decoder class, that implements the interface needed by the StageFright core to read the data.
Associate the mime-type of the files to read with your new Decoder in the OMXCodec.cpp file, in the kDecoderInfo array:
static const CodecInfo kDecoderInfo[] = {
{MEDIA_MIMETYPE_AUDIO_AAC, "OMX.TI.AAC.decode"},
{MEDIA_MIMETYPE_AUDIO_AAC, "AACDecoder"},
};
The above is all I could find on the net. Right now I have a simple app that takes a file as input and renders it on the screen using native APIs in Android. Can anyone please tell me how to proceed further? Where do OMXCodec.cpp and the others come into the picture, and in which directory of my project should I put them? Thanks in advance.
From your question, it appears that you are looking at a recommendation specific to Ice Cream Sandwich or earlier versions of Android. The first thing you should be clear about is the version of Android, i.e. Ice Cream Sandwich or before, or Jelly Bean and after, since the integration of codecs differs across Android releases.
I have already commented on your other question, which is specific to Jelly Bean and later (Reference: Android: How to integrate a decoder to multimedia framework).
If you would like to integrate your codec into Ice Cream Sandwich or before, the steps are already in your question. In addition to adding the decoder to the kDecoderInfo list, you may want to set up certain quirks as shown here.
For the question on OMXCodec.cpp: you can find this file at frameworks/base/media/libstagefright/ in the case of Ice Cream Sandwich, and at frameworks/av/media/libstagefright/ in the case of Jelly Bean.
If you have followed all the steps to integrate the video decoder into the Stagefright framework, then the easiest test would be to perform the following:
1. Copy a media file onto the SD card.
2. In OMXCodec.cpp, enable logs by removing the comment in the statement //#define LOG_NDEBUG 0 and run mm in that directory. Copy the rebuilt libstagefright.so to /system/lib on your device (see the commands sketched after these steps).
3. Enable logcat and start capturing logs.
4. Go to the gallery, select your file, and let the standard player play it.
5. Check your log file to see whether the player selected your OMX component, by searching for your component name. If found, your integration of the codec into Stagefright is successful; otherwise, you will have to debug and find out what the problem is.
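As an illustration of step 2, the push and log capture might look like this (the library path under out/ depends on your product name, so treat it as a placeholder; this assumes an engineering build with a writable /system):
adb root && adb remount
adb push out/target/product/<product>/system/lib/libstagefright.so /system/lib/
adb logcat | grep -i omx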
Postscript:
Based on your queries, I presume you aren't familiar with the Android sources. Please refer to the androidxref site to become familiar with AOSP distributions.
Unless you are planning to support a new media file format, you will not need to implement an Extractor class. MediaExtractor abstracts a file-format parser and helps de-multiplex the different tracks in a media file.
I hope that with this information you will be able to get your codec integrated and functional in Android.