Is there a way to play .mxf video on Android?

According to developer.android.com, Android supports video playback using the H.263, H.264 AVC, MPEG-4 SP, and VP8 codecs. However, I want to play a video encoded in the MXF format (Material eXchange Format, http://en.wikipedia.org/wiki/MXF) in my app. How do I go about it?

MXF generally contains MPEG-2 or AVC video with profiles/levels not supported by the decoders in Android devices. Specifically, IMX is 4:2:2 Profile at Main Level, and HD MPEG-2 MXF is generally 4:2:2 Profile at High Level.

The rule of thumb says you should use the most common and least resource-intensive codec. Bad video playback is a deal breaker. You may know the huge difference the format makes, but the user won't.

Related

Android MediaCodec type "video/mp4v-es" - is it the same as MPEG-4 Part 2 (MPEG-4 Visual)?

I don't know much about codecs. What I do know is that a codec encodes and decodes: codecs are built into mobile devices, and external libraries can be used as an alternative. Codecs play a big role for audio/video, since they determine the format a file is encoded in and how it is decoded for playback.
Problem:
Android API 16 ships with MediaCodec, which can do the encoding/decoding work. MediaCodec contains the MIME type constant
"video/mp4v-es"
Is it the same as the MPEG-4 Part 2 (MPEG-4 Visual) codec format?
Note: there is also MPEG-4 Part 10, which is the H.264/AVC format. I just need confirmation, or any documentation or blog links that can help me with this.
Yes.
By default, "video/mp4v-es" maps to Google's MPEG-4 Part 2 software video codec. See media_codecs_google_video.xml for details. However, on a real device it will usually be implemented by a hardware video codec, as software video codecs are processor-intensive.
For MPEG-4 Part 10 (H.264), "video/avc" has to be used.
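As a quick illustration (a sketch I'm adding, not from the original answer), you can ask the platform for a decoder by MIME type and log which component actually backs it. Note that MediaCodec.getName() needs API 18+ and the MediaFormat.MIMETYPE_* constants need API 21+:

import android.media.MediaCodec;
import android.media.MediaFormat;
import java.io.IOException;

public class CodecProbe {
    // Create a decoder for a MIME type and log the component name,
    // e.g. "OMX.google.mpeg4.decoder" (software) versus a vendor
    // component such as "OMX.qcom.video.decoder.avc" (hardware).
    public static void probe(String mime) throws IOException {
        MediaCodec codec = MediaCodec.createDecoderByType(mime);
        android.util.Log.d("CodecProbe", mime + " -> " + codec.getName());
        codec.release();
    }

    public static void dumpDefaults() throws IOException {
        probe(MediaFormat.MIMETYPE_VIDEO_MPEG4); // "video/mp4v-es"
        probe(MediaFormat.MIMETYPE_VIDEO_AVC);   // "video/avc"
    }
}

On most devices the vendor's hardware component is listed ahead of the Google software codec, which is why createDecoderByType() usually returns the hardware one.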
It's actually quite ambiguously defined, but I believe that MP4V-ES is an MPEG-4 audio/visual stream which has been fragmented and mapped to RTP packets for transport over the RTP streaming protocol.
The RFC describing this outlines an efficient and pragmatic mapping of the audio and video packets to RTP packets; for example, it does not simply assume that there is a one-to-one mapping.
More info is available in the RFC defining the format: https://www.rfc-editor.org/rfc/rfc6416

Android - Best solution to convert recorded H.264 Main Profile videos to H.264 Baseline Profile

I found that MediaPlayer cannot play videos encoded with the H.264 Main Profile, and I tried ExoPlayer and Vitamio, but neither solved my problem. Finally, I concluded the best solution is converting videos to the H.264 Baseline Profile. FFmpeg is almost 9 MB, which is too heavy for my project, so I don't want to use it to convert videos to that profile via commands. My friend suggested converting videos on the server side, but we both know it performs badly. What should I do? What is the best solution to this problem?
Android technically only supports H.264 Baseline, but many newer (usually high-end) devices will play H.264 Main Profile too. The Nexus 4, 5, 6, 7, and 10 all do, for example. So you have a few options... You either just use H.264 Main and don't care about older devices that don't support it, or you convert on the server side. Doing the conversion on the device is a bad idea: if the device doesn't support H.264 Main, that omission was probably made for performance reasons, and doing the conversion on the device and then decoding the result is going to crush the CPU.
Worth noting, ExoPlayer will use the same device codecs as MediaPlayer because it is just a wrapper around MediaCodec. Vitamio is a wrapper around ffmpeg, and it might be possible to provide an H.264 Main codec with a custom ffmpeg build, but again, if it isn't there in the first place, performance was probably an issue.
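If you take the "just use H.264 Main and skip older devices" route, you can check support at runtime instead of guessing. A minimal sketch (my addition; it assumes API 21+ for MediaCodecList(MediaCodecList.ALL_CODECS)):

import android.media.MediaCodecInfo;
import android.media.MediaCodecList;

public class ProfileCheck {
    // Scan the device's decoders and report whether any H.264
    // ("video/avc") decoder advertises Main Profile support.
    public static boolean supportsAvcMainProfile() {
        MediaCodecList list = new MediaCodecList(MediaCodecList.ALL_CODECS);
        for (MediaCodecInfo info : list.getCodecInfos()) {
            if (info.isEncoder()) continue;
            for (String type : info.getSupportedTypes()) {
                if (!type.equalsIgnoreCase("video/avc")) continue;
                MediaCodecInfo.CodecCapabilities caps = info.getCapabilitiesForType(type);
                for (MediaCodecInfo.CodecProfileLevel pl : caps.profileLevels) {
                    if (pl.profile == MediaCodecInfo.CodecProfileLevel.AVCProfileMain) {
                        return true;
                    }
                }
            }
        }
        return false; // serve Baseline content to this device
    }
}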

H.264, 720p Video Bitrate

Disclaimer: I know very little about Video codecs & encoding.
I'm developing an iOS and Android app that allows users to record videos. I want to be able to upload the videos to YouTube & have them play at 720p quality.
The videos I'm recording will always be less than 180 seconds, always be ~30fps and will always have audio.
As far as I can tell, this means I need to record at a resolution of 1280x720, then I should be good. Is this correct?
I'm trying to determine how large, on average, an H.264 video file will be per second of video. From my understanding, I need to know the bitrate of the videos. What will the bitrate of recorded H.264 video be on Android 2.2+, and iOS 5+? This Android developer page mentions a bitrate of "2Mbps" for "HD" video - is that 2 Megabytes per second or 2 Megabits per second? Will that rate be the same for any recorded H.264 video?
Part of the reason I'm so confused about this is because I did a test with 4 different Android-encoded videos of different lengths, and produced the following output (graph of the results omitted):
Wtf!?
Bonus points if you can link me to some iOS developer docs detailing this information - I've searched and can't find anything.
EDITS:
Possibly Related: H.264 file size for 1 hr of HD video
This Wikipedia article mentions that the max bitrate for Level 3.1 H.264 video (1280x720 @ 30fps) is 14,000 to 17,500 kbps.
Yeah, 720p stands for 1280x720, so I think that is correct.
To know in advance how large your video file will be, you would have to record at a constant bitrate (CBR), but I doubt the camera will use CBR, since VBR (variable bitrate) is more efficient.
Mbps stands for megabits per second.
I doubt that the rate will be the same for every video; as I stated, VBR could be used.
Edit:
Judging from the graph, it is definitely VBR.
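For a back-of-the-envelope size estimate (my addition, pretending for a moment that the nominal 2 Mbps "HD" figure from the Android docs were actually constant), file size is just bitrate times duration:

public class SizeEstimate {
    // Rough estimate only: assumes a constant 2 Mbps, which a VBR
    // encoder will not actually hold.
    public static void main(String[] args) {
        long bitrateBitsPerSec = 2_000_000; // 2 Mbps (megabits, not megabytes)
        long durationSec = 180;             // the questioner's maximum length
        long bytes = bitrateBitsPerSec * durationSec / 8;
        System.out.println(bytes / 1_000_000 + " MB"); // prints: 45 MB
    }
}

With VBR, the real files can land well off that figure in either direction, which would explain the confusing test results.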

How to detect AAC audio profile and ensure Android compatibility

I am having problems figuring out how to detect whether an AAC audio source is compatible with Android. The supported media formats page for Android says 'AAC LC/LTP' when delivered as 3GP, MPEG-4, or raw ADTS AAC. It appears that LC means 'Low Complexity' and LTP means 'Long Term Prediction', but my biggest frustration is determining which AAC profiles/modules are supported on Android. When I run the input through ffmpeg, I see it's AAC, but no extended information about the AAC. An example source is http://6693.live.streamtheworld.com:80/WTMJAMAAC_SC . Anyone have any ideas?
You can get extended media information programmatically using the MediaInfo library available here:
http://mediainfo.sourceforge.net/en/Download
The "DLL" or other media downloads include example code in C, C#, etc.
If you do not want to write any code, the same website has downloads for "MediaInfo", a program that uses the library to display information.
Your Android supported media formats link says: "Mono/Stereo content in any combination of standard bit rates up to 160 kbps and sampling rates from 8 to 48kHz". Notice the sample below shows all of those: Channel(s), Overall bit rate, and Sampling rate.
It may be necessary to test for yourself whether "up to 160 kbps" means "up to 160 kbps overall" or "no part of the file, including parts encoded with a variable bit rate (VBR), may surpass 160 kbps."
It is noteworthy that I have played movies on my single-core Android phone which have 256 kbit VBR AAC 6-channel audio, though obviously I did not hear the rear surround channels. Because of that, I suspect the limitations in the link are minimums that Google requires, and that the audio formats supported in practice are much broader.
Here is an example from an actual AAC file (using the MediaInfo program):
Format : ADTS
Format/Info : Audio Data Transport Stream
File size : 176 KiB
Duration : 30s 707ms
Overall bit rate : 46.8 Kbps
Audio
Format : AAC
Format/Info : Advanced Audio Codec
Format version : Version 4
Format profile : LC
Format settings, SBR : Yes
Format settings, PS : Yes
Muxing mode : ADTS
Duration : 30s 707ms
Bit rate mode : Constant
Bit rate : 46.8 Kbps
Channel(s) : 2 channels
Sampling rate : 44.1 KHz
Stream size : 176 KiB (100%)
I wrote a wrapper library in C# for MediaInfo. It isn't required in order to use MediaInfo, but it makes its use much easier and more ".NET-friendly". It can be found here: MediaInfo.Net.
Load the source into Media Player Classic.
View its properties.
In the MediaInfo tab it will list:
Format : AAC
Format profile : LC
If you just want to check the profile used for a few files, you may use VLC or any other program (like Sheepy already suggested). In VLC it's under Extras -> Media Information -> Codec Details, and in your example stream it's AAC SBR+PS (that is, the High-Efficiency profile), which Android can decode.
If you do have control over the media you want to play through Android, you may want to check out this blog article on cross-platform mobile media for the correct encoding. If not (e.g. because the user might be able to choose their own URLs or files), you should instead catch any exceptions and display an error message. That way you are also future-proof against new media types, which might be supported in future Android versions.
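You can also probe a source on the device itself. Here is a minimal sketch (my addition; MediaExtractor and these MediaFormat keys are standard framework APIs, but what a given stream reports varies):

import android.media.MediaExtractor;
import android.media.MediaFormat;
import java.io.IOException;

public class AacProbe {
    // Dump the declared format of each audio track. For AAC the MIME type
    // is typically "audio/mp4a-latm", and KEY_AAC_PROFILE (when present)
    // holds a MediaCodecInfo.CodecProfileLevel.AACObject* constant
    // such as AACObjectLC or AACObjectHE.
    public static void dumpAudioFormats(String urlOrPath) throws IOException {
        MediaExtractor extractor = new MediaExtractor();
        extractor.setDataSource(urlOrPath); // file path or http(s) URL
        for (int i = 0; i < extractor.getTrackCount(); i++) {
            MediaFormat format = extractor.getTrackFormat(i);
            String mime = format.getString(MediaFormat.KEY_MIME);
            if (mime != null && mime.startsWith("audio/")) {
                android.util.Log.d("AacProbe", mime + " : " + format);
            }
        }
        extractor.release();
    }
}

If setDataSource() or a subsequent decode fails, that is exactly the "catch the exception and show an error" case described above.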

Multichannel Audio & Rec. 709 HD Standards

I had some more advanced questions which I couldn't find answers to. My company does both television production and Android development, so Google TV is a natural progression for us. We have a Logitech Revue for website development testing; however, we were unable to make Google I/O and get on board with "FishTank" for app development. I hope there are opportunities for other developers to get on board, because we have some great ideas we are looking forward to implementing!
Multichannel Audio Support
I notice there is decode support for DTS / AC3 (Dolby Digital) and encode support for AC3 (Dolby Digital) listed in Supported Media Formats. However, there is no information regarding more than 2 channels (stereo) of audio. Would there be encode support for more than 2 channels, such as 5.1 or 7.1? Would these be discretely accessible? Could we generate multichannel surround resource files in tools such as Pro Tools?
Rec. 709 HD Luminance Standards
Do our resources have to conform to Rec. 709 luminance standards, where reference black is 16 and reference white is 235? Or are the graphics luminance ranges remapped automatically? I also have the same question regarding material encoded as MPEG-2.
For more information on Rec. 709, check out this wikipedia entry.
Overscan and addressable space
I've read the Display Guidelines and was not clear on whether Android binds you to the display dimensions set during the user screen measurement phase. If it does, could this be overridden? Most television content is already generated with overscan in mind.
Thanks so much! I'm really looking forward to developing!
Scott
I originally posted this on the Google TV Group; this is probably a more appropriate place.
AC3 Encoder:
The AC3 encoder is only available for mixing the AC3 stream with Android audio. It may be possible to play multi-channel audio through AudioFlinger, because it does not explicitly limit the number of tracks to two. We haven't tested it, and engineering isn't hopeful. At the moment there is no way to play multi-channel audio other than embedding an AC3 or DTS stream in a proper media container and playing it through the MediaPlayer interface.
Even if it does work internally, it will depend on each OEM's implementation whether the audio actually gets to the speakers.
Rec. 709:
On the Intel platform, the output is configured for Rec. 709, and the RGB graphics pixels are automatically converted from full-range RGB to Rec. 709. The color space metadata in the video stream is passed by the decoder to the renderer and converted to Rec. 709 automatically.
Graphics resources should be authored in full RGB space.
MPEG-2 video should be encoded in Rec. 709. If it is SD content, it should probably be left in its original Rec. 601 color space so the hardware performs the conversion.
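To make the "reference black is 16, reference white is 235" point concrete, here is an illustrative sketch (my addition) of the full-range-to-video-range remap described above:

public class Rec709Levels {
    // Squeeze a full-range RGB component [0..255] into the Rec. 709
    // nominal video range [16..235] (reference black .. reference white).
    static int fullToVideoRange(int v) {
        return 16 + Math.round(v * 219f / 255f);
    }

    public static void main(String[] args) {
        System.out.println(fullToVideoRange(0));   // 16  (reference black)
        System.out.println(fullToVideoRange(255)); // 235 (reference white)
    }
}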
Overscan
Android does bind you to the display dimensions set during the user screen measurement phase. It's not possible to override this.
