Need help saving HD video stream to file - Android

I'm using libstreaming.
I would like to instantiate two MediaCodec encoders with different settings and bitrates (one low quality, which will be transmitted via HTTP, and one high quality, to be saved to the SD card).
The problem appears to be that I can't grab two separate MediaCodec objects with differing settings.
The high bitrate version is saved as a video containing nothing but a green background unless the dimensions are set to less than 352x288, whereas the low bitrate version is successfully (and correctly) being streamed to the web.
I am really hoping that I'm doing something obviously wrong and that there's a simple way to save the HD version of the stream to disk.

In general, this should work on most devices - I do it without a problem on a number of devices.
But there are some devices where the encoder driver has restrictions for this - in particular, some Intel devices refuse to create a second encoder instance while one is active. (The Samsung Galaxy Tab 3 10.1 comes to mind - I'm not sure if all other Intel-based ones have the same issue, or only some of them.)
Unfortunately, even though the Android CTS has tests to ensure that the hardware encoder works, there's no test that guarantees you can have more than one encoder active at the same time.
Does your case fail only if you have differing settings on the second encoder, or also if they have the same settings?
If one stream is of a low resolution, you could try using a SW encoder for that instead, while using the HW encoder for the high resolution version. On Android 6.0, the SW encoder OMX.google.h264.encoder should be quite decent, while on older versions, it's close to unusable.
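As a rough sketch of that split (assuming API 21+ for COLOR_FormatYUV420Flexible; the resolutions, bitrates and the DualEncoderSketch helper are made-up placeholders, and OMX.google.h264.encoder is the software encoder name mentioned above):

import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import java.io.IOException;

public class DualEncoderSketch {
    // High-resolution stream: let the platform pick the (usually hardware) encoder.
    static MediaCodec createHighResEncoder() throws IOException {
        MediaFormat format = MediaFormat.createVideoFormat("video/avc", 1280, 720);
        format.setInteger(MediaFormat.KEY_BIT_RATE, 4000000);
        format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
        MediaCodec codec = MediaCodec.createEncoderByType("video/avc");
        codec.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        return codec;
    }

    // Low-resolution stream: request the software encoder by name so it does not
    // compete with the hardware encoder instance used for the stream above.
    static MediaCodec createLowResEncoder() throws IOException {
        MediaFormat format = MediaFormat.createVideoFormat("video/avc", 320, 240);
        format.setInteger(MediaFormat.KEY_BIT_RATE, 500000);
        format.setInteger(MediaFormat.KEY_FRAME_RATE, 15);
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Flexible);
        MediaCodec codec = MediaCodec.createByCodecName("OMX.google.h264.encoder");
        codec.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        return codec;
    }
}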

Related

Enable B frame encoding in MediaCodec

All,
I am using the MediaCodec class to generate a video/avc video stream. The stream encodes fine, but I want to use B frames for better compression. Though I have set the profile to AVCProfileHigh, the encoder is not generating B frames; the video stream has only I and P frames.
Below is the media profile configuration.
mFormat.setInteger(MediaFormat.KEY_PROFILE, MediaCodecInfo.CodecProfileLevel.AVCProfileHigh);
Does MediaCodec support B frames?
If yes, how do I configure B frames?
This depends on the device. Android O and P briefly enabled B-frames automatically for AVC High profile encoding, but many apps started crashing as they did not expect out-of-order frames, so it got disabled shortly after launch. Also, MediaMuxer failed on some stress streams with B-frames (e.g. if a B-frame referred back to a frame more than 1 second away). This is fixed in Android Q, though the back reference is still limited to 30 minutes.
Technically, apps can opt into getting B frames with the MediaFormat.KEY_OUTPUT_REORDER_DEPTH format key (after setting the high profile), but support for this is not required by devices, and it is not hooked up in AOSP.
There is no guarantee that all devices will support it, but some devices might.
However, at least in earlier versions of Android, if you tried to set the profile parameter you also had to set the level parameter at the same time, otherwise the profile wouldn't be used. See https://stackoverflow.com/a/26293422/3115956 for more details about this. The catch with setting the level parameter is that you need to choose a level that is high enough to support the chosen resolution and frame rate.
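As a rough sketch of setting both keys together (the resolution, bitrate and level here are placeholder values you would adjust to your own stream; the MediaFormat.KEY_LEVEL constant requires API 23):

MediaFormat mFormat = MediaFormat.createVideoFormat("video/avc", 1280, 720);
mFormat.setInteger(MediaFormat.KEY_BIT_RATE, 4000000);
mFormat.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
mFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);
// Set profile and level together; the level must be high enough for the chosen
// resolution and frame rate (AVC Level 3.1 covers 1280x720 at 30 fps).
mFormat.setInteger(MediaFormat.KEY_PROFILE,
        MediaCodecInfo.CodecProfileLevel.AVCProfileHigh);
mFormat.setInteger(MediaFormat.KEY_LEVEL,
        MediaCodecInfo.CodecProfileLevel.AVCLevel31);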

AVC HW encoder with MediaCodec Surface reliability?

I'm working on an Android app that uses MediaCodec to encode H.264 video using the Surface method. I am targeting Android 5.0, and I've followed all the examples and samples from bigflake.com (I started working on this project two years ago, so I kind of went through all the gotchas and other issues).
All is working nice and well on the Nexus 6 (which uses the Qualcomm hardware encoder for this), and I'm able to record flawlessly in real time 1440p video with AAC audio, to a multitude of outputs (from local MP4 files up to HTTP streaming).
But when I try to use the app on a Sony Android TV (running Android 5.1) which uses a Mediatek chipset, all hell breaks loose even at the encoding level. To be more specific:
It's basically impossible to make the hardware encoder (that is, "OMX.MTK.VIDEO.ENCODER.AVC") work properly. With the most basic setup (which succeeds at MediaCodec's level), I will almost never get output buffers out of it, only weird, spammy logcat error messages stating that the driver has encountered errors each time a frame should be encoded, like this:
01-20 05:04:30.575 1096-10598/? E/venc_omx_lib: VENC_DrvInit failed(-1)!
01-20 05:04:30.575 1096-10598/? E/MtkOmxVenc: [ERROR] cannot set param
01-20 05:04:30.575 1096-10598/? E/MtkOmxVenc: [ERROR] EncSettingH264Enc fail
Sometimes, trying to configure it to encode at a 360x640 resolution will succeed in making the encoder actually encode stuff, but the first problem I'll notice is that it will only create one keyframe: the first video frame. After that, no more keyframes are ever created, only P-frames. Of course, the i-frame-interval was set to a decent value and works with no issues on other devices. Needless to say, this makes it impossible to create seekable MP4 files, or any kind of streamable solution on top.
Most of the time, after releasing the encoder, logcat will start spamming endlessly with "Waiting for input frame to be released...", which basically requires a reboot of the device, since nothing will work from that point on anyway.
In the case where it doesn't go haywire after a simple release(), no problem - the hardware encoder just makes sure it cannot be created a second time, and the app falls back to the generic software Google AVC encoder, which of course is basically a mockup encoder that does little more than spit out an error when trying to make it encode anything larger than 160p video...
So, my question is: is there any hope of making this MediaCodec API actually work on such a device? My understanding was that there are some CTS tests performed by Google/manufacturers (in this case, Sony) that would allow a developer to actually believe that an API is supported on a device which prides itself on running Android 5.1. Am I missing something obvious here? Has anyone actually ever tried doing this (a simple MediaCodec video encoding test) and succeeded? It's really frustrating!
PS: it's worth mentioning that not even Sony yet provides a recording capability for this TV set, which many people are complaining about anyway. So, my guess is that this is more of a Mediatek problem, but still, what exactly is the Android CTS for in this case anyway?
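For reference, a minimal sketch of the kind of Surface-input encoder setup described above (assuming API 18+; the resolution, bitrate and keyframe interval are placeholder values):

MediaFormat format = MediaFormat.createVideoFormat("video/avc", 1280, 720);
format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
        MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
format.setInteger(MediaFormat.KEY_BIT_RATE, 6000000);
format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1); // one keyframe per second

MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
Surface inputSurface = encoder.createInputSurface(); // render frames into this surface
encoder.start();
// ... drain output buffers and feed them to a MediaMuxer or streaming sink ...
encoder.stop();
encoder.release();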

Will all phones support YUV 420 (Semi) Planar color format in h.264 encoder?

Preamble: This may sound like a very specific question, but it is actually a go / no-go for building an API 16+ Android application using MediaCodec that is compatible with most phones.
I have an application with an H.264 MediaCodec that receives data from a buffer - not a Surface, since I'm doing a lot of manipulation on the image. When creating the encoder, I iterate through the list of possible encoders on the phone to make sure I'm using a proprietary encoder if there is one. This part is not a problem.
The problem is that each encoder has its own color format preference. This may lead to a color format conversion before encoding. In my proof of concept, for example, I included methods to convert to NV12, NV21 and YV12, while also following very strict rules, like placing the start of some planes / the interleaved plane at a precise offset in the buffer, etc. Just being able to make an encoded video look good can be a long story.
So, if I could be certain that there is a standard in what Android H.264 MediaCodec encoders accept, that would greatly limit the amount of customization I have to do to get my API 16 MediaCodec proof of concept working on all devices.
[EDIT]
Oh. I just saw that it's impossible to create an input surface before API 18. I will have to rely on detecting proprietary codecs and color formats for each case, random crashes, slow FPS, etc. Oh well... With a bit of luck, in 2017 or 2018 there will be enough devices with the relevant API features to write a decent app using MediaCodec.
The really short answer is: No.
From Android 4.3 there is a CTS test that verifies that devices support either 420 planar or 420 semiplanar input - and, most importantly, that they interpret it in the same way as everyone else.
In practice, most devices from Android 4.1 do support either planar or semiplanar in one form or another, but there are a lot of quirks you need to work around, which are specific to different devices and chipsets. Some encoders (a bunch of Samsung Exynos devices) advertise semiplanar but interpret it with the chroma components swapped compared to other devices (and the reference). Some encoders (also Samsung Exynos) advertise planar but interpret it as a proprietary tiled format (or crash). Some encoders (Qualcomm) assume extra alignment between the luma and chroma planes.
See https://code.google.com/p/android/issues/detail?id=37769 for more details on the issues you can encounter. In practice you can probably get it working on a large number of devices, but you'll need to more or less verify it per device and enable different quirk workarounds per device.
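As a sketch of the per-device detection step (using the pre-API-21 codec enumeration calls, in line with the API 16 target; the "CodecProbe" log tag is just a placeholder):

// Find AVC encoders and list which YUV 420 color formats each one advertises.
for (int i = 0; i < MediaCodecList.getCodecCount(); i++) {
    MediaCodecInfo info = MediaCodecList.getCodecInfoAt(i);
    if (!info.isEncoder()) continue;
    for (String type : info.getSupportedTypes()) {
        if (!type.equalsIgnoreCase("video/avc")) continue;
        MediaCodecInfo.CodecCapabilities caps = info.getCapabilitiesForType(type);
        for (int colorFormat : caps.colorFormats) {
            String name;
            if (colorFormat == MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar) {
                name = "420 planar";
            } else if (colorFormat == MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar) {
                name = "420 semiplanar";
            } else {
                name = "other (" + colorFormat + ")";
            }
            Log.d("CodecProbe", info.getName() + " supports " + name);
        }
    }
}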

Maximum number of simultaneous MediaRecorder instances on Android?

I created an Android app that records the device screen (using the MediaProjection API) and video from the camera at the same time. I use MediaRecorder in both cases. I need a way to find out whether the device is actually capable of recording two video streams simultaneously. I assume there is some limit on the number of streams that can be encoded simultaneously on a given device, but I cannot find any API on the Android platform to query for that information.
Things I discovered so far:
The documentation for MediaRecorder.release() advises releasing the MediaRecorder as soon as possible:
" Even if multiple instances of the same codec are supported, some performance degradation may be expected when unnecessary multiple instances are used at the same time."
This suggests that there's a limit on the number of instances of the codec, which directly limits the number of MediaRecorders.
I wrote test code that creates MediaRecorders (configured to use MPEG4/H264) and starts them in a loop - on a Nexus 5 it always fails with java.io.IOException: prepare failed when calling prepare() on the 6th instance. This suggests you can have only 5 instances of MediaRecorder on the Nexus 5.
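A rough sketch of that probe loop (assuming it runs inside an Activity so getCacheDir() is available; the output file names, video size and upper bound are placeholders):

// Create and prepare MediaRecorder instances until prepare() fails, to estimate
// how many concurrent H.264 encoder instances the device will hand out.
List<MediaRecorder> recorders = new ArrayList<>();
int count = 0;
try {
    while (count < 16) { // arbitrary upper bound for the probe
        MediaRecorder r = new MediaRecorder();
        r.setVideoSource(MediaRecorder.VideoSource.SURFACE);
        r.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
        r.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
        r.setVideoSize(1280, 720);
        r.setOutputFile(new File(getCacheDir(), "probe" + count + ".mp4").getPath());
        r.prepare(); // throws IOException once no more encoder instances are available
        recorders.add(r);
        count++;
    }
} catch (IOException | RuntimeException e) {
    Log.i("RecorderProbe", "prepare() failed after " + count + " instances", e);
} finally {
    for (MediaRecorder r : recorders) {
        r.release();
    }
}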
I'm not aware of anything you can query for this information, though it's possible something went into Lollipop that I didn't see.
There is a limit on the number of hardware codec instances that is largely dependent on hardware bandwidth. It's not a simple question of how many streams the device can handle -- some devices might be able to encode two 720p streams but not two 1080p streams.
On some devices the codec may fall back to a software implementation if it runs out of hardware resources. Things will work but will be considerably slower. (I've seen this for H.264 decoding, but I don't know if it also happens for encoding.)
I don't believe there is a minimum system requirement in CTS. It would be useful to know that all devices could, say, decode two 1080p streams and encode one 1080p simultaneously, so that a video editor could be made for all devices, but I don't know if such a thing has been added. (Some very inexpensive devices would struggle to meet that.)
I think it really depends on the device and RAM capacity... you could read the buffers for the screen and camera as much as you like, but only one read at a time, not simultaneously, I think, to prevent concurrency issues - but honestly, I don't know for sure.

How to view OpenGL renders generated from a PC's C++ code on an Android device connected via WiFi?

I'm working on an Augmented Reality (AR) demo in which high-quality OpenGL renders (in C++) will be generated on a PC and then streamed to an Android display device (running at minimum Android 2.2). What is the easiest way to achieve this in real time (30 FPS on the Android device)?
I've looked into existing Android applications and have not found anything suitable so far. The best available were remote desktop applications (such as TeamViewer), however the frame rates were far too low and unreliable.
Possible solution A:
1) Encode the OpenGL window as H.264 video (natively supported by Android)
2) Stream the H.264 video via RTSP using a server
3) View the content from an Android browser (Android and PC connected via WiFi)
Possible solution B:
1) Encode the OpenGL window as an IP camera stream in C++ (is this possible?)
2) Use an IP camera viewer app on the Android device to view it (again connected via WiFi)
I'm not entirely sure if either or both of these approaches are viable and would like some reassurance before moving forward.
What is the resolution of the image (is it equal to the current screen resolution, larger or smaller)? It is possible and efficient to transport an H.264 stream, but it also depends on the machine used to do the encoding. Hardware encoders or GPU-accelerated encoders are your best bet.
Remember - if you choose to go with encoding, you will have latency due to buffering (on both the encode and the decode side). It will be a constant time offset, so if that's not a problem you should be OK.
The total system delay is composed of the capture, encoding, network transmission, decoding and display delays.
Note that none of these delays can be fully measured directly in frame-time. Some of them depend on the frame data and/or the encoder/processing performed. But as a rough estimate with fast GPU encoding and hardware decoding, I'd say a lag of around 5-20 frames. You'll have to measure the final latency per scenario. I did this by sending frames containing text (ticks) and, once the frames were steady, comparing them side by side. In fact, I even allowed the user to enter a "test mode" at any time to compensate for network traffic peaks, or to change the "quality" settings to tweak this delay.
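For the Android side of solution A, a minimal sketch using the platform MediaPlayer, which has handled RTSP/H.264 playback since early Android releases (the RTSP URL and the surfaceView variable are placeholders for your own server address and layout view):

// Play the RTSP stream produced by the PC-side encoder on a SurfaceView.
MediaPlayer player = new MediaPlayer();
try {
    player.setDataSource("rtsp://192.168.1.10:8554/stream"); // placeholder URL
    player.setDisplay(surfaceView.getHolder());              // SurfaceView from the layout
    player.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
        @Override
        public void onPrepared(MediaPlayer mp) {
            mp.start(); // start playback once the RTSP session is set up
        }
    });
    player.prepareAsync(); // asynchronous; RTSP setup can take a moment
} catch (IOException e) {
    Log.e("StreamViewer", "Failed to open RTSP stream", e);
}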
