I enabled the "Stats for nerds" option in the YouTube Android app and played the same video on a "Vivo V9" and a "Nexus 5" device.
Vivo V9: it played the video in WebM format, which is basically the "VP8 or VP9" codec.
Nexus 5: it played the video in MP4 format, which is basically the "H264 or H265" codec.
So, based on the device, the YouTube app selects a video codec.
Question: how does it do this? I know it uses ExoPlayer internally for playing video, but ExoPlayer doesn't provide this functionality by default.
Different codecs may require different licenses, which can be expensive. Moreover, codecs come in both software and hardware forms. Many SW codecs pose no problem, but as the name states, HW codecs require a specific chip, which also increases cost and occupies space. That's why there is a big variety between device manufacturers, and even between two devices from the same manufacturer: they simply want to cut costs.
It's very common that one device has only a subset of the most popular codecs, and even if it has, let's say, both VP8 and H264, then one of them can be a HW codec while the other is a SW codec, in which case the HW codec will usually be preferred. YouTube (here I mean youtube.com, not the YouTube app) serves videos in different formats, so the device can choose the optimal codec for its capabilities.
Now, as to choosing the right codec, the YouTube app can use the MediaCodec API from Android (please check e.g. this), and/or it can even provide its own SW codecs, so I would say that the behavior is platform dependent.
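To make the "prefer HW, fall back to SW" idea concrete, here is a minimal sketch. This is not ExoPlayer's actual track selector: the `pick` helper, the codec names, and the capability table are invented for illustration, and on a real device the information would come from MediaCodecList rather than a hardcoded map.

```java
import java.util.List;
import java.util.Map;

public class CodecPicker {
    // Hypothetical capability table: codec name -> hardware-accelerated?
    // On a real device this would be populated from MediaCodecList
    // (and, on API 29+, MediaCodecInfo.isHardwareAccelerated()).
    static String pick(List<String> serverFormats, Map<String, Boolean> deviceCodecs) {
        String softwareFallback = null;
        for (String codec : serverFormats) {
            Boolean hw = deviceCodecs.get(codec);
            if (hw == null) continue;          // device cannot decode this at all
            if (hw) return codec;              // prefer a hardware decoder
            if (softwareFallback == null) softwareFallback = codec;
        }
        return softwareFallback;               // best software option, or null
    }

    public static void main(String[] args) {
        // "Vivo V9"-like device: VP9 in hardware, H.264 only in software
        Map<String, Boolean> vivo = Map.of("vp9", true, "h264", false);
        // "Nexus 5"-like device: H.264 in hardware, no VP9 decoder
        Map<String, Boolean> nexus = Map.of("h264", true);
        List<String> offered = List.of("vp9", "h264");
        System.out.println(pick(offered, vivo));   // vp9
        System.out.println(pick(offered, nexus));  // h264
    }
}
```

The point of the sketch is only the ordering: a server offering multiple formats plus a per-device capability probe is enough to explain why the same video plays as WebM on one phone and MP4 on another.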
The last things are corner cases, e.g. something being played or recorded in the background, say the camera is turned on and screen recording happens while using the YouTube app. Again, it depends on the device, but HW codecs have limits on the number of codec instances, and in such a corner case it's possible that even if the device has some HW codec, the YouTube app may be forced to choose a SW codec.
Edit:
Starting from Android Q there is also a new API which lets you easily differentiate between SW and HW codecs (MediaCodecInfo.isHardwareAccelerated()). Take a look at this.
Related
I am using the OpenTok SDK for video calling on iOS and Android devices with a Node.js server.
It is a group-call scenario with a maximum of 4 people; when we stream for more than 10 minutes, both devices get too hot.
Does anyone have a solution for this?
We can't degrade the video quality.
This is likely because you are using the default video codec, VP8, which is not hardware accelerated. You can change the codec per publisher to either H.264 or VP8, but there are some trade-offs to this approach.
Their lack of H.264 SVC support is disappointing, but might be okay depending on your use case. If you read this whole post and still want more guidance, I'd recommend reaching out to their developer support team, and/or post more about your use case here.
Here's some more context from the OpenTok Documentation, but I recommend you read the whole page to understand where you need to make compromises:
The VP8 real-time video codec is a software codec. It can work well at lower bitrates and is a mature video codec in the context of WebRTC. As a software codec it can be instantiated as many times as is needed by the application within the limits of memory and CPU. The VP8 codec supports the OpenTok Scalable Video feature, which means it works well in large sessions with supported browsers and devices.
The H.264 real-time video codec is available in both hardware and software forms depending on the device. It is a relatively new codec in the context of WebRTC although it has a long history for streaming movies and video clips over the internet. Hardware codec support means that the core CPU of the device doesn’t have to work as hard to process the video, resulting in reduced CPU load. The number of hardware instances is device-dependent with iOS having the best support. Given that H.264 is a new codec for WebRTC and each device may have a different implementation, the quality can vary. As such, H.264 may not perform as well at lower bit-rates when compared to VP8. H.264 is not well suited to large sessions since it does not support the OpenTok Scalable Video feature.
The three Android devices that I am testing have three different AVC packets. Below are the samples for Samsung, Motorola, and Doofe.
Samsung
1700000000014d001effe10012674d001eda0280bfe5948283030368509a8001000468ee0
Motorola
1700000000014d001effe10012674d001ee901405ff2ca41418181b4284d4001000468ee06e2
Doofe
170000000001640029ffe1001067640029ac1b1a80a02ff9601e1108a701000468ea43cb
This causes huge problems when I interleave the videos. The video player obviously gets confused and does not play.
How do I ensure that the video headers are the same? Should I use a software encoder and bypass the hardware encoders?
How is encoding carried out on the clients?
The SPS and PPS describe your video stream parameters, like frame size, profile, etc., and are almost always generated by the encoder.
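For illustration, those packets look like FLV-style AVC sequence headers: one byte for frame type/codec ID (0x17), one byte for AVC packet type (0x00 = sequence header), a 3-byte composition time, then the AVCDecoderConfigurationRecord, whose bytes 1-3 after the version byte are the H.264 profile, compatibility flags, and level. A small sketch decoding that (the `describe` helper is mine, not from any SDK):

```java
public class AvcProbe {
    static String profileName(int p) {
        switch (p) {
            case 66:  return "Baseline";
            case 77:  return "Main";
            case 100: return "High";
            default:  return "profile " + p;
        }
    }

    // Decode profile/level from an FLV AVC sequence-header packet given as hex.
    // Bytes 0-4: FLV frame/codec byte, packet type, composition time.
    // Byte 5: avcC configurationVersion; byte 6: profile; byte 7: compat; byte 8: level.
    static String describe(String hex) {
        int profile = Integer.parseInt(hex.substring(12, 14), 16);
        int level   = Integer.parseInt(hex.substring(16, 18), 16);
        return profileName(profile) + " @ level " + (level / 10) + "." + (level % 10);
    }

    public static void main(String[] args) {
        // The three samples from the question
        System.out.println(describe("1700000000014d001effe10012674d001eda0280bfe5948283030368509a8001000468ee0"));   // Samsung
        System.out.println(describe("1700000000014d001effe10012674d001ee901405ff2ca41418181b4284d4001000468ee06e2")); // Motorola
        System.out.println(describe("170000000001640029ffe1001067640029ac1b1a80a02ff9601e1108a701000468ea43cb"));    // Doofe
    }
}
```

Decoded this way, the Samsung and Motorola packets are both Main @ level 3.0 but their SPS bodies differ (so the stream parameters, such as frame size, differ), while the Doofe packet is High @ level 4.1 outright. That is exactly why interleaving them confuses the player.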
How do I ensure that the video headers are the same?
Therefore, you have to ensure all devices use the same video encoder and publish video in the same format (frame size, bitrate, fps, profile, keyframe count, etc.).
Should I use a software encoder and bypass the hardware encoders?
In your case, it is preferable to use software encoders. But you can first simply try to make the encoder configurations the same; most likely that will resolve your issue.
I have an application for sharing videos. I found an issue on the HTC Wildfire device: a video taken with a Nexus 7 shows no picture, but I can hear the audio in that video.
I guess it happens because the streams are encoded with unsupported codecs or an unsupported video size. My question: which encoding format is supported by all Android devices powered by API 8+?
Please refer to the list of supported media formats as captured by Google at http://developer.android.com/guide/appendix/media-formats.html
Any device vendor can choose not to support a specific encoding format. If you can connect to your device, please pull /etc/media_codecs.xml, which provides the list of encoders and decoders supported by the device.
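The exact schema of media_codecs.xml varies by vendor, but decoder entries commonly appear as `<MediaCodec name=... type=...>` elements under a `<Decoders>` section. Here is a sketch that lists those entries from such a file; the sample XML string is invented for illustration, and on a real device you would feed in the pulled file instead.

```java
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;

public class CodecListDump {
    // Collect "name (type)" for every <MediaCodec> under a <Decoders> section.
    static List<String> decoders(String xml) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
        List<String> out = new ArrayList<>();
        NodeList sections = doc.getElementsByTagName("Decoders");
        for (int i = 0; i < sections.getLength(); i++) {
            NodeList codecs = ((Element) sections.item(i)).getElementsByTagName("MediaCodec");
            for (int j = 0; j < codecs.getLength(); j++) {
                Element c = (Element) codecs.item(j);
                out.add(c.getAttribute("name") + " (" + c.getAttribute("type") + ")");
            }
        }
        return out;
    }

    public static void main(String[] args) throws Exception {
        // Invented sample in the common media_codecs.xml layout
        String sample =
            "<MediaCodecs><Decoders>" +
            "<MediaCodec name=\"OMX.google.h264.decoder\" type=\"video/avc\"/>" +
            "<MediaCodec name=\"OMX.google.vp8.decoder\" type=\"video/x-vnd.on2.vp8\"/>" +
            "</Decoders></MediaCodecs>";
        System.out.println(decoders(sample));
    }
}
```

If a MIME type such as video/x-vnd.on2.vp8 is absent from a device's list, streams encoded with that codec will not decode there, which is what the question is running into.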
H.264, H.263 & MPEG-4 are pretty well-known formats, and almost all Android devices support their playback, whereas VP8 is a relatively newer format; hence, from your reported problem, I suspect that your device may not support its playback.
According to Supported Media Formats, these formats have support built into the Android platform (and are supported on almost all devices, even on Android 2.2 or older):
Video:
1. H.264 AVC (Baseline Profile) inside an MPEG-4 (mp4) or 3GPP container
2. H.263 inside an MPEG-4 or 3GPP container
3. MPEG-4 SP inside a 3GPP container (3gp)
Audio: AAC LC, HE-AAC, MP3.
But you have to be careful with bitrates and video resolution. Some devices cannot handle resolutions above 720p (like H.264 with CABAC on Tegra 2). Most devices support FPS up to 30. Some devices (old Sony Xperia models) have issues decoding surround-sound audio in some formats, which will cause silence.
In my app I need to play videos from the SD card. It works fine on the Galaxy S and Galaxy Tab 2, but on some Chinese tablets like the "Giada" it is not working at all.
I have 4 different activities to play videos. The first activity plays a menu video which has navigation links to the other activities. Problems I am facing:
The first video plays properly, but looping fails and the app closes.
If I navigate to another activity to play another video, it says "Can't play video" and closes. Sometimes it plays the same video but not completely, and the app closes in between.
Video extension: MP4
Resolution: 1024x600
Playing from: SD card
Target tab specification:
Resolution: 1024x600
Android: 4.1
Tried with VideoView and SurfaceView.
Help me out; any help will be appreciated.
The answer to this question will never be consistent across all devices or across all videos.
Whether a given video file will play in a given player depends on three things:
The video container format (file type).
The codecs the video (and potentially audio) streams are encoded with
Your player's support for that combination of container format and codec
The codec, and player/device support for it, is almost certainly the cause of the inconsistent results you've seen. (A codec, if you didn't know, is basically a repeatable mathematical formula that tells your system how to turn the bits and bytes packed into a file into moving pictures, and back again for that matter.)
There are a large variety of video codecs in the video files floating around out there. Support for these codecs is wildly inconsistent just due to the history of video distribution. Many devices won't support streams encoded with certain codecs. There are a variety of reasons for this, but the most common are obscurity or licensing costs.
For example, up until a few years ago, almost everything was encoded in an .FLV container with an On2 VP6/VP7/VP8 codec. This is causing headaches today because while On2 owned these codecs, they kept a tight rein on the licenses. That didn't relax until .FLV had already begun to lose relevance, and so there is not a whole lot of (legitimate) software out there that can work with On2-encoded content.
What all of this means is that there is no silver bullet. All video will never run on all devices, at least not without the aid of video players that install and use their own codecs to decode the streams.
Needless to say, this does not include the libraries provided to you and your end users by the factory-installed Android libraries.
So, what do you do? Well, short of producing a video player that carries its own codecs, you can most effectively address the problem with a two-step approach:
Target specific devices that you want your application to work on
Encode your content to use a video codec that works on all the devices you want to target. You may need to produce two copies of your video if you find that there is no codec that works across all the devices you plan to support.
Today, the widest support is available with an MP4 container and a video stream encoded with the H.264 (AVC) codec. As I said, there is no silver bullet, and H.264 support is not universal by any means, but this one format will be playable by more potential users than any other single choice you could make, due to its popularity and wide support in modern desktop and mobile environments.
Some tools you may find helpful:
MediaInfo will let you peek inside MPEG-flavored video containers to see what codecs are in use. This will be helpful in determining which devices are having trouble with which codecs.
FFmpeg is an encoding application that can convert your content to MP4/H.264
Android Supported media formats
List of supported media audio/video formats.
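As a concrete example of the FFmpeg route, the following command (standard FFmpeg/x264 options; the file names are placeholders) produces the broadly compatible combination recommended above:

```shell
# Transcode to MP4 / H.264 Baseline 3.0 + AAC audio for widest device support
ffmpeg -i input.flv \
       -c:v libx264 -profile:v baseline -level 3.0 -pix_fmt yuv420p \
       -c:a aac -b:a 128k \
       -movflags +faststart \
       output.mp4
```

Baseline Profile at level 3.0 and yuv420p pixel format avoid the High Profile / CABAC features that older Android decoders choke on, and +faststart moves the index to the front of the file so playback can begin before the whole file is downloaded.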
Good luck!
OK, so there are a bazillion different Android devices. I have a video streaming service that works wonderfully for iOS. My app has a live video feature and a saved-video-clip playback feature (which also streams to the device). I've run some tests on different Android devices and get a whole bunch of different playback results. I am using 640x480 H.264 Baseline Profile video. Streaming that video works only on some devices. For other devices, the same video can be streamed at low resolution, which works on some devices, but still not others. The high-profile streaming goes through http://www.wowzamedia.com/ (RTSP) and doesn't work on any Android device (but works on the iPhone). The lowest and worst option is Motion JPEG, which works on all tested devices so far.
So my question is: without having to test every device on the market, how can I figure out whether a device will play 640x480 H.264 Baseline Profile; if that won't work, play the low-resolution video; and if that doesn't work, default to Motion JPEG?
Also, any idea why my RTSP stream transcoded through Wowza works on the iPhone but not on any Android device (not even the Motorola Atrix)?
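The fallback chain described in the question can be expressed as an ordered capability check. In this sketch the probe is stubbed out as a set of flags; the enum names are invented, and on a real device each flag would come from something like a MediaCodecList query or a short trial playback.

```java
import java.util.EnumSet;
import java.util.Set;

public class StreamFallback {
    // Declaration order is preference order; Motion JPEG is the universal fallback.
    enum Stream { H264_640x480, H264_LOW_RES, MOTION_JPEG }

    // Pick the first stream the device claims to support.
    static Stream choose(Set<Stream> supported) {
        for (Stream s : Stream.values()) {
            if (s == Stream.MOTION_JPEG || supported.contains(s)) return s;
        }
        return Stream.MOTION_JPEG;
    }

    public static void main(String[] args) {
        System.out.println(choose(EnumSet.of(Stream.H264_640x480)));        // H264_640x480
        System.out.println(choose(EnumSet.of(Stream.H264_LOW_RES)));        // H264_LOW_RES
        System.out.println(choose(EnumSet.noneOf(Stream.class)));           // MOTION_JPEG
    }
}
```

The structure matters more than the stub: keep the preference order in one place so adding another rung (say, an HLS variant) is a one-line change.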
Streaming on android is an absolute mess. Most devices don't support anything higher than Baseline 3.0. If you encode for iPhone 3, it should generally work via RTSP. Newer versions of android support HLS, but it's hit or miss and largely dependent on specific devices.
I resolved this problem. Check the RTP implementation in your streaming service and the x264 profile. My RTSP server works fine on 90% of devices.
P.S. Some video frameworks in different Android versions implement the RTP and RTSP protocols with some differences.
These are some of the links/issues which I have come across while trying to make streaming work on varied devices.
MediaPlayer seekTo doesn't work for streams
MediaPlayer resets position to 0 when started after seek to a different position
MediaPlayer seekTo inconsistently plays songs from beginning
Basic streaming audio works in 2.1 but not in 2.2
MediaPlayer.seekTo() does not work for unbuffered position
Streaming video when seek back buffering start again in videoView/Mediaplayer
Even the big shots on Stack Overflow are wondering about this.
If you want just streaming without seeking (which is lame), this can be achieved. But then if you receive a call while you are watching, you will end up back at the start.