Chromecast receiver - extremely poor frame rates - Android

We are experiencing severe issues with playback fidelity on Chromecast. Using the most basic receiver implementation (CAF or v2), the video plays at perhaps 2 frames per second and shows a lot of artifacts. We are using HLS (served via Akamai). After some research, here's what we know:
The same HLS streams play without issue in AVPlayer (iOS), ExoPlayer (Android) and Clappr (web).
When casting the underlying MP4 file (unsegmented), the Chromecast handles it beautifully.
The issue is not buffering itself (the playerState is not BUFFERING, there is no buffering icon on the player, and the internet connection is solid).
When forcing the Chromecast to use the lowest-quality rendition in the m3u8 manifest (240p), the player performs better (but still not perfectly).
For now, it seems to us that the HLS segments are not being processed properly (while most players handle them well). Is there anything we can do to resolve this? Our product depends on a high-quality Chromecast experience.
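For reference, here is a minimal sketch of how an HLS stream is typically loaded from an Android sender with the Cast SDK; the session lookup and URL are placeholders, but declaring the contentType as "application/x-mpegurl" matters, since the receiver relies on it to pick the right handler:
    // Sketch only: loading an HLS stream from an Android sender with the Cast framework.
    // The URL and session lookup are placeholders; error handling is omitted.
    import com.google.android.gms.cast.MediaInfo;
    import com.google.android.gms.cast.MediaLoadRequestData;
    import com.google.android.gms.cast.framework.CastContext;
    import com.google.android.gms.cast.framework.CastSession;
    import com.google.android.gms.cast.framework.media.RemoteMediaClient;

    public class CastHlsLoader {
        public void loadHls(CastContext castContext, String hlsManifestUrl) {
            CastSession session = castContext.getSessionManager().getCurrentCastSession();
            if (session == null) {
                return; // no active cast session
            }
            RemoteMediaClient client = session.getRemoteMediaClient();
            if (client == null) {
                return;
            }
            // Declare the stream as HLS so the receiver treats it as a segmented stream.
            MediaInfo mediaInfo = new MediaInfo.Builder(hlsManifestUrl)
                    .setStreamType(MediaInfo.STREAM_TYPE_BUFFERED)
                    .setContentType("application/x-mpegurl")
                    .build();
            client.load(new MediaLoadRequestData.Builder()
                    .setMediaInfo(mediaInfo)
                    .setAutoplay(true)
                    .build());
        }
    }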

Related

How to record (and process?) a video that is streamable from Android

My company's app relies heavily on recording videos and playing back web-hosted videos. I use the MediaRecorder API to record videos, through a library I wrote: https://github.com/afollestad/material-camera.
For playback, I use this library which is basically a wrapper around Google's ExoPlayer library: https://github.com/brianwernick/ExoMedia.
It works fine for the most part with small videos, especially if I decrease the audio and video bit rates. However, larger and higher-quality videos have many issues: sometimes they seem to buffer forever, sometimes playback doesn't even start successfully, and so on. These videos are streamed over HTTP from Amazon S3.
I've read a little bit about FFmpeg and how it can process MP4s for "faststart", split files into chunks for DASH, etc. However, FFmpeg solutions for Android seem a bit complex, so...
Is there any way to record MP4s from Android, with MediaRecorder, MediaCodec, or some other API, that results in a video file that is fast to stream? It amazes me how well Snapchat has figured this out.
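To make the setup concrete, the kind of MediaRecorder configuration described above (H.264/AAC in an MP4 container, with reduced bit rates) looks roughly like this; the bit rates, resolution, and output path are illustrative values, not ones from the original project:
    // Sketch of a MediaRecorder configuration along the lines described above.
    // Bit rates, resolution, and the output path are illustrative values only;
    // camera and preview wiring (setCamera / setPreviewDisplay) and runtime
    // permissions are omitted for brevity.
    import android.media.MediaRecorder;
    import java.io.IOException;

    public class StreamableRecorder {
        public MediaRecorder create(String outputPath) throws IOException {
            MediaRecorder recorder = new MediaRecorder();
            recorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
            recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
            recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
            recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
            recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
            recorder.setVideoSize(1280, 720);
            recorder.setVideoFrameRate(30);
            // Lower bit rates keep the resulting files small enough to stream comfortably.
            recorder.setVideoEncodingBitRate(2_000_000); // ~2 Mbps video
            recorder.setAudioEncodingBitRate(128_000);   // 128 kbps audio
            recorder.setOutputFile(outputPath);
            recorder.prepare();
            return recorder;
        }
    }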
Well, I ended up trying the stock MediaPlayer API again. As long as you're on API 16 or above, there should be no major issues with the default hardcoded buffer size.
I ended up making a small library in case others need a nice solution to video playback: https://github.com/afollestad/easy-video-player
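For anyone who wants the stock-MediaPlayer route without a wrapper library, a minimal streaming sketch looks like this (the URL and surface come from the caller; error handling is reduced to the basics):
    // Minimal sketch: streaming an MP4 over HTTP with the stock MediaPlayer (API 16+).
    import android.media.AudioManager;
    import android.media.MediaPlayer;
    import android.view.SurfaceHolder;
    import java.io.IOException;

    public class SimpleStreamPlayer {
        private final MediaPlayer player = new MediaPlayer();

        public void play(String url, SurfaceHolder display) throws IOException {
            player.setAudioStreamType(AudioManager.STREAM_MUSIC);
            player.setDisplay(display);        // surface from a SurfaceView
            player.setDataSource(url);         // e.g. an HTTP URL pointing at S3
            player.setOnPreparedListener(MediaPlayer::start);
            player.setOnErrorListener((mp, what, extra) -> true); // swallow errors in this sketch
            player.prepareAsync();             // buffer without blocking the UI thread
        }

        public void release() {
            player.release();
        }
    }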

How can the Periscope app broadcast video so successfully?

Twitter's new application Periscope broadcasts live video. I watched a broadcast for the first time a couple of minutes ago, and I wonder how it can stream live video over 3G without any freezing (at least I didn't see any; maybe somebody else has). Two or three weeks ago I tried Twitter's video post feature and it was a disaster. What is the difference between live streaming and uploading a recorded video? Or is it a difference between iPhone and Android?
The answer is not that simple.
HLS, for example, is how they do it on the web and how Meerkat does it, using short segment sizes to reduce the buffering and playlist-creation delay that HLS introduces.
On mobile they achieve 2-3 seconds of latency, which I have never seen with HLS.
Sniffing the connections, I can see that on mobile they use RTMP, which is far more expensive and far less scalable, to deliver that experience.
Here is a short article talking about that - note the comments about RTMP playback:
http://www.alamtechstuffs.com/periscope-livestreaming-app/
There's no secret; it's a well-established technique that is not Twitter-specific.
Uploaded videos are fetched using pseudo-streaming (progressive download), while the live stream is delivered using adaptive bitrate streaming, which means there are multiple renditions of the same live stream for different bandwidths. The player can then choose the version that makes the most of your connection, as the example manifest below illustrates.
http://en.wikipedia.org/wiki/Adaptive_bitrate_streaming
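As a concrete illustration, an HLS master playlist for an adaptive stream simply lists those renditions side by side; the bandwidths, resolutions, and URIs here are made up:
    #EXTM3U
    #EXT-X-STREAM-INF:BANDWIDTH=400000,RESOLUTION=426x240
    low/index.m3u8
    #EXT-X-STREAM-INF:BANDWIDTH=1200000,RESOLUTION=854x480
    mid/index.m3u8
    #EXT-X-STREAM-INF:BANDWIDTH=3000000,RESOLUTION=1280x720
    high/index.m3u8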

Adaptive streaming - smoothstreaming doesn't work

From the documentation: "Adaptive Streaming - Automatically adapts to either congestion or bandwidth availability". But this only works when the player starts (I use VideoView). If the internet speed drops while a video is playing, nothing happens, whereas the player should ideally switch video quality based on the current internet speed. So, my questions:
Does the Android video player support switching video quality in real time while a video is playing?
If yes, how do I implement this?
Thank you for your attention.
Update:
Take the YouTube Google TV app, for example: if the bandwidth changes while a video is playing, the player automatically switches to an appropriate video quality without any delay.
What do I have to do to make this work? I am using VideoView, and it only adapts when playback starts.
Thank you.
Typically you would implement your own subclass of VideoView that uses some form of QoS to monitor network bandwidth. If you are working on a Google TV application, you can use the GtvVideoView (which supports smooth streaming). To learn more, you can read up on it here: https://developers.google.com/tv/android/articles/hls?hl=en
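A very rough sketch of what such a QoS-driven subclass could look like, using TrafficStats as a crude throughput estimate and manually swapping between two rendition URLs; the threshold, polling interval, and URLs are assumptions, and a real smooth streaming or HLS player switches per segment instead:
    // Rough sketch: a VideoView subclass that polls a crude bandwidth estimate and
    // swaps between two rendition URLs by hand. The threshold, interval, and URLs are
    // illustrative; a proper adaptive player (ExoPlayer, GtvVideoView) switches per segment.
    import android.content.Context;
    import android.net.TrafficStats;
    import android.net.Uri;
    import android.os.Handler;
    import android.widget.VideoView;

    public class AdaptiveVideoView extends VideoView {
        private static final long POLL_MS = 5000;
        private static final long HIGH_QUALITY_BPS = 2_000_000; // ~2 Mbps threshold

        private final Handler handler = new Handler();
        private String lowUrl;
        private String highUrl;
        private String currentUrl;
        private long lastRxBytes;
        private long lastTimestamp;

        public AdaptiveVideoView(Context context) {
            super(context);
        }

        public void playAdaptive(String lowUrl, String highUrl) {
            this.lowUrl = lowUrl;
            this.highUrl = highUrl;
            switchTo(highUrl, 0);
            lastRxBytes = TrafficStats.getTotalRxBytes();
            lastTimestamp = System.currentTimeMillis();
            handler.postDelayed(this::checkBandwidth, POLL_MS);
        }

        private void checkBandwidth() {
            long nowBytes = TrafficStats.getTotalRxBytes();
            long now = System.currentTimeMillis();
            long bitsPerSecond = (nowBytes - lastRxBytes) * 8 * 1000 / Math.max(1, now - lastTimestamp);
            lastRxBytes = nowBytes;
            lastTimestamp = now;

            String wanted = bitsPerSecond >= HIGH_QUALITY_BPS ? highUrl : lowUrl;
            if (!wanted.equals(currentUrl)) {
                switchTo(wanted, getCurrentPosition()); // resume near the previous position
            }
            handler.postDelayed(this::checkBandwidth, POLL_MS);
        }

        private void switchTo(String url, int positionMs) {
            currentUrl = url;
            setVideoURI(Uri.parse(url));
            setOnPreparedListener(mp -> {
                seekTo(positionMs);
                start();
            });
        }
    }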

Disable Buffering on Android VideoView / MediaController

We are writing an Android app for the Samsung Galaxy Tab. We have an endoscope (medical surgery camera) as a DV input, and we want to live-stream the DV video to the tablet.
As a streaming server we use the VLC player with an RTSP stream. The encoding works fine, and streaming over the network (RTSP) to another computer is good (< 1 s latency). However, if we open the RTSP stream on the Galaxy Tab, there is a lag of 6-7 s.
I have tried lowering the encoding bitrate (even the lowest setting doesn't help; the streaming lag stays the same), so I think there must be some kind of network or video caching on Android itself.
I googled and didn't find a way to disable or even modify the caching in VideoView / MediaController.
Does anyone have an idea how to tweak the Android Streaming View?
Edit:
I figured out that it must be the internal buffer size that delays the video stream. LogCat tells me that AwesomePlayer is the video player in charge. So, next question: how do I change AwesomePlayer's buffer size? I think it's written in C++. How can I access this precompiled code via Eclipse/Java/Android?

Android Video Streaming - Device supported?

OK, so there are a bazillion different Android devices. I have a video streaming service that works wonderfully for iOS. My app has a live video feature and a saved-clip playback feature (which also streams to the device). I've run some tests on different Android devices and get a whole bunch of different playback results. I am using 640x480 H.264 baseline profile video. Streaming that video works only on some devices. For other devices, the same video can be streamed at low resolution, which works on some devices but still not others. The high-profile streaming goes through http://www.wowzamedia.com/ (RTSP) and doesn't work on any Android device (but works on iPhone). The lowest and worst option is Motion JPEG, which works on all devices tested so far.
So my question is: how can I figure out (without having to test every device on the market) whether the device will play 640x480 H.264 baseline profile; if that won't work, play the low-resolution video; and if that doesn't work, default to Motion JPEG?
Also, any idea why my RTSP stream transcoded through Wowza works on the iPhone but not on any Android device (not even the Motorola Atrix)?
Streaming on Android is an absolute mess. Most devices don't support anything higher than Baseline 3.0. If you encode for the iPhone 3, it should generally work via RTSP. Newer versions of Android support HLS, but it's hit or miss and largely dependent on the specific device.
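One way to avoid testing every device is to ask the platform at runtime what it can decode; a sketch using MediaCodecList (available since API 16) is below. If the check fails, fall back to the low-resolution stream, and to Motion JPEG as a last resort. Note that some decoders misreport their capabilities, so treat this as a best-effort filter rather than a guarantee:
    // Sketch: probe the device's H.264 decoding capabilities at runtime (API 16+)
    // to decide between the 640x480 baseline stream, the low-resolution stream, or MJPEG.
    import android.media.MediaCodecInfo;
    import android.media.MediaCodecList;

    public class AvcSupportCheck {
        /** True if some decoder advertises H.264 Baseline at level 3.0 or higher. */
        public static boolean supportsAvcBaseline30() {
            for (int i = 0; i < MediaCodecList.getCodecCount(); i++) {
                MediaCodecInfo info = MediaCodecList.getCodecInfoAt(i);
                if (info.isEncoder()) {
                    continue; // only decoders are relevant here
                }
                for (String type : info.getSupportedTypes()) {
                    if (!type.equalsIgnoreCase("video/avc")) {
                        continue;
                    }
                    MediaCodecInfo.CodecCapabilities caps = info.getCapabilitiesForType(type);
                    for (MediaCodecInfo.CodecProfileLevel pl : caps.profileLevels) {
                        if (pl.profile == MediaCodecInfo.CodecProfileLevel.AVCProfileBaseline
                                && pl.level >= MediaCodecInfo.CodecProfileLevel.AVCLevel3) {
                            return true;
                        }
                    }
                }
            }
            return false;
        }
    }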
I resolved this problem. Check the RTP implementation in your streaming service and the x264 profile. My RTSP server works fine on 90% of devices.
P.S.
The video frameworks in different Android versions can implement the RTP and RTSP protocols with slight differences.
These are some of the links/issues I have come across while trying to make streaming work on a variety of devices:
MediaPlayer seekTo doesn't work for streams
MediaPlayer resets position to 0 when started after seek to a different position
MediaPlayer seekTo inconsistently plays songs from beginning
Basic streaming audio works in 2.1 but not in 2.2
MediaPlayer.seekTo() does not work for unbuffered position
Streaming video when seek back buffering start again in videoView/Mediaplayer
Even the big shots on Stack Overflow are wondering about this.
If you just want streaming without seeking (which is lame), that can be achieved. But then, if you receive a call while you are watching, playback will start over from the beginning.
