We are developing a music streaming application for Android devices with adaptive bitrate support. We are using Wowza as the streaming server, which streams songs over HLS. We have converted each song into four bitrates and created SMIL files that are referenced in the HLS URLs. So the final URL will look something like this:
http://streaming.server.name:1935/vod/smil:audiofile.smil/playlist.m3u8
We have tested the app on multiple Android devices (Android 4.0 or later) including the Galaxy S2, Galaxy Note I, Sony Xperia, and Google Nexus. On all of these devices the songs stream correctly, except on the Galaxy S3. On the Galaxy S3 the song starts and plays up to the 6-second mark, then goes into a loop, playing the same 6-second portion again and again.
The same behavior is observed when the above URL is accessed through a browser.
We have checked the Wowza logs, but no errors were reported.
Has anyone tried something similar on S3 devices? Any guidance on how to debug this issue would be greatly appreciated.
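For context, a minimal sketch of the kind of playback code involved on the client (the URL is the example above; real code would add error and state handling):

    import android.media.AudioManager;
    import android.media.MediaPlayer;

    public class HlsPlaybackSketch {
        public static MediaPlayer play() throws java.io.IOException {
            MediaPlayer player = new MediaPlayer();
            // Route audio through the music stream so volume keys behave as expected.
            player.setAudioStreamType(AudioManager.STREAM_MUSIC);
            // Adaptive-bitrate HLS URL backed by the SMIL file on Wowza.
            player.setDataSource("http://streaming.server.name:1935/vod/smil:audiofile.smil/playlist.m3u8");
            // Start as soon as the stream is ready; prepareAsync() keeps the UI thread free.
            player.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
                @Override public void onPrepared(MediaPlayer mp) { mp.start(); }
            });
            player.prepareAsync();
            return player;
        }
    }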
The Android documentation states (though not clearly enough) that the TS container format only supports AAC audio. You are trying to play HLS with TS segments containing MP3 audio data (which is supported by iOS).
I can share the observation that doing so works on most Android devices, but not on all; the Galaxy S III is an example of one where it does not.
Please refer to the Android Supported Media Formats section in the documentation.
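If you want to confirm on-device what the segments actually contain, MediaExtractor can report the track MIME types (a sketch, API 16+; pass the URL of a single .ts segment, not the playlist):

    import android.media.MediaExtractor;
    import android.media.MediaFormat;
    import android.util.Log;

    public class SegmentInspectSketch {
        // Log the MIME type of every track in a media URL. For HLS TS segments,
        // audio/mp4a-latm means AAC; audio/mpeg means MP3, which is the problem case.
        public static void logTracks(String segmentUrl) throws java.io.IOException {
            MediaExtractor extractor = new MediaExtractor();
            extractor.setDataSource(segmentUrl);   // network fetch: call off the main thread
            for (int i = 0; i < extractor.getTrackCount(); i++) {
                MediaFormat format = extractor.getTrackFormat(i);
                Log.i("SegmentInspect", "track " + i + ": " + format.getString(MediaFormat.KEY_MIME));
            }
            extractor.release();
        }
    }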
I am building an application using electron.js that runs a socket server on a local network.
The idea is that the socket server sends messages to connected clients to dynamically load multiple HTML5 videos in a browser on a smart TV.
The videos are small demonstration videos and need to autoplay and loop.
There can be up to 12 videos playing at once on one device.
The problem I am experiencing is that either I cannot get smooth playback of the videos, or not all of the videos will play.
I have tried the following with 12 videos:
Native Smart TV browser. Result: Only plays 3 videos and playback is ok
Chromium Browser on Raspberry Pi 3. Result: Can play all 12 videos but playback is choppy. Forcing hardware acceleration crashes the browser
Chrome Browser on Quad core Android box. Result: Can only play 6 videos and playback is good
ionic cordova app on Android Box. Result: Can only play 6 videos and playback is good
react-native app on Android Box. Result: Can only play 6 videos and playback is good
My question is, what are my limitations or possible solutions?
Am I limited by hardware or is there something I can do in my application builds that may resolve the issue?
It is worth pointing out that in my desktop Chrome browser all 12 videos play without any problem, but this is not an option for me because they need to run on the TV.
Edit: It is worth mentioning that I am using MP4 video.
I am going to answer this myself to help others if they come across the same issue.
I ended up installing ffmpeg and encoding WebM video instead of MP4, and could get all 12 videos playing.
Then, by reducing the bitrate, I was able to get all 12 videos playing at the same time on the mini PCs at an acceptable quality.
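The conversion was along these lines; a hedged sketch wrapping the ffmpeg call (the file names and the 500k bitrate are illustrative, not the exact values used):

    import java.io.IOException;

    public class WebmEncodeSketch {
        // Re-encode an MP4 demo clip as VP8/Vorbis WebM at a reduced bitrate
        // so that several videos can be decoded at once on a low-powered device.
        public static void toWebm(String input, String output)
                throws IOException, InterruptedException {
            Process p = new ProcessBuilder(
                    "ffmpeg", "-i", input,
                    "-c:v", "libvpx", "-b:v", "500k",   // illustrative bitrate; tune per device
                    "-c:a", "libvorbis",
                    output)
                    .inheritIO()
                    .start();
            if (p.waitFor() != 0) {
                throw new IOException("ffmpeg failed for " + input);
            }
        }
    }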
I upload and play videos on both Android and iPhone devices, but video uploaded from an iPhone does not play on Android: the Android video player gives me the error message
"sorry this video can not be played"
The video is in MP4 format.
Yes, that's right.
It happens because Android ships with built-in support for only a limited set of codecs and containers (MP3, MP4, MPEG, and so on),
while the iPhone supports a wider range of codecs.
What is the way to resolve this?
MP4 for video and MP3 for audio are widely accepted and work on both platforms.
So you need to do some work on the server: use the ffmpeg library to convert all videos to MP4 and all audio to MP3.
We use the same mechanism to resolve this issue.
Find an FFMPEG implementation for PHP Here, and
a command to convert all videos to MP4 Here.
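As a concrete illustration of that server-side step (a sketch, not the linked PHP implementation; the file names are placeholders), one ffmpeg invocation that produces an MP4 Android can generally decode, pinning the H.264 baseline profile for the broadest decoder support:

    import java.io.IOException;

    public class Mp4ConvertSketch {
        // Convert an uploaded video to an Android-friendly MP4:
        // H.264 baseline profile video plus AAC audio.
        public static void toMp4(String input, String output)
                throws IOException, InterruptedException {
            Process p = new ProcessBuilder(
                    "ffmpeg", "-i", input,
                    "-c:v", "libx264", "-profile:v", "baseline", "-level", "3.0",
                    "-c:a", "aac", "-b:a", "128k",
                    "-movflags", "+faststart",   // moov atom up front so playback can start while downloading
                    output)
                    .inheritIO()
                    .start();
            if (p.waitFor() != 0) {
                throw new IOException("ffmpeg failed for " + input);
            }
        }
    }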
Hope this helps you.
Thanks.
If it is MP4, then you need to check which codecs are used. The iPhone usually encodes everything in H.264; however, there are different profiles of H.264, and high profiles might not be supported on Android because they are more complex to decode.
Even Apple says in their documentation:
H.264 Baseline Level 3.0, Baseline Level 3.1, Main Level 3.1, and High Profile Level 4.1.

iPad, iPhone 3G, and iPod touch (2nd generation and later) support H.264 Baseline 3.1. If your app runs on older versions of iPhone or iPod touch, however, you should use H.264 Baseline 3.0 for compatibility. If your content is intended solely for iPad, Apple TV, iPhone 4 and later, and Mac OS X computers, you should use Main Level 3.1.
The Baseline profile should play everywhere.
See the list here - http://en.wikipedia.org/wiki/H.264/MPEG-4_AVC#Profiles
So if you have control over the encoding (for example, if the video is recorded from your own iOS application), you can set the profile programmatically. I just googled and found a piece of code where the profile is set: http://forums.macrumors.com/archive/index.php/t-1512924.html
I have created an application for animal sounds. It plays a sound when an image is touched. It works properly on my Samsung Galaxy Ace, but it does not work properly on some devices such as the HTC Flyer and some Motorola models: for some animals I cannot hear the sound, while other animals work properly on all devices.
FYI: for playing the sounds I have used SoundPool, with .MP3 and .WAV files.
Some media codecs are not guaranteed to be available on all Android platform versions.
Refer to this.
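Ogg Vorbis and PCM WAV tend to behave most consistently with SoundPool across devices, and playing a sample before its load completes is a common cause of silent failures that look device-specific. A sketch that waits for the load callback (R.raw.lion is a placeholder resource name):

    import android.content.Context;
    import android.media.AudioManager;
    import android.media.SoundPool;

    public class AnimalSoundSketch {
        private final SoundPool pool = new SoundPool(4, AudioManager.STREAM_MUSIC, 0);
        private final int lionId;

        public AnimalSoundSketch(Context context) {
            // R.raw.lion is a placeholder for an Ogg Vorbis or PCM WAV asset.
            lionId = pool.load(context, R.raw.lion, 1);
            // play() before the load completes does nothing audible, so gate on the callback.
            pool.setOnLoadCompleteListener(new SoundPool.OnLoadCompleteListener() {
                @Override
                public void onLoadComplete(SoundPool soundPool, int sampleId, int status) {
                    if (status == 0 && sampleId == lionId) {
                        soundPool.play(sampleId, 1f, 1f, 1, 0, 1f);
                    }
                }
            });
        }
    }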
I am building an app that plays a live radio stream using the Android MediaPlayer class (not the open-source Nagare project). The stream plays fine in the emulator and on a G1 phone, but when I tested the app on my Samsung Galaxy i9003 it did not work and threw the exception "prepare called in wrong state". Do Samsung phones have trouble playing live radio URLs? If anybody has experience playing live radio streams using MediaPlayer, please give your suggestions.
Thanks
By default, Android does not support .m3u playlist URLs, so playlist handling needs to be implemented explicitly; maybe it is implemented on the G1. In any case, the codec stack used on Samsung devices is Samsung-specific and different from the one used on the G1, so there can be some issue with their codec. Can you share your code with me? I will try to fix the problem.
I wrote an app that plays a radio stream (tried with SHOUTcast) using the Android MediaPlayer class and tested it on a Samsung Nexus, and it works fine for me. So there may be an issue in the part of your code that does the playlist parsing; take another look at it.
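For reference, parsing a simple .m3u by hand is only a few lines; a sketch, assuming the playlist's first non-comment line is the stream URL (run the network fetch off the main thread):

    import android.media.MediaPlayer;

    import java.io.BufferedReader;
    import java.io.IOException;
    import java.io.InputStreamReader;
    import java.net.URL;

    public class M3uSketch {
        // Fetch an .m3u playlist and return the first stream URL it lists.
        public static String firstEntry(String playlistUrl) throws IOException {
            BufferedReader reader = new BufferedReader(
                    new InputStreamReader(new URL(playlistUrl).openStream()));
            try {
                String line;
                while ((line = reader.readLine()) != null) {
                    line = line.trim();
                    // Lines starting with '#' are comments/directives, not entries.
                    if (!line.isEmpty() && !line.startsWith("#")) {
                        return line;
                    }
                }
            } finally {
                reader.close();
            }
            throw new IOException("No stream entry found in " + playlistUrl);
        }

        // Hand the resolved stream URL to MediaPlayer instead of the .m3u itself.
        public static MediaPlayer play(String playlistUrl) throws IOException {
            MediaPlayer player = new MediaPlayer();
            player.setDataSource(firstEntry(playlistUrl));
            player.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
                @Override public void onPrepared(MediaPlayer mp) { mp.start(); }
            });
            player.prepareAsync();
            return player;
        }
    }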
OK, so there are a bazillion different Android devices. I have a video streaming service that works wonderfully for iOS. My app has a live video feature and a saved-clip playback feature (which also streams to the device). I've run tests on different Android devices and get a whole bunch of different playback results. I am using 640x480 H.264 baseline profile video. Streaming that video works only on some devices. For other devices, the same video can be streamed at low resolution, which works on some of them but still not others. The high-profile streaming goes through http://www.wowzamedia.com/ (RTSP) and doesn't work on any Android device (though it works on iPhone). The lowest and worst option is Motion JPEG, which works on all devices tested so far.
So my question is: without having to test every device on the market, how can I figure out whether a device will play 640x480 H.264 baseline profile video, fall back to the low-resolution video if it won't, and default to Motion JPEG if that also fails?
Also, any idea why my RTSP stream transcoded through Wowza works on the iPhone but not on any Android device (not even the Motorola Atrix)?
Streaming on Android is an absolute mess. Most devices don't support anything higher than Baseline 3.0. If you encode for the iPhone 3, it should generally work via RTSP. Newer versions of Android support HLS, but it's hit or miss and largely dependent on the specific device.
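One programmatic option for the fallback ladder, which this answer does not mention but which fits the question (a sketch, API 16+): ask the codec list whether any decoder advertises H.264 baseline support, and drop down a tier when it doesn't:

    import android.media.MediaCodecInfo;
    import android.media.MediaCodecList;

    public class AvcSupportSketch {
        // Returns true if any on-device decoder advertises H.264 baseline support.
        public static boolean supportsAvcBaseline() {
            for (int i = 0; i < MediaCodecList.getCodecCount(); i++) {
                MediaCodecInfo info = MediaCodecList.getCodecInfoAt(i);
                if (info.isEncoder()) continue;
                for (String type : info.getSupportedTypes()) {
                    if (!type.equalsIgnoreCase("video/avc")) continue;
                    MediaCodecInfo.CodecCapabilities caps = info.getCapabilitiesForType(type);
                    for (MediaCodecInfo.CodecProfileLevel pl : caps.profileLevels) {
                        if (pl.profile == MediaCodecInfo.CodecProfileLevel.AVCProfileBaseline) {
                            return true;
                        }
                    }
                }
            }
            return false;   // fall back to the low-res stream, then Motion JPEG
        }
    }

Advertised capabilities don't capture every resolution or bitrate quirk, so keeping a runtime fallback on MediaPlayer's error callback is still worthwhile.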
I resolved this problem. Check the RTP implementation in your streaming service and the x264 profile. My RTSP server works fine on 90% of devices.
P.S. The video frameworks in different Android versions can implement the RTP and RTSP protocols with some differences.
These are some of the links/issues I have come across while trying to make streaming work on varied devices:
MediaPlayer seekTo doesn't work for streams
MediaPlayer resets position to 0 when started after seek to a different position
MediaPlayer seekTo inconsistently plays songs from beginning
Basic streaming audio works in 2.1 but not in 2.2
MediaPlayer.seekTo() does not work for unbuffered position
Streaming video when seek back buffering start again in videoView/Mediaplayer
Even the big shots on Stack Overflow are wondering about this.
If you want just streaming without seeking (which is lame), it can be achieved. But then if you receive a call while you are watching, you will end up back at the start.