We are working on a video streaming application for iOS and Android, using a Wowza server. The application works fine from iOS to iOS and from Android to Android; that is, a video published from an iOS device can be viewed on an iOS device, but not on an Android device.
I know both Android and iOS support H.264 compression, and we are publishing an H.264-formatted stream. There is a bit of confusion here: I think H.264 is a compression technique that is then wrapped in a container such as MP4, FLV, etc. Please confirm.
My guess is that iOS publishes the H.264 stream in something like a .MOV container, which Android does not support, and that is why it does not play on Android. Please confirm.
Please suggest a way to play a video stream published from the iOS app on an Android device.
There are two aspects to video files: The container and the encoding (or codec). H.264 is an encoding, and Android can deal with it, but Apple uses the QuickTime container format, which is similar to the MP4 container but apparently just different enough that Android can't handle it. Android can play MP4 files, and there are utilities to convert QuickTime to MP4, if that helps.
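To make the codec/container distinction concrete, here is a small sketch (the file path and class/log-tag names are placeholders, not anything from the question) that uses Android's MediaExtractor to report the codec MIME type of each track; an H.264 track shows up as video/avc no matter which container it sits in:

    import java.io.IOException;

    import android.media.MediaExtractor;
    import android.media.MediaFormat;
    import android.util.Log;

    public class TrackInspector {
        // Logs the MIME type of every track in the given file, e.g. "video/avc"
        // for H.264 video, independently of the container around it.
        public static void logTrackMimeTypes(String path) {
            MediaExtractor extractor = new MediaExtractor();
            try {
                extractor.setDataSource(path); // fails if the container itself is unsupported
                for (int i = 0; i < extractor.getTrackCount(); i++) {
                    MediaFormat format = extractor.getTrackFormat(i);
                    Log.d("TrackInspector", "Track " + i + ": " + format.getString(MediaFormat.KEY_MIME));
                }
            } catch (IOException e) {
                Log.e("TrackInspector", "Container could not be opened: " + path, e);
            } finally {
                extractor.release();
            }
        }
    }

If setDataSource already fails on the .mov file while the same H.264/AAC content remuxed into an .mp4 opens fine, that points at the container rather than the codec.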
We provide a facility to upload and stream video lectures to different schools. I recommend that the H.264 video codec be used for uploads. We are facing an issue on Android, where H.264 is only supported on Android 6.0.0 and later. Which video format should we use to stream videos on both the Android app and the web app?
You are correct: the H.264 Main Profile is only supported in Android 6+.
However, if you use the H.264 Baseline Profile, you can go back as far as Android 3.
Source: Android Developers. You can also check out other formats that are supported there.
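If you want to check at runtime which H.264 profiles a given device's decoders actually advertise, a sketch along these lines can help; it uses the MediaCodecList/MediaCodecInfo APIs available since API 16, and the class and log-tag names are just illustrative:

    import android.media.MediaCodecInfo;
    import android.media.MediaCodecList;
    import android.util.Log;

    public class AvcProfileCheck {
        // Logs every AVC (H.264) profile/level pair advertised by the device's decoders.
        public static void logAvcDecoderProfiles() {
            for (int i = 0; i < MediaCodecList.getCodecCount(); i++) {
                MediaCodecInfo info = MediaCodecList.getCodecInfoAt(i);
                if (info.isEncoder()) {
                    continue; // only decoders matter for playback
                }
                for (String type : info.getSupportedTypes()) {
                    if (!"video/avc".equalsIgnoreCase(type)) {
                        continue;
                    }
                    MediaCodecInfo.CodecCapabilities caps = info.getCapabilitiesForType(type);
                    for (MediaCodecInfo.CodecProfileLevel pl : caps.profileLevels) {
                        boolean baseline = pl.profile == MediaCodecInfo.CodecProfileLevel.AVCProfileBaseline;
                        boolean main = pl.profile == MediaCodecInfo.CodecProfileLevel.AVCProfileMain;
                        Log.d("AvcProfileCheck", info.getName()
                                + " profile=" + pl.profile
                                + (baseline ? " (Baseline)" : (main ? " (Main)" : ""))
                                + " level=" + pl.level);
                    }
                }
            }
        }
    }

Seeing only Baseline entries on older devices is a good hint to keep the published stream at Baseline Profile.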
I am trying to get HLS streaming working on Android.
I have set up an HLS streaming server (Apache 2) on an Ubuntu desktop and am able to play the stream using the VLC player on the desktop.
But when I try to play the stream using the VLC player on Android, the video does not play, nor do I get any error.
If anyone has tried similar streaming, please provide your inputs.
Thanks
Following some further investigation, I've found the following information that can hopefully help other people get HLS streaming on Android working.
Encoding - The video encoding and the segmentation setup can have a large impact on which Android versions can play the video. I ended up creating a video using HandBrake, with the following settings:
MP4 File
H.264; Baseline Profile; Level 3
AAC Audio; 44.1 kHz; 128 kbit/s (Note: I found that Jelly Bean was a lot pickier about the audio than ICS/Honeycomb. Some audio bitrates would create videos that Jelly Bean would not play at all. In general, mono and low-bitrate audio seemed to work better on Jelly Bean.)
Segmentation - Using the Apple MediaFileSegmenter, I found that adding the "-no-floating-point-duration" and "-z none" flags allowed me to create a video that worked across Android 3.0 to 4.2.
Gingerbread - I was unable to get Android 2.3 to work with HLS out of the box, but I did find that using the Vitamio library worked pretty well (see this question for further info).
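As a further check, it can be worth trying the stream with the platform player rather than VLC. Below is a minimal sketch; the m3u8 URL is a placeholder for your own Apache-served playlist, the VideoView is created in code to keep it self-contained, and the manifest needs the INTERNET permission:

    import android.app.Activity;
    import android.net.Uri;
    import android.os.Bundle;
    import android.widget.MediaController;
    import android.widget.VideoView;

    public class HlsTestActivity extends Activity {
        // Placeholder URL: point this at the playlist served by your Apache setup.
        private static final String STREAM_URL = "http://192.168.0.10/hls/stream.m3u8";

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);

            // Created programmatically so no layout XML is required for the sketch.
            VideoView videoView = new VideoView(this);
            setContentView(videoView);

            videoView.setMediaController(new MediaController(this));
            videoView.setVideoURI(Uri.parse(STREAM_URL));
            videoView.requestFocus();
            videoView.start();
        }
    }

On Android 3.0 and later the stock player handles HLS (with the version-specific caveats described above), so it makes a useful baseline when a third-party player misbehaves.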
Is it possible to use (deploy) HTTP Live Streaming (HLS) on Android (4.x)?
https://developer.apple.com/streaming/
Obviously iOS devices can both capture and play, and I know Android can at least play, but what about capturing? I am wondering about interoperability.
Thanks.
The best answer I have found so far is:
Creating a HLS video stream with FFmpeg
12 May 2013
http://walterebert.com/blog/creating-on-hls-video-stream-with-ffmpeg/
For video conversion I use FFmpeg. Creation of HLS is possible with FFmpeg, but not really well documented. So I had to figure out how to create the video streams. After a lot of research and experimentation I created my FFmpeg HLS reference implementation that is available on Bitbucket.
On iOS the created video plays without problems on new devices. Older iOS devices with a maximum resolution of 480×320 pixels seem to select the best-quality stream available, even if they cannot play it. For Android you have to create an MP4 video first and then convert it into an MPEG stream; doing this in a single command creates a choppy stream on Android. Flash playback still has some issues if you change the bitrate. So I still have some work to do.
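To illustrate that two-step approach, here is a rough sketch that shells out to FFmpeg from Java: first an encode to a Baseline-profile MP4, then a copy/segment pass into an HLS playlist. The file names are placeholders, and the exact FFmpeg options are an assumption that may need adjusting for your FFmpeg build (older builds need "-strict experimental" for the native AAC encoder); the linked reference implementation remains the authoritative version.

    import java.io.IOException;
    import java.util.Arrays;

    public class HlsPackager {
        // Step 1: encode a Baseline-profile MP4. Step 2: remux/segment it into HLS.
        // Doing both in a single FFmpeg command was reported to produce choppy
        // playback on Android, hence the intermediate MP4 file.
        public static void main(String[] args) throws IOException, InterruptedException {
            run("ffmpeg", "-y", "-i", "input.mov",
                    "-c:v", "libx264", "-profile:v", "baseline", "-level", "3.0",
                    "-c:a", "aac", "-b:a", "128k",
                    "intermediate.mp4");

            run("ffmpeg", "-y", "-i", "intermediate.mp4",
                    "-c", "copy",
                    "-f", "hls", "-hls_time", "10", "-hls_list_size", "0",
                    "stream.m3u8");
        }

        private static void run(String... command) throws IOException, InterruptedException {
            System.out.println("Running: " + Arrays.toString(command));
            Process process = new ProcessBuilder(command).inheritIO().start();
            if (process.waitFor() != 0) {
                throw new IOException("Command failed: " + Arrays.toString(command));
            }
        }
    }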
Yes. HLS is widely used on Android 4.x.
I am trying to play a video in Android native code using the new MediaCodec API. I don't want to go the MediaPlayer way for unavoidable reasons. Can anybody share a code snippet showing how to go about it? Thanks in advance.
Your original question is too generic. And to be honest, creating a new media player in native code is a huge task to take on by yourself.
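To give a sense of what is involved, here is a bare-bones sketch of just the video path: MediaExtractor feeding a MediaCodec decoder that renders to a Surface, shown here with the Java MediaCodec API for brevity. The class name, file path, timeout and error handling are placeholders; a real player would still need audio decoding, A/V sync, seeking and so on.

    import java.io.IOException;
    import java.nio.ByteBuffer;

    import android.media.MediaCodec;
    import android.media.MediaExtractor;
    import android.media.MediaFormat;
    import android.view.Surface;

    public class MediaCodecVideoDecoder {
        private static final long TIMEOUT_US = 10000;

        // Decodes the first video track of the file and renders it onto the surface.
        public static void decodeTo(Surface surface, String path) throws IOException {
            MediaExtractor extractor = new MediaExtractor();
            extractor.setDataSource(path);

            // Find and select the first video track.
            MediaFormat format = null;
            String mime = null;
            for (int i = 0; i < extractor.getTrackCount(); i++) {
                MediaFormat f = extractor.getTrackFormat(i);
                String m = f.getString(MediaFormat.KEY_MIME);
                if (m.startsWith("video/")) {
                    extractor.selectTrack(i);
                    format = f;
                    mime = m;
                    break;
                }
            }
            if (format == null) {
                extractor.release();
                throw new IOException("No video track in " + path);
            }

            MediaCodec codec = MediaCodec.createDecoderByType(mime);
            codec.configure(format, surface, null, 0);
            codec.start();

            ByteBuffer[] inputBuffers = codec.getInputBuffers();
            MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
            boolean inputDone = false;
            boolean outputDone = false;

            while (!outputDone) {
                // Feed compressed samples from the extractor into the decoder.
                if (!inputDone) {
                    int inIndex = codec.dequeueInputBuffer(TIMEOUT_US);
                    if (inIndex >= 0) {
                        int size = extractor.readSampleData(inputBuffers[inIndex], 0);
                        if (size < 0) {
                            codec.queueInputBuffer(inIndex, 0, 0, 0,
                                    MediaCodec.BUFFER_FLAG_END_OF_STREAM);
                            inputDone = true;
                        } else {
                            codec.queueInputBuffer(inIndex, 0, size,
                                    extractor.getSampleTime(), 0);
                            extractor.advance();
                        }
                    }
                }

                // Pull decoded frames and render them (true = release to the surface).
                int outIndex = codec.dequeueOutputBuffer(info, TIMEOUT_US);
                if (outIndex >= 0) {
                    codec.releaseOutputBuffer(outIndex, true);
                    if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                        outputDone = true;
                    }
                }
            }

            codec.stop();
            codec.release();
            extractor.release();
        }
    }

Even this leaves out most of what a player needs, which is why a library is usually the better route.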
If you are only looking for a media player solution with better support for a variety of formats/codecs, like the VLC player, you can try the VLC library, which is open source but still in beta. I have tried VLC, but it has some crash and ANR issues inside the framework itself.
Or you can try the Vitamio SDK, which is a library distributed without source code. Check it out at this link: https://github.com/yixia/VitamioBundle. Below is its feature list:
I have tried this solution and it is very stable; there are some minor issues on 4.3, but still acceptable. So I am not posting any spam here, just copying from the official documentation:
Vitamio is an open multimedia framework or library for Android and iOS, with full and real hardware accelerated decoder and renderer. It's the simple, clean and powerful API of Vitamio that makes it famous and popular in multimedia apps development for Android and iOS.
According to the developers' feedback, Vitamio has been used by more than 1000 apps and 100 million users around the world.
Vitamio can play 720p/1080p HD MP4, MKV, M4V, MOV, FLV, AVI, RMVB, RM, TS, TP and many other video formats on Android and iOS. Almost all popular streaming protocols are supported by Vitamio, including HLS (m3u8), MMS, RTSP, RTMP and HTTP.
Network Protocols
The following streaming protocols are supported for audio and video playback:
MMS
RTSP (RTP, SDP), RTMP
HTTP progressive streaming
HLS - HTTP live streaming (M3U8)
And yes, Vitamio can handle on demand and live videos in all above protocols.
Media formats
Vitamio uses FFmpeg for demuxing and as its main decoder; many audio and video codecs are packed into Vitamio beyond the default media formats built into the Android platform. Some of them are listed below.
DivX/Xvid
WMV
FLV
TS/TP
RMVB
MKV
MOV
M4V
AVI
MP4
3GP
Subtitles
Vitamio supports the display of many external and embedded subtitle formats.
SubRip(.srt)
Sub Station Alpha(.ssa) / Advanced Sub Station Alpha(.ass)
SAMI(.smi/.sami)
MicroDVD(.sub/.txt)
SubViewer2.0(.sub)
MPL2(.mpl/.txt)
Matroska (.mkv) Subtitle Track
More features
Support for a wide range of screens, from small phones to large tablets
Multiple audio track support
Multiple subtitle support, including external and embedded ones
Processor optimization for many platforms
Buffering when streaming
Adjustable aspect ratio
Automatic text encoding detection
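For reference, playing a stream with Vitamio looks much like using the stock VideoView. The sketch below is based on my recollection of the Vitamio sample projects, so treat the package and helper class names (io.vov.vitamio.LibsChecker, io.vov.vitamio.widget.VideoView) as assumptions that may differ between releases; the stream URL is a placeholder.

    import android.app.Activity;
    import android.os.Bundle;

    import io.vov.vitamio.LibsChecker;
    import io.vov.vitamio.widget.MediaController;
    import io.vov.vitamio.widget.VideoView;

    public class VitamioPlayerActivity extends Activity {
        // Placeholder URL; Vitamio accepts HTTP, HLS (m3u8), RTSP, RTMP and MMS sources.
        private static final String STREAM_URL = "rtmp://example.com/live/stream";

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            // Extracts and initialises the bundled native decoders before first use.
            if (!LibsChecker.checkVitamioLibs(this)) {
                return;
            }

            // Created programmatically so no layout XML is required for the sketch.
            VideoView videoView = new VideoView(this);
            setContentView(videoView);

            videoView.setVideoPath(STREAM_URL);
            videoView.setMediaController(new MediaController(this));
            videoView.requestFocus();
        }
    }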
We have developed video streaming applications for Android and iOS. In these applications the user can publish video and view live streams as well. The app works fine from iOS to iOS, but the stream cannot be played on Android. If we publish from Android, it plays on Android and in an RTMP Flash player, but not on iOS.
From iOS we are publishing video in H.264 format, and Android supports H.264, so why is it not playing on Android?
I can only guess, but it could be a problem with the file extension. iOS may expect *.m4v, and I'm not sure whether Android can handle that. Also check this topic; it may help you: What h.264 format loads on android AND IOS?