I am trying to create an Android App to stream live/archived videos from my church's website.
However, I ran into a problem: all of the streams are served as .flv (Flash) videos and/or Flash players...
I have successfully been able to load .3gp videos in a VideoView, but because Android doesn't support Flash natively I tried to open the videos via a WebView.
This didn't work. At least, not for the links that I am working with. I can open youtube.com and click on any video to play it, but I can't play any of the streams from the church website.
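Roughly what I have so far, boiled down to a sketch (the class name, layout IDs, and URLs below are placeholders, not the real church links):

```java
import android.app.Activity;
import android.net.Uri;
import android.os.Bundle;
import android.webkit.WebView;
import android.widget.MediaController;
import android.widget.VideoView;

public class PlayerActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_player); // hypothetical layout with a VideoView and a WebView

        // This part works: a progressive .3gp plays fine in a plain VideoView.
        VideoView videoView = (VideoView) findViewById(R.id.video_view);
        videoView.setMediaController(new MediaController(this));
        videoView.setVideoURI(Uri.parse("http://example.com/archive/service.3gp")); // placeholder URL
        videoView.start();

        // This part doesn't: the page loads, but the Flash-based player never plays the stream.
        WebView webView = (WebView) findViewById(R.id.web_view);
        webView.getSettings().setJavaScriptEnabled(true);
        webView.loadUrl("http://example.com/live-player"); // placeholder URL
    }
}
```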
My question:
Is there any way for me to make this work?
I have access to
1) rtsp stream of .f4v
2) http stream of .m3u8
3) rtmp stream of .f4v
I have spent two days searching the web for ideas or fixes, and nothing I have found works for my particular case.
It seems to me that the only option is to have the church serve .3gp/.mp4 files directly that I can access.
Otherwise, I have no clue how to make .f4v files work. No luck with WebViews yet...
Do any of you have any suggestions for me?
P.S. I will also have to create an iOS app, so I'm looking for a solution that works on both platforms.
Thanks for your time!
To answer my own question:
It seems that the Android emulator cannot play Flash/m3u8 files.
However, my Nexus 7 does just fine with both VideoView and WebView!
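For reference, the working case really is just pointing a stock VideoView at the .m3u8 playlist. A minimal sketch of what I mean (the helper class and URL are made up for illustration):

```java
import android.media.MediaPlayer;
import android.net.Uri;
import android.util.Log;
import android.widget.VideoView;

public final class HlsPlayback {
    private HlsPlayback() {}

    /** Points an existing VideoView at an HLS playlist; the URL passed in is a placeholder. */
    public static void playHls(VideoView videoView, String m3u8Url) {
        videoView.setVideoURI(Uri.parse(m3u8Url));
        videoView.setOnErrorListener(new MediaPlayer.OnErrorListener() {
            @Override
            public boolean onError(MediaPlayer mp, int what, int extra) {
                // On the emulator this fires almost immediately; on the Nexus 7 it never did.
                Log.e("HlsPlayback", "Playback error: what=" + what + " extra=" + extra);
                return true; // suppress the generic "Can't play this video" dialog
            }
        });
        videoView.start();
    }
}
```

Called from the Activity as something like `HlsPlayback.playHls(videoView, "http://example.com/live/stream.m3u8");`.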
A cool library I found is Vitamio, which is supposed to solve the problem I had.
I didn't end up using it, however.
Related
I need to stream RTSP links within a VideoView. RTSP links with a .mov output, such as rtsp://184.72.239.149/vod/mp4:BigBuckBunny_115k.mov, work fine. However, one RTSP link I got from a confidential source has an H.264 output, according to VLC player.
TL;DR: how do you implement streaming of this type of RTSP link, and if there's no clear way to code it, are there any external libraries for Android Studio that handle this easily? I'm kind of at a loss here.
EDIT: Changed title. "Streaming" can also be interpreted as sending RTSP video from your Android device; that's NOT what I want to accomplish. A lot of examples on GitHub are heavily focused on SD card storage and sending video outward, but I am still looking for a way to play RTSP videos with H.264 output in my application.
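For context, what I have right now is essentially just a VideoView pointed at the RTSP URL, which is enough for the .mov test link but may or may not be enough for the H.264 one, depending on the device's codec support (the class name here is just for illustration):

```java
import android.app.Activity;
import android.net.Uri;
import android.os.Bundle;
import android.widget.MediaController;
import android.widget.VideoView;

public class RtspActivity extends Activity {
    // The public test stream from above; the confidential H.264 link would go here instead.
    private static final String RTSP_URL = "rtsp://184.72.239.149/vod/mp4:BigBuckBunny_115k.mov";

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        VideoView videoView = new VideoView(this);
        setContentView(videoView);

        videoView.setMediaController(new MediaController(this));
        videoView.setVideoURI(Uri.parse(RTSP_URL));
        videoView.start();
        // Whether an H.264-in-RTSP stream plays through this path depends entirely on the
        // device's built-in MediaPlayer/codec support; when it doesn't, a third-party player
        // library is the usual fallback.
    }
}
```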
Try the MediaPlayerSDK from VXG. That is the only one that seems to work, and it is open source.
https://github.com/VideoExpertsGroup/MediaPlayerSDK
However, according to some posts this can be done using libFFMPEG as well. I didn't try it, but you can give it a shot.
I am currently developing a video streaming feature for one of my Android apps. I am using the Android media framework for the purpose. Videos are streamed from an nginx server. Android-recorded videos work fine, but iOS-recorded videos play only the video, not the sound.
This happens because Android has limited built-in codec support (e.g. MP3, MP4, MPEG), while the iPhone supports most codecs.
What is the way to resolve this?
MP4 for video and MP3 for audio are widely accepted and work on both platforms.
So you need to do some work on the server: use the ffmpeg library to convert all the videos to MP4 and the audio to MP3.
We use the same mechanism to resolve this issue.
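To make that concrete, here is a rough sketch of the kind of conversion step we mean on the server side. It assumes ffmpeg is installed there, shells out to it from Java, and uses AAC rather than MP3 as the audio track inside the MP4 (the usual pairing for the stock MediaPlayer); the class name and paths are made up for illustration:

```java
import java.io.File;
import java.io.IOException;

public final class Transcoder {
    private Transcoder() {}

    /**
     * Re-encodes an uploaded video to H.264/AAC in an MP4 container so the stock Android
     * MediaPlayer can decode both the video and the audio track.
     * Assumes ffmpeg is installed on the server; paths and class name are placeholders.
     */
    public static void toMp4(File input, File output) throws IOException, InterruptedException {
        ProcessBuilder pb = new ProcessBuilder(
                "ffmpeg", "-y",
                "-i", input.getAbsolutePath(),
                "-c:v", "libx264",         // video -> H.264
                "-c:a", "aac",             // audio -> AAC (decodes on both Android and iOS)
                "-movflags", "+faststart", // put the moov atom up front for progressive playback
                output.getAbsolutePath());
        pb.inheritIO(); // forward ffmpeg's console output to the server log
        int exitCode = pb.start().waitFor();
        if (exitCode != 0) {
            throw new IOException("ffmpeg exited with code " + exitCode);
        }
    }
}
```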
Some more information to understand the problem: refer to the Stack Overflow answer here.
Hope this helps you get rid of your problem.
Happy Coding!
Does anyone have a recommendation on how to improve HLS on Android when running XBMC?
I read that native HLS on Android is pretty bad, but I found no recommended solution on how to improve it.
I'm not even sure XBMC uses the native HLS decoding.
It's just impossible to watch a video: it caches some of the video, plays, stops to cache again, plays, ...
I would really appreciate any suggestions.
I use an Android stick. It works great for everything except HLS videos.
(I also use XBMC on a tablet, phone, etc.)
You can use Vitamio for Android to run HLS.
I use it and it works perfectly.
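The usual setup looks roughly like this; the class and method names are from Vitamio's sample code as I remember it, so double-check them against the version you pull in, and the HLS URL is a placeholder:

```java
import android.app.Activity;
import android.os.Bundle;

import io.vov.vitamio.LibsChecker;
import io.vov.vitamio.widget.MediaController;
import io.vov.vitamio.widget.VideoView;

public class VitamioHlsActivity extends Activity {
    private static final String HLS_URL = "http://example.com/live/stream.m3u8"; // placeholder

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // Vitamio ships its own native decoders; this extracts and checks them on first run.
        if (!LibsChecker.checkVitamioLibs(this)) {
            return;
        }
        VideoView videoView = new VideoView(this); // Vitamio's VideoView, not android.widget's
        setContentView(videoView);

        videoView.setMediaController(new MediaController(this));
        videoView.setVideoPath(HLS_URL);
        videoView.requestFocus();
        videoView.start();
    }
}
```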
Here's the short of it.
I'm using an HLS stream with JWPlayer6 for an iOS/Android app I am working on. JWPlayer (http://www.longtailvideo.com/) works well and falls back to other streaming and player types gracefully. The problem is this: when I want to securely play back audio only on iOS, the player is just a condensed black rectangle with a play icon in the middle. If I click on the audio it plays fine, but it launches the file in a QuickTime window, and the QuickTime player window completely covers the app, so you can't listen to the file and continue to use the app.
Is there any possible way to play an audio-only stream that uses an actually embeddable player, instead of the default mechanism of launching audio media types on iOS and Android?
I have already used JPlayer to achieve the functionality I want, by placing the js/css player at the bottom of the app in a fixed position so that you can still browse the app while the file is being played. But this is only a demo solution to show my boss how it would work; in the end we will need a solution that can securely stream the audio with a player that keeps the controls in-app on the page, rather than launching QuickTime, which even after playing the file does not return you to the app's other views.
Any suggestions would be greatly appreciated.
Lol... anyway, I finally figured out a solution that works for me in my particular case. Since the HTML side of things for our application resides in a WebView on iOS, thanks to PhoneGap, I was able to do the following:
HTML5 inline video on iPhone vs iPad/Browser
and now the player finally plays inline, just like on the iPad. I tested this and it works flawlessly. I also tested it without adding the Obj-C flag in the app code, and it does not work. This means the solution I've provided only works if you are packaging your application using HTML and Obj-C, as with PhoneGap or Titanium etc. This is EXACTLY what I was looking for. Now I can use all of JW's awesomeness and still play nice with iOS, giving a great user experience to all of our members... thanks for your time and patience. As a side note, the inline audio-only player even works flawlessly on Android right out of the box; no Java code needed to edit anything. That surprised me, because I know HLS is not really all that well supported yet on Android.
I have just referred to the PhoneGap Documentation for Media, which stated:
"The Media object provides the ability to record and play back audio files on a device."
I wonder whether video streaming (RTSP) is possible with the PhoneGap framework. I am trying this out on Android.
Any suggestions highly appreciated.
Yes, streaming to the device is possible.
Just like with a normal website/webapplication.
I don't know if it's finished yet, but this is an audio stream script:
http://www.joeldare.com/wiki/play_an_mp3_audio_stream_in_phonegap
I hope that will get you started.
EDIT:
Maybe this will also help you get started:
http://groups.google.com/group/phonegap/browse_thread/thread/584028fe07b0c869/47a1af617c94540a?#47a1af617c94540a
The streaming video can simply be opened in a child browser or in-app browser to play it. This solution will work on both Android and iOS.
Also, if the stream is opened in the browser it will appear in full-screen mode, since a media player will play the video in Safari.
Hope this will help.