This is my problem: I have an .avi video file on my hosting and I want the users of my Android application to be able to stream it on their smartphones.
There are two ways:
1) Convert this file to MP4 format and play it in my application with MediaPlayer.
2) Check that RockPlayer is installed on the user's device and then use it to display this AVI file.
Which is the better way?
Please look at this link. It is a very good example of Android media streaming.
It covers media streaming on Android devices using the VideoView object available in the android.widget package. This widget allows playback of audio or video from local or remote resources.
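A minimal sketch of that approach (the stream URL below is just a placeholder for your own remote media):

```java
import android.app.Activity;
import android.net.Uri;
import android.os.Bundle;
import android.widget.MediaController;
import android.widget.VideoView;

public class StreamActivity extends Activity {

    // Placeholder URL; replace with your own remote media resource.
    private static final String STREAM_URL = "http://example.com/video.mp4";

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        VideoView videoView = new VideoView(this);
        setContentView(videoView);

        // Attach the standard transport controls (play/pause/seek).
        MediaController controller = new MediaController(this);
        controller.setAnchorView(videoView);
        videoView.setMediaController(controller);

        // Point the widget at the remote resource and start playback
        // once enough data has been buffered.
        videoView.setVideoURI(Uri.parse(STREAM_URL));
        videoView.requestFocus();
        videoView.start();
    }
}
```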
Choose option 1. You don't want to "sponsor" third-party applications such as RockPlayer when building your own app. What if the user doesn't have an internet connection when they try to use your app, and hasn't downloaded RockPlayer beforehand? As much as possible, you want your app to be independent of other market apps, especially those you do not own.
I have an Android application for live streaming and recording from a hardware device over Wi-Fi. The hardware device can store images and video internally, and we can transfer these videos to our mobile application. I am receiving the videos and am able to download them to my phone's storage, but the video arrives in .mts format and I am unable to play it using the native Android MediaPlayer class. I would like to ask whether there is any method/library available to play .mts videos. Thanks in advance.
The default native MediaPlayer does not support the .mts file format. You are better off using the Vitamio library for that. Reference links: https://stackoverflow.com/a/8261864/3912847 and https://karanbalkar.com/2014/11/tutorial-92-live-streaming-using-vitamio-in-android/
You can download the Vitamio library from here.
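As a rough sketch of the Vitamio route (the package and class names, such as io.vov.vitamio.widget.VideoView and LibsChecker, follow the Vitamio sample projects and may differ between library versions, so treat them as assumptions):

```java
import android.app.Activity;
import android.os.Bundle;

// Class names below follow the Vitamio sample code; verify them against
// the version of the library you download.
import io.vov.vitamio.LibsChecker;
import io.vov.vitamio.widget.MediaController;
import io.vov.vitamio.widget.VideoView;

public class MtsPlayerActivity extends Activity {

    // Hypothetical local path to the downloaded .mts file.
    private static final String MTS_PATH = "/sdcard/recordings/clip.mts";

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        // Vitamio ships its own native decoders, which must be unpacked on first run.
        if (!LibsChecker.checkVitamioLibs(this)) {
            return;
        }

        VideoView videoView = new VideoView(this);
        setContentView(videoView);

        videoView.setMediaController(new MediaController(this));
        videoView.setVideoPath(MTS_PATH);
        videoView.start();
    }
}
```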
We want to create a live-broadcasting/streaming platform where anyone can go live at the click of a button using his/her mobile camera. The same live feed should be viewable from the native mobile app. To start with, we would like to support live broadcasting and viewing the live feed on both iOS and Android. We are using Wowza as the media streaming server.
Use case: Let's say, sitting at home, I want to show my new home to all my friends. I download the mobile app on Android and start a live stream at the click of a button. My friends, who have also downloaded the same mobile app, can watch my live stream on their phones. They can also see some VOD content.
I would like to understand how to play the live stream / VOD in a native Android app.
Thanks in Advance :)
OpenMAX for Android will give you the best flexibility and control; however, it is a low-level API that mandates C++ and NDK usage. You can also use ffmpeg static libs in the same manner.
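If you don't need that level of control, a much simpler (though less flexible) route is to hand Wowza's playback URL straight to the framework's MediaPlayer: stock Android has supported RTSP for a long time and HLS since roughly Android 3.0. A minimal sketch, using a placeholder Wowza HLS URL:

```java
import android.app.Activity;
import android.media.AudioManager;
import android.media.MediaPlayer;
import android.os.Bundle;
import android.view.SurfaceHolder;
import android.view.SurfaceView;

public class LiveStreamActivity extends Activity implements SurfaceHolder.Callback {

    // Placeholder Wowza HLS playback URL (application "live", stream "myStream").
    private static final String STREAM_URL =
            "http://wowza.example.com:1935/live/myStream/playlist.m3u8";

    private MediaPlayer player;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        SurfaceView surfaceView = new SurfaceView(this);
        setContentView(surfaceView);
        surfaceView.getHolder().addCallback(this);
    }

    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        try {
            player = new MediaPlayer();
            player.setDisplay(holder);                        // render video onto the surface
            player.setAudioStreamType(AudioManager.STREAM_MUSIC);
            player.setDataSource(STREAM_URL);                 // HLS playlist served by Wowza
            player.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
                @Override
                public void onPrepared(MediaPlayer mp) {
                    mp.start();                               // begin playback once prepared
                }
            });
            player.prepareAsync();                            // don't block the UI thread
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
    }

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
        if (player != null) {
            player.release();
            player = null;
        }
    }
}
```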
I want to embed videos on our website that should work with these restrictions:
The video needs to start as soon as the user clicks play. As far as I can tell, this means the video must either be streamed or use progressive download.
The video must not be downloadable by sending the link to other people.
The video must be protected against being viewed by anyone who is not allowed to do so.
The video must work on all devices, including those that do not support Flash, such as iOS and Android.
As a backend I use Amazon CloudFront. So far we have used RTMP, but that obviously does not work on iOS or Android devices.
What we’re planning to do is this:
For Flash platforms we use RTMP with Amazon signed URLs to prevent anyone else from viewing the content.
For iOS we want to use HLS with a generated m3u8 file that contains signed URLs to the TS files.
For Android devices I'm not yet sure what to use.
My questions are these:
Is this a viable setup, or is there a superior setup that ticks all the boxes?
What should we use in the Android case?
I would suggest using pre-signed HLS/m3u8 for both iOS and Flash (https://github.com/mangui/HLSprovider). For Android you can use normal HTML5 video streams with quality selection (signed URLs, of course).
For Android you have to use the RTSP protocol.
Wowza Media Server is the perfect solution for you.
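For what it's worth, playing such a stream on Android is the same VideoView approach shown earlier on this page, just with an rtsp:// URI; the URL below is a placeholder following Wowza's rtsp://server:1935/application/streamName pattern:

```java
import android.net.Uri;
import android.widget.VideoView;

public final class RtspPlayback {

    // Placeholder following Wowza's rtsp://server:1935/application/streamName pattern.
    private static final String RTSP_URL = "rtsp://wowza.example.com:1935/live/myStream";

    /** Points an existing VideoView (e.g. one from an activity layout) at the RTSP stream. */
    public static void play(VideoView videoView) {
        videoView.setVideoURI(Uri.parse(RTSP_URL));
        videoView.start();
    }
}
```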
In my application I need to provide the user with a preview of a progressive download (video file).
To achieve this, I'm using the VideoView component to show the content of the video (.mp4, .3gpp) while it is being downloaded.
The problem is that I can't access the remote media via the http:// or rtsp:// protocols, so I'm forced to use VideoView.setVideoPath to play a local copy of the video while it downloads.
Unfortunately, it seems that on Android devices that can't use the Stagefright framework (so OpenCore, and some Tegra 2-based devices in my experience), VideoView can't handle progressive download correctly: it can play only the portion of the video recognized during the component's initialization.
So to be clear: if the user presses "play" when only 5% of the video has been downloaded, VideoView will show only that 5% of the video, no matter how much more content has been downloaded in the meantime.
In my experience this issue doesn't affect devices using the Stagefright framework (e.g. Nexus One on 2.2, Nexus One on 2.3.4).
Can anyone point me to a possible solution?
Thanks in advance
If you are trying to play H.264 in an MP4 container, you need to move the moov atom to the front of the file. Among other things, it tells the player the length of the movie.
Try qtfaststart.
http://ffmpeg.zeranoe.com/builds/
VideoView is a ready-to-use class in the Android application framework for the video playback use case. It internally uses the MediaPlayer class to achieve playback. MediaPlayer selects among the media frameworks available internally based on criteria such as file format, origin of the content, etc.
So the limitation comes from the underlying framework and not from VideoView. If you still suspect VideoView, write your own video player activity using MediaPlayer; you will see the same result.
Also, not all versions of Android (read: Stagefright) support progressive download.
Maybe you can use the DownloadManager class to download the content from the HTTP server and provide the path to MediaPlayer / VideoView for a quick preview.
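A minimal sketch of that idea (the URL and file name are placeholders; DownloadManager is available from API level 9):

```java
import android.app.DownloadManager;
import android.content.Context;
import android.net.Uri;
import android.os.Environment;

public class PreviewDownloader {

    // Placeholder remote URL for the progressive-download source.
    private static final String VIDEO_URL = "http://example.com/preview.mp4";

    /** Enqueues the download and returns its id so progress can be queried later. */
    public static long enqueue(Context context) {
        DownloadManager dm =
                (DownloadManager) context.getSystemService(Context.DOWNLOAD_SERVICE);

        DownloadManager.Request request = new DownloadManager.Request(Uri.parse(VIDEO_URL))
                .setTitle("Video preview")
                .setDestinationInExternalFilesDir(
                        context, Environment.DIRECTORY_MOVIES, "preview.mp4");

        return dm.enqueue(request);
    }
}
```

Once enough of the file is on disk, the local path can be handed to VideoView.setVideoPath() / MediaPlayer exactly as before; DownloadManager just takes care of the HTTP transfer and resuming.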
Shash316
Can someone please explain the steps I need to take in order to add a new codec to Android?
Also, I would like the codec to be installed as part of an application installation (or first launch) and NOT as part of a full Android OS build.
The reason I want to do this is that I have an application that needs to show a video of an unsupported codec (HLS or TS), but I wouldn't want to build a full-blown video player - just integrate with the existing, built-in player.
Thanks,
Alik.
Can someone please explain the steps I need to take in order to add a new codec to Android?
Build your own firmware, or build your own media player (like VLC for Android).
Also, I would like the codec to be installed as part of an application installation (or first launch) and NOT as part of a full Android OS build.
That is not possible, unless you build your own media player.
The reason I want to do this is that I have an application that needs to show a video of an unsupported codec (HLS or TS), but I wouldn't want to build a full-blown video player - just integrate with the existing, built-in player.
VLC for Android is due out (at least for some phones) shortly, so it may be able to play your format.
I think it may be possible to add a custom codec (though I have not tried) by referring to the Android developer page on adding a custom codec to Android.
You can try adding your codec through the OpenMAX IL layer and then calling up the Android media player to play it (I believe VLC has done it this way, but it uses its own player). AwesomePlayer, the default Android player, just fetches the list of available codecs through the OpenMAX API, and if a matching codec exists, it plays the content. So it is worth trying to add your codec during your app's initialization and then calling up the media player.
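As an illustration of that discovery step, an app can list the codecs the platform's media framework currently exposes via MediaCodecList (API level 16+); note that this only enumerates existing codecs, it cannot register new ones:

```java
import android.media.MediaCodecInfo;
import android.media.MediaCodecList;
import android.util.Log;

public class CodecLister {

    private static final String TAG = "CodecLister";

    /** Logs every decoder the platform's media framework currently exposes. */
    public static void logAvailableDecoders() {
        int count = MediaCodecList.getCodecCount();
        for (int i = 0; i < count; i++) {
            MediaCodecInfo info = MediaCodecList.getCodecInfoAt(i);
            if (info.isEncoder()) {
                continue; // only interested in decoders for playback
            }
            for (String mimeType : info.getSupportedTypes()) {
                Log.d(TAG, info.getName() + " decodes " + mimeType);
            }
        }
    }
}
```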