I want to use a video player (e.g. ExoPlayer) in an Android app that supports switching resolutions, like we see on YouTube.
My API has video files for 480p, 720p and 1080p.
I want to offer those options in the player so the user can switch from the player itself, and it will then play the respective file from its URL.
I have seen solutions like track selectors, but do those work for online files? I have links like:
www.example.com/videos/480/demo.mp4
www.example.com/videos/720/demo.mp4
www.example.com/videos/1080/demo.mp4
Please suggest whether there are other solutions, such as an API change or a different protocol.
Why don't you convert your MP4 files to an HLS or MPEG-DASH stream with FFmpeg? Then the video is delivered chunk by chunk and the player selects the best resolution based on its own algorithm.
Have a look at this project; it allows the user to select the resolution manually, via track selection, from an HLS stream encoded with FFmpeg.
https://github.com/namespace7/HLS_Player
To generate an HLS stream from a video, go through this link:
https://superuser.com/a/1302736/1108219
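If you go the HLS route, the ExoPlayer side is fairly small. A minimal sketch, assuming ExoPlayer 2.x with the exoplayer-hls module and a hypothetical master playlist URL that lists your 480p/720p/1080p variants (adapt package names if you are on Media3):

import android.content.Context;
import com.google.android.exoplayer2.ExoPlayer;
import com.google.android.exoplayer2.MediaItem;
import com.google.android.exoplayer2.trackselection.DefaultTrackSelector;

public class HlsPlayerSketch {
    public static ExoPlayer create(Context context) {
        // The track selector adapts automatically and also lets you constrain
        // the resolution when the user picks one from your quality menu.
        DefaultTrackSelector trackSelector = new DefaultTrackSelector(context);
        ExoPlayer player = new ExoPlayer.Builder(context)
                .setTrackSelector(trackSelector)
                .build();

        // Hypothetical master playlist referencing the three variant streams.
        player.setMediaItem(MediaItem.fromUri("https://www.example.com/videos/master.m3u8"));
        player.prepare();
        player.play();

        // Example: the user taps "720p", so cap the video size at 1280x720.
        trackSelector.setParameters(
                trackSelector.buildUponParameters().setMaxVideoSize(1280, 720));
        return player;
    }
}

To offer an explicit 480p/720p/1080p menu, list the video tracks reported by the track selector and apply a constraint like the one above when the user picks an entry.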
I want to make an Android app that is capable of playing at least 6 live RTSP video streams, with pause and play functionality. Is there any way to achieve this efficiently? Thanks in advance.
Android Video Streaming Layout
I have never used it, but there is a library for streaming over RTMP and RTSP. Here is the GitHub repo: https://github.com/pedroSG94/rtmp-rtsp-stream-client-java
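Note that the library above is mainly for publishing streams. For playing several RTSP feeds you could also try ExoPlayer's RTSP module; here is a minimal sketch of my own (not taken from that library), assuming one player per stream, each bound to its own PlayerView in a grid, and that the device's hardware decoders can cope with six simultaneous streams:

import android.content.Context;
import com.google.android.exoplayer2.ExoPlayer;
import com.google.android.exoplayer2.MediaItem;
import com.google.android.exoplayer2.source.rtsp.RtspMediaSource;
import java.util.ArrayList;
import java.util.List;

public class RtspGridSketch {
    // Creates one ExoPlayer per RTSP URL; bind each one to its own PlayerView.
    public static List<ExoPlayer> createPlayers(Context context, List<String> rtspUrls) {
        List<ExoPlayer> players = new ArrayList<>();
        for (String url : rtspUrls) {
            ExoPlayer player = new ExoPlayer.Builder(context).build();
            player.setMediaSource(
                    new RtspMediaSource.Factory().createMediaSource(MediaItem.fromUri(url)));
            player.prepare();
            player.setPlayWhenReady(true); // pause/resume by toggling this flag
            players.add(player);
        }
        return players;
    }
}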
I've been fascinated by the 360 videos on YouTube recently. I'd like to develop a sample Android application that can play a 360 video and pan/swipe within it, as well as respond to the accelerometer/gyroscope.
A few questions:
What file format is 360 video? Where can I download sample 360 video?
Is it even possible to use Android library to play 360 video? If so, what player would I need to use to "play" 360 video?
How can I handle pan/swipe for a 360 video played by the native player?
Is it possible to play an existing YouTube 360 video using the Android native player? And at the same time be able to handle pan/swipe/gyro?
Please provide a code sample. Thanks!
360 videos are just videos recorded by a 360 camera. What separates them from a normal video is just the metadata.
Use panframe library for Android/iOS.
http://www.panframe.com
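Whichever renderer you end up with, the pan/swipe part usually comes down to converting touch deltas (or gyroscope rotation) into yaw/pitch angles for the virtual camera inside the sphere that the equirectangular video is mapped onto. A renderer-agnostic sketch of that conversion (the scale factor is just an assumption to tune):

import android.view.MotionEvent;

public class PanController {
    private static final float DEGREES_PER_PIXEL = 0.1f; // tune to taste
    private float yaw;   // rotation around the vertical axis, in degrees
    private float pitch; // rotation around the horizontal axis, in degrees
    private float lastX, lastY;

    public boolean onTouchEvent(MotionEvent event) {
        switch (event.getActionMasked()) {
            case MotionEvent.ACTION_DOWN:
                lastX = event.getX();
                lastY = event.getY();
                return true;
            case MotionEvent.ACTION_MOVE:
                yaw   += (event.getX() - lastX) * DEGREES_PER_PIXEL;
                pitch += (event.getY() - lastY) * DEGREES_PER_PIXEL;
                pitch = Math.max(-90f, Math.min(90f, pitch)); // don't flip over the poles
                lastX = event.getX();
                lastY = event.getY();
                // Feed yaw/pitch into the renderer's view matrix here.
                return true;
            default:
                return false;
        }
    }
}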
I need to make a video player for Android that can play in slow motion and at different playback speeds. I still cannot find a native API or any code to do this. Does Android 2.2 - 2.3 support video slow motion, and can we control the video playback speed?
Yes, from API 23 Android has the PlaybackParams class. It supports adjusting the playback speed of a video, as briefly described here.
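A minimal sketch with the stock MediaPlayer (API 23+ only); be aware that calling setPlaybackParams() with a non-zero speed while the player is only prepared or paused also starts playback:

import android.media.MediaPlayer;
import android.media.PlaybackParams;
import android.os.Build;

public final class SpeedControl {
    // Applies a playback speed (e.g. 0.5f for slow motion) to a prepared MediaPlayer.
    public static void setSpeed(MediaPlayer mediaPlayer, float speed) {
        if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.M) {
            mediaPlayer.setPlaybackParams(new PlaybackParams().setSpeed(speed));
        }
    }
}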
You can try exploring libVLC for this. The VLC player for Android is almost ready; check their website.
Check this thread that discusses how to do it: http://forum.videolan.org/viewtopic.php?f=14&t=89296
The libVLC developer documentation is here, with a normal-playback sample. You would need to modify that sample to play the video frame by frame or to slow down the frame rate.
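With the current libVLC Android bindings the rate can be changed directly instead of hand-rolling frame-by-frame playback. A rough sketch (class and method names as I recall them from the bindings, so verify against the documentation above; attaching the player to a video output view is omitted):

import android.content.Context;
import android.net.Uri;
import org.videolan.libvlc.LibVLC;
import org.videolan.libvlc.Media;
import org.videolan.libvlc.MediaPlayer;

public class SlowMotionVlcSketch {
    public static MediaPlayer play(Context context, Uri videoUri) {
        LibVLC libVlc = new LibVLC(context);
        MediaPlayer mediaPlayer = new MediaPlayer(libVlc);
        mediaPlayer.setMedia(new Media(libVlc, videoUri));
        mediaPlayer.play();
        mediaPlayer.setRate(0.5f); // half-speed playback
        return mediaPlayer;
    }
}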
I've seen plenty of info about how to stream video from the server to an Android device, but not much about the other way, à la Qik. Could someone point me in the right direction here, or give me some advice on how to approach this?
I have hosted an open-source project that turns an Android phone into an IP camera:
http://code.google.com/p/ipcamera-for-android
Raw video data is fetched from a LocalSocket, and the MDAT/MOOV boxes of the MP4 are checked before streaming. The live video is packed into FLV format and can be played by a Flash video player through the built-in web server :)
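The LocalSocket part is the trick that lets you read MediaRecorder's output while it is still recording; because the MP4 is written to a pipe, its MOOV box is never finalized, which is why the raw stream has to be parsed and repackaged (the MDAT/MOOV check above). A stripped-down sketch of just that piping, with camera setup and the FLV repackaging omitted:

import android.media.MediaRecorder;
import android.net.LocalServerSocket;
import android.net.LocalSocket;
import android.net.LocalSocketAddress;
import java.io.InputStream;

public class RecorderToSocketSketch {
    public void start() throws Exception {
        // Connect a local socket pair so the recorder writes into memory, not a file.
        LocalServerSocket server = new LocalServerSocket("camera-stream");
        LocalSocket sender = new LocalSocket();
        sender.connect(new LocalSocketAddress("camera-stream"));
        LocalSocket receiver = server.accept();

        MediaRecorder recorder = new MediaRecorder();
        recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
        recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
        recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
        recorder.setOutputFile(sender.getFileDescriptor()); // write into the socket
        recorder.prepare();
        recorder.start();

        // Everything the recorder writes is readable here and can be repackaged
        // (FLV, RTP, ...) and pushed to the network on a background thread.
        InputStream encoded = receiver.getInputStream();
    }
}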
Took me some time, but I finally managed to make an app that does just that. Check out the Google Code page if you're interested: http://code.google.com/p/spydroid-ipcamera/
I added loads of comments in my code (mainly, look at CameraStreamer.java), so it should be pretty self-explanatory.
The hard part was actually understanding RFC 3984 and implementing a proper algorithm for the packetization process. (This algorithm turns the MPEG-4/H.264 stream produced by MediaRecorder into a proper RTP stream, according to the RFC.)
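For reference, the core of that RFC 3984 packetization is FU-A fragmentation: any NAL unit too large for one RTP packet is split into fragments that carry a two-byte FU indicator/header in front of the payload. A bare sketch of just the fragmentation step (RTP headers, sequence numbers and timestamps omitted):

import java.util.ArrayList;
import java.util.List;

public class FuAFragmenter {
    // Splits one H.264 NAL unit (without start code) into RFC 3984 FU-A payloads.
    public static List<byte[]> fragment(byte[] nal, int maxPayloadSize) {
        List<byte[]> packets = new ArrayList<>();
        if (nal.length <= maxPayloadSize) {
            packets.add(nal); // small NAL units go out as single NAL unit packets
            return packets;
        }
        byte nalHeader = nal[0];
        byte fuIndicator = (byte) ((nalHeader & 0xE0) | 28); // keep F+NRI bits, type 28 = FU-A
        int offset = 1;                                      // skip the original NAL header
        while (offset < nal.length) {
            int chunk = Math.min(maxPayloadSize - 2, nal.length - offset);
            byte fuHeader = (byte) (nalHeader & 0x1F);          // original NAL type
            if (offset == 1)                  fuHeader |= 0x80; // S bit: first fragment
            if (offset + chunk == nal.length) fuHeader |= 0x40; // E bit: last fragment
            byte[] packet = new byte[chunk + 2];
            packet[0] = fuIndicator;
            packet[1] = fuHeader;
            System.arraycopy(nal, offset, packet, 2, chunk);
            packets.add(packet);
            offset += chunk;
        }
        return packets;
    }
}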
Bye
I'm looking into this as well, and while I don't have a good solution for you I did manage to dig up SIPDroid's video code:
http://code.google.com/p/sipdroid/source/browse/trunk/src/org/sipdroid/sipua/ui/VideoCamera.java
I've built an open-source SDK called Kickflip to make streaming video from Android a painless experience.
The SDK demonstrates using Android 4.3's MediaCodec API to feed the device's hardware encoder output directly to FFmpeg for RTMP (with librtmp) or HLS streaming of H.264/AAC. It also demonstrates real-time OpenGL effects (titling, chroma key, fades) and background recording.
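For anyone curious what the MediaCodec half of that looks like, here is a condensed sketch of configuring the hardware H.264 encoder with an input Surface and draining its output; the FFmpeg/RTMP muxing that Kickflip adds on top is omitted, and getOutputBuffer() needs API 21+ (use getOutputBuffers() on 4.3):

import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.view.Surface;
import java.nio.ByteBuffer;

public class EncoderSketch {
    public static void encode() throws Exception {
        MediaFormat format = MediaFormat.createVideoFormat("video/avc", 1280, 720);
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
        format.setInteger(MediaFormat.KEY_BIT_RATE, 2_000_000);
        format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 2);

        MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
        encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        Surface inputSurface = encoder.createInputSurface(); // render camera/GL frames onto this
        encoder.start();

        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        while (true) { // loop until signalEndOfInputStream() has been called and EOS arrives
            int index = encoder.dequeueOutputBuffer(info, 10_000);
            if (index >= 0) {
                ByteBuffer encoded = encoder.getOutputBuffer(index);
                // hand 'encoded' (an H.264 access unit) to the muxer / RTMP sender here
                encoder.releaseOutputBuffer(index, false);
                if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) break;
            }
        }
        encoder.stop();
        encoder.release();
    }
}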
Thanks SO, and especially, fadden.
Here is a complete article about streaming Android camera video to a web page:
Android Streaming Live Camera Video to Web Page
libstreaming is used in the Android app.
On the server side, Wowza Media Engine is used to decode the video stream.
Finally, JW Player is used to play the video on the web page.
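The libstreaming side of that setup is roughly the library's README example: build a Session and publish it to Wowza over RTSP. A sketch from memory (verify class and method names against the libstreaming repo before relying on them):

import android.content.Context;
import net.majorkernelpanic.streaming.Session;
import net.majorkernelpanic.streaming.SessionBuilder;
import net.majorkernelpanic.streaming.audio.AudioQuality;
import net.majorkernelpanic.streaming.gl.SurfaceView;
import net.majorkernelpanic.streaming.rtsp.RtspClient;
import net.majorkernelpanic.streaming.video.VideoQuality;

public class LibstreamingToWowzaSketch {
    public static RtspClient startPublishing(Context context, SurfaceView surfaceView) {
        // Build an H.264/AAC session backed by the camera preview.
        Session session = SessionBuilder.getInstance()
                .setContext(context)
                .setSurfaceView(surfaceView)
                .setAudioEncoder(SessionBuilder.AUDIO_AAC)
                .setAudioQuality(new AudioQuality(16000, 32000))
                .setVideoEncoder(SessionBuilder.VIDEO_H264)
                .setVideoQuality(new VideoQuality(640, 480, 20, 500000))
                .build();

        // Publish the session to Wowza over RTSP; JW Player then plays the
        // resulting stream from the web page.
        RtspClient client = new RtspClient();
        client.setSession(session);
        client.setServerAddress("yourserveripaddress", 1935);
        client.setStreamPath("/live/demo");
        client.startStream();
        return client;
    }
}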
I am able to send live camera video from the mobile device to my server using this link.
There is a sample application at that link; you just need to set your service URL in RecordActivity.class.
Example:
ffmpeg_link="rtmp://yourserveripaddress:1935/live/venkat";
We can send H.263 and H.264 video using that approach.
Check out the Yasea library.
Yasea is an Android streaming client. It encodes YUV and PCM data from the camera and microphone to H.264/AAC, encapsulates it in FLV and transmits it over RTMP.
Features:
Minimum Android API 16.
H.264/AAC hard encoding.
H.264 soft encoding.
RTMP streaming with state callback handler.
Portrait and landscape dynamic orientation.
Front and back cameras hot switch.
Recording to MP4 while streaming.
Mux (my company) has an open-source Android app that streams RTMP to a server, including setting up the camera and user interactions. It's built to stream to Mux's live streaming API but can easily stream to any RTMP entry point.
Depending on your budget, you can use a Raspberry Pi camera that sends images to a server. Here are two tutorials where you can find many more details:
This tutorial shows you how to use a Raspberry Pi camera and display the images on an Android device.
This second tutorial is part of a series about real-time video streaming between the camera and an Android device.