I have a NetStream that plays MP4 files from Amazon S3.
Currently, when I call _ns.seek(time), the seek works if the file is already buffered to that point; otherwise it waits until the video has buffered linearly up to that point before it starts playing.
How can I make the NetStream start playing immediately from the seek point?
I want to play an audio file backward using ExoPlayer (or another media player) on an Android device.
Playing video backward is said to be very difficult:
https://github.com/google/ExoPlayer/issues/2191
But if I can reverse the PCM sample data stream going into the player, I can play the audio backward.
I'm now trying to reverse part of a music file using android.media.MediaExtractor and android.media.MediaCodec, but I can't create a reversed audio stream because I don't know how to build the header for the sample data in a short array.
The best option would be to play the audio file backward in ExoPlayer; the second option is to create a reversed audio stream and play it; and the last resort would be any other method that plays audio backward.
Thanks for your attention. I’m looking forward to your reply.
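For what it's worth, PCM decoded by MediaCodec is typically headerless interleaved 16-bit samples, so reversing it mostly means reversing whole frames (one sample per channel) while keeping the samples inside each frame in order. Below is a minimal sketch of that step, assuming the decoded data is already in a short[]; the class and method names are illustrative, and channelCount would come from the track's MediaFormat (KEY_CHANNEL_COUNT).

// Sketch: reverse decoded 16-bit PCM in place by swapping whole frames.
// Assumes interleaved samples (e.g. L R L R ... for stereo) as produced by MediaCodec.
public final class PcmReverser {

    public static void reverseFrames(short[] pcm, int channelCount) {
        int frameCount = pcm.length / channelCount;
        for (int i = 0, j = frameCount - 1; i < j; i++, j--) {
            for (int c = 0; c < channelCount; c++) {
                short tmp = pcm[i * channelCount + c];
                pcm[i * channelCount + c] = pcm[j * channelCount + c];
                pcm[j * channelCount + c] = tmp;
            }
        }
    }

    public static void main(String[] args) {
        short[] stereo = {1, 2, 3, 4, 5, 6}; // three stereo frames
        reverseFrames(stereo, 2);
        System.out.println(java.util.Arrays.toString(stereo)); // [5, 6, 3, 4, 1, 2]
    }
}

The reversed buffer could then be fed to an AudioTrack, or written back into a container if a playable file is needed.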
We are having problems with the player.
We only play MP4 files.
When the internet connection is slow, the audio starts before the video, and the video only starts playing 2-3 seconds later. There is no synchronization problem: once the video appears, audio and video are in sync. But there is latency before the video shows up.
What can we do about this?
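One thing that may help, if you can use (or switch to) ExoPlayer, is to require more media to be buffered before playback starts, so audio and video begin together once both have data. A sketch assuming ExoPlayer's 2.12+ API and a placeholder URL; the buffer values are illustrative, not tuned recommendations.

import android.content.Context;
import com.google.android.exoplayer2.DefaultLoadControl;
import com.google.android.exoplayer2.LoadControl;
import com.google.android.exoplayer2.MediaItem;
import com.google.android.exoplayer2.SimpleExoPlayer;

public final class BufferedPlayerFactory {

    // Builds a player that waits for a larger buffer before starting playback.
    public static SimpleExoPlayer create(Context context, String mp4Url) {
        LoadControl loadControl = new DefaultLoadControl.Builder()
                // minBufferMs, maxBufferMs, bufferForPlaybackMs, bufferForPlaybackAfterRebufferMs
                .setBufferDurationsMs(15_000, 60_000, 5_000, 5_000)
                .build();
        SimpleExoPlayer player = new SimpleExoPlayer.Builder(context)
                .setLoadControl(loadControl)
                .build();
        player.setMediaItem(MediaItem.fromUri(mp4Url));
        player.prepare();
        player.setPlayWhenReady(true);
        return player;
    }
}

With the stock MediaPlayer there is much less control over this; at best, listening for MEDIA_INFO_VIDEO_RENDERING_START tells you when the first video frame has actually been drawn.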
I have a muted video file and a separate audio file for the same video. How can I play the video together with the separate audio on Android? I have tried using VideoView but was unable to play the audio separately. Can anyone help with this?
Before you start everything running, first prepare both the video and the audio. Only when both of them are prepared should you push the start button, and they should then be synchronized (provided you started recording the audio at the same time as the video).
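A minimal sketch of that idea with two MediaPlayer instances, one for the muted video and one for the audio, starting both only when both report prepared; the file paths and the SurfaceHolder are placeholders for whatever your layout provides.

import android.media.MediaPlayer;
import android.view.SurfaceHolder;
import java.io.IOException;
import java.util.concurrent.atomic.AtomicInteger;

public final class DualPlayer {

    private final MediaPlayer videoPlayer = new MediaPlayer();
    private final MediaPlayer audioPlayer = new MediaPlayer();
    private final AtomicInteger preparedCount = new AtomicInteger();

    // videoPath / audioPath / holder are placeholders for your own sources and surface.
    public void play(String videoPath, String audioPath, SurfaceHolder holder) throws IOException {
        MediaPlayer.OnPreparedListener startWhenBothReady = mp -> {
            if (preparedCount.incrementAndGet() == 2) {
                // Both players are prepared: start them back to back so they begin together.
                videoPlayer.start();
                audioPlayer.start();
            }
        };

        videoPlayer.setDisplay(holder);
        videoPlayer.setDataSource(videoPath);
        videoPlayer.setOnPreparedListener(startWhenBothReady);
        videoPlayer.prepareAsync();

        audioPlayer.setDataSource(audioPath);
        audioPlayer.setOnPreparedListener(startWhenBothReady);
        audioPlayer.prepareAsync();
    }
}

For long clips the two players can still drift apart; periodically comparing getCurrentPosition() on both and seeking the audio player when the gap grows is one way to compensate.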
I am developing an Android video player that can play an RTSP stream. I use ffmpeg in the JNI part to fetch and decode the RTSP stream. For now, the player can play and pause the video stream. The next step is to create a buffer for the player, so that when the user pauses the video, the player can keep loading the next several seconds of the stream.
Is there any good documentation on how to create a buffer for video streaming properly?
My plan is to create an array of packets. When the array is full, the player calls
av_read_pause();
to stop buffering. When the array has space, the player calls
av_read_play();
to continue buffering. There is a read_thread that takes packets from the buffer and decodes them. The read_thread stops (resumes) when the user pauses (resumes) the video.
Can this plan work?
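The threading part of that plan can be sketched independently of ffmpeg. Below is a minimal Java sketch of the bounded buffer; the Packet class is a hypothetical stand-in for AVPacket, and the comments mark where av_read_frame()/av_read_pause()/av_read_play() would be driven through JNI.

import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public final class PacketBuffer {

    // Hypothetical stand-in for an AVPacket delivered from the native side.
    static final class Packet {
        final int serial;
        Packet(int serial) { this.serial = serial; }
    }

    private final BlockingQueue<Packet> queue = new ArrayBlockingQueue<>(64); // bounded buffer
    private volatile boolean paused = false;
    private volatile boolean running = true;

    // Buffering thread: keeps reading packets while the buffer has space.
    // put() blocks when the queue is full, which is where av_read_pause() would be called;
    // once space frees up again, av_read_play() resumes reading.
    private final Thread bufferThread = new Thread(() -> {
        int serial = 0;
        while (running) {
            try {
                queue.put(new Packet(serial++)); // in the real player: av_read_frame() via JNI
            } catch (InterruptedException e) {
                return;
            }
        }
    }, "buffer_thread");

    // read_thread from the question: takes packets from the buffer and decodes them,
    // stopping while the user has paused playback (the buffer keeps filling meanwhile).
    private final Thread readThread = new Thread(() -> {
        while (running) {
            if (paused) {
                try { Thread.sleep(10); } catch (InterruptedException e) { return; }
                continue;
            }
            try {
                Packet packet = queue.take(); // blocks when the buffer is empty
                decode(packet);
            } catch (InterruptedException e) {
                return;
            }
        }
    }, "read_thread");

    private void decode(Packet packet) { /* hand the packet to the decoder via JNI */ }

    public void start()  { bufferThread.start(); readThread.start(); }
    public void pause()  { paused = true;  }
    public void resume() { paused = false; }
    public void stop()   { running = false; bufferThread.interrupt(); readThread.interrupt(); }
}

A bounded producer/consumer queue like this is essentially what the plan describes; in practice you may want to bound it by total bytes rather than packet count, since packet sizes vary widely, and to flush the queue on seek.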
I have made an Android streaming application that plays media from online URLs. For playback I am using the standard MediaPlayer class.
As per the Android documentation, it supports the RTSP protocol for audio and video playback:
http://developer.android.com/guide/appendix/media-formats.html
But when I try to play media from an RTSP URL, it connects, but I am not able to hear any media.
The following is one of those RTSP URLs:
rtsp://sfera.live24.gr/sfera4132
Does anybody have an idea how to play RTSP URLs through the Android MediaPlayer?
Thanks
The link you provided has three audio tracks; the first and last appear to be silent and don't contain any valid audio.
The middle track has audio (as per VLC). I don't know how Android deals with multiple audio tracks, but I imagine you may get better results with links that contain at most one audio track and one video track.
I expect that for an RTSP stream with multiple audio tracks, Android will only play the first one, since there is no user interface for selecting a specific audio stream, which is why you aren't hearing any audio.
If this is a stream from your own server, adjust the SDP file so that the valid audio track comes first and you should hear the audio. If it is not your server, I don't know what your options are.
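If the server is out of your control, one client-side thing that may be worth trying (API 16+) is selecting an audio track explicitly once the player is prepared. Whether this works for this particular RTSP stream depends on the device's RTSP stack, so treat the following as a sketch rather than a confirmed fix.

import android.media.MediaPlayer;

public final class AudioTrackSelector {

    // Selects the n-th audio track (counting audio tracks only); call from an OnPreparedListener.
    public static void selectAudioTrack(MediaPlayer player, int wantedAudioTrack) {
        MediaPlayer.TrackInfo[] tracks = player.getTrackInfo();
        int audioIndex = 0;
        for (int i = 0; i < tracks.length; i++) {
            if (tracks[i].getTrackType() == MediaPlayer.TrackInfo.MEDIA_TRACK_TYPE_AUDIO) {
                if (audioIndex == wantedAudioTrack) {
                    player.selectTrack(i); // selectTrack() takes the index into getTrackInfo()
                    return;
                }
                audioIndex++;
            }
        }
    }
}

Usage would be something like player.setOnPreparedListener(mp -> { AudioTrackSelector.selectAudioTrack(mp, 1); mp.start(); }); to try the second audio track.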