I am developing an Android video player that plays RTSP streams. I use FFmpeg in the JNI part to fetch and decode the RTSP stream. For now, the player can play and then pause the video stream. The next step is to add a buffer, so that while the user has the video paused, the player keeps loading the next several seconds of the stream.
Is there any good documentation on how to create a buffer for video streaming in a proper way?
My plan is to create an array of packets. When the array is full, the player calls
av_read_pause();
to stop buffering. When the array has space, the player calls
av_read_play();
to continue buffering. There is a read_thread that takes packets from the buffer and then decodes them. The read_thread stops (resumes) when the user pauses (resumes) the video.
Can this plan work?
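In outline, yes: this is roughly the bounded producer/consumer packet queue that ffplay itself uses. Below is a minimal sketch in Java for readability (all names are made up; in your player both sides would live in the JNI/C layer, with the watermark checks calling av_read_pause()/av_read_play() natively):

    import java.util.concurrent.ArrayBlockingQueue;
    import java.util.concurrent.BlockingQueue;

    // Stand-in for an AVPacket; in the real player this lives in C.
    class Packet {
        byte[] data;
        long pts;
    }

    class PacketBuffer {
        private final BlockingQueue<Packet> queue;
        private final int lowWatermark;

        PacketBuffer(int capacity) {
            queue = new ArrayBlockingQueue<>(capacity);
            lowWatermark = capacity / 2;
        }

        // Demux side: when the buffer is full, pause the server
        // (av_read_pause() via JNI), then block until space frees up.
        void put(Packet p) throws InterruptedException {
            if (queue.remainingCapacity() == 0) {
                // call av_read_pause() through JNI here
            }
            queue.put(p);
        }

        // Decode side: once the fill level falls back to the low
        // watermark, resume the server (av_read_play() via JNI).
        Packet take() throws InterruptedException {
            Packet p = queue.take();
            if (queue.size() == lowWatermark) {
                // call av_read_play() through JNI here
            }
            return p;
        }
    }

Keeping the low watermark well below the capacity avoids toggling pause/play on every packet.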
Related
I want to stream video from my Android application to YouTube, but I want to stream a stream of my own, not just hand the Camera over to the YouTube API and let it do the job. I want to create the stream myself so that I can modify it first and add some effects to the video, then convert it to a stream and stream it.
I am OK with the first part: the stream is ready, modified, filtered, and the effects are added. How do I stream that stream to YouTube from Android?
I have a NetStream that plays mp4 files from Amazon S3.
Currently, when I call _ns.seek(time), the seek works if the file is already buffered to that point; otherwise it waits until the video has buffered linearly up to that point before it starts playing.
How can I make the NetStream start playing immediately from the seek point?
I am trying to record an online stream (a radio station) while the station plays in MediaPlayer.
Android does not provide any way to record audio from the MUSIC source (as we can for MIC, VOICE_UPLINK, etc.), so I have created a local web server in my app and give its local URL to MediaPlayer to play the station's stream.
In my local web service I do the following:
Open a URL connection to the original stream URL.
Create a read buffer sized from the original stream's bitrate (bitrate / 8). E.g., bitrate = 128 kbps, so the buffer size will be ((128 / 8) * 1024) bytes, i.e. 16 KB per second.
Get the input stream, read bytes from it, and write them to the output stream of the web service.
MediaPlayer plays the stream without any problem. But when I record the audio, the starting point of the recording is wrong (e.g., if I start recording at 5:15, the recording actually starts from 5:20, so 5 seconds of audio are missed), and this difference increases as playing time increases.
In short: I am facing the problem of a fast download speed versus the slower playback speed of MediaPlayer.
Is my calculation of the read buffer size wrong? How can I match the download and playback speeds of MediaPlayer?
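If the goal is to keep download and playback in step, one option is to pace the proxy's copy loop to the stream bitrate, so the local server feeds MediaPlayer at roughly real-time speed instead of as fast as the connection allows. A rough sketch (plain Java, names made up):

    import java.io.InputStream;
    import java.io.OutputStream;

    class PacedProxy {
        // Copies the source stream to the client at ~real-time speed.
        // bitrateKbps is assumed known from the station's metadata.
        static void pacedCopy(InputStream in, OutputStream out, int bitrateKbps)
                throws Exception {
            int bytesPerSecond = (bitrateKbps / 8) * 1024; // 128 kbps -> 16 KB
            byte[] buffer = new byte[bytesPerSecond];
            while (true) {
                long start = System.currentTimeMillis();
                int total = 0;
                while (total < bytesPerSecond) { // one second's worth
                    int n = in.read(buffer, total, bytesPerSecond - total);
                    if (n == -1) { out.flush(); return; } // source ended
                    total += n;
                }
                out.write(buffer, 0, total);
                out.flush();
                long elapsed = System.currentTimeMillis() - start;
                if (elapsed < 1000) Thread.sleep(1000 - elapsed); // pace it
            }
        }
    }

The buffer arithmetic itself matches the description above (128 kbps / 8 * 1024 = 16 KB per second); the drift more likely comes from writing to MediaPlayer as fast as the source downloads, so byte positions run ahead of the playback clock. Pacing the copy as above, or computing recording offsets from bytes written (bytes / bytesPerSecond) rather than wall-clock time, should keep the two in step.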
I am developing an application in which I have to perform three functions on an audio file: play, record, and pause.
Has anyone implemented it before?
Take a look at MediaPlayer or AudioTrack for playback. The difference is that MediaPlayer can play several audio formats directly from a file (or in some cases even from a remote URL), while AudioTrack plays only from a raw LPCM buffer.
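A minimal play/pause sketch with MediaPlayer (the file path and method names are hypothetical):

    import android.media.MediaPlayer;
    import java.io.IOException;

    MediaPlayer player;

    void startPlayback() throws IOException {
        player = new MediaPlayer();
        player.setDataSource("/sdcard/sample.mp3"); // hypothetical path
        player.prepare();  // synchronous prepare is fine for a local file
        player.start();
    }

    void togglePause() {
        if (player.isPlaying()) {
            player.pause();   // position is kept
        } else {
            player.start();   // resumes from the paused position
        }
    }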
For recording, take a look at MediaRecorder or AudioRecord. The difference is that MediaRecorder records audio and video to .3gp, while AudioRecord gives you only audio as raw LPCM buffers. The AudioRecord data can be used to create .wav files (though some extra code is required for this).
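And a minimal AudioRecord capture loop yielding raw LPCM (16-bit mono at 44.1 kHz; it requires the RECORD_AUDIO permission, and turning the buffers into a .wav file is the extra code mentioned above):

    import android.media.AudioFormat;
    import android.media.AudioRecord;
    import android.media.MediaRecorder;

    volatile boolean recording = true; // cleared by your stop button

    void captureLpcm() {
        int sampleRate = 44100;
        int minBuf = AudioRecord.getMinBufferSize(sampleRate,
                AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
        AudioRecord recorder = new AudioRecord(MediaRecorder.AudioSource.MIC,
                sampleRate, AudioFormat.CHANNEL_IN_MONO,
                AudioFormat.ENCODING_PCM_16BIT, minBuf * 2);
        byte[] pcm = new byte[minBuf];
        recorder.startRecording();
        while (recording) {
            int n = recorder.read(pcm, 0, pcm.length);
            // hand n bytes of raw LPCM to your WAV writer here
        }
        recorder.stop();
        recorder.release();
    }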
I have made an Android streaming application that plays media from online URLs. For playing the media, I am using the standard MediaPlayer class.
As per the Android documentation, it supports the RTSP protocol for audio & video playback:
http://developer.android.com/guide/appendix/media-formats.html
But when I try to play media from an RTSP URL, it connects but I am not able to hear any audio.
Following is one of those RTSP URLs:
rtsp://sfera.live24.gr/sfera4132
Does anybody have an idea of how to play RTSP URLs through the Android MediaPlayer?
Thanks
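For reference, a typical asynchronous MediaPlayer setup for a network source like this looks roughly as follows (only the method name is made up):

    import android.media.MediaPlayer;
    import java.io.IOException;

    void playRtsp() throws IOException {
        MediaPlayer player = new MediaPlayer();
        player.setDataSource("rtsp://sfera.live24.gr/sfera4132");
        player.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
            @Override
            public void onPrepared(MediaPlayer mp) {
                mp.start(); // start once the RTSP session is negotiated
            }
        });
        player.prepareAsync(); // never block the UI thread on a network prepare
    }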
The link you provided has 3 audio tracks, with the first and last appearing to be silent; they don't contain any valid audio.
The middle track has audio (as per VLC). I don't know how Android deals with multiple audio tracks. I imagine you may get better results with links that contain at most 1 audio track and 1 video track.
I expect that for an RTSP stream with multiple audio tracks, Android will only play the first one, as there is no user interface to select a specific audio stream, which would explain why you aren't hearing any audio.
If this is a stream from your own server, to hear the audio you should adjust the SDP file so that the valid audio track comes first. If it is not from your server, I don't know what your options are.
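For example, in a hypothetical SDP like the fragment below (payload types, codecs, and control IDs are all made up), the fix would be to place the m=audio section of the valid track before the silent one, as shown here:

    v=0
    o=- 0 0 IN IP4 203.0.113.1
    s=Example stream
    t=0 0
    m=audio 0 RTP/AVP 96
    a=rtpmap:96 MP4A-LATM/44100/2
    a=control:trackID=2
    m=audio 0 RTP/AVP 97
    a=rtpmap:97 MP4A-LATM/44100/2
    a=control:trackID=1
    m=video 0 RTP/AVP 98
    a=rtpmap:98 H264/90000
    a=control:trackID=3

Here the valid audio track (trackID=2) has been moved ahead of the silent one, so a client that only plays the first audio track picks it up.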