I am trying to record an online stream (a radio station) while the station plays in MediaPlayer.
Android does not provide any way to record audio from the MUSIC source (as it does for MIC, VOICE_UPLINK, etc.), so I have created a local web server in my app and give that local URL to MediaPlayer to play the station's stream.
In my local web service I do the following:
Open a URL connection to the original stream URL.
Create a read buffer sized from the original stream's bitrate (bitrate / 8). E.g. for bitrate = 128 kbps the buffer size is (128 / 8) * 1024 bytes, i.e. 16 KB per second.
Get the input stream, read bytes from it, and write them to the output stream of the web service.
MediaPlayer plays the stream without any problem. But when I record the audio, the starting point of the recording is wrong (e.g. if I start recording at 5:15, the recording actually starts from 5:20, so 5 seconds of audio are missing), and this difference increases as playing time increases.
In short: I am facing a mismatch between the fast download speed and MediaPlayer's slower playback speed.
Is my calculation for the read buffer size wrong? How can I match the download speed to MediaPlayer's playback speed?
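One way to match the two speeds is to pace the proxy's copy loop to roughly real time instead of writing as fast as the remote server delivers. Below is a minimal sketch of such a throttled relay loop, assuming a fixed 128 kbps stream; the class and method names are illustrative, not taken from your code.

    import java.io.InputStream;
    import java.io.OutputStream;

    public class ThrottledRelay {

        // Assumed source bitrate: 128 kbps -> (128 / 8) * 1024 = 16384 bytes per second.
        private static final int BYTES_PER_SECOND = (128 / 8) * 1024;

        // Copies the remote stream to MediaPlayer's connection, pacing writes so that
        // roughly one second of audio data leaves the proxy per second of wall time.
        public static void relay(InputStream fromStation, OutputStream toMediaPlayer)
                throws Exception {
            byte[] chunk = new byte[BYTES_PER_SECOND];
            while (true) {
                long windowStart = System.currentTimeMillis();

                // Fill at most one second's worth of audio data.
                int filled = 0;
                while (filled < chunk.length) {
                    int read = fromStation.read(chunk, filled, chunk.length - filled);
                    if (read == -1) {
                        break; // the station closed the stream
                    }
                    filled += read;
                }
                if (filled == 0) {
                    return;
                }

                toMediaPlayer.write(chunk, 0, filled);
                toMediaPlayer.flush();

                // Sleep off the rest of the one-second window so the proxy does not
                // run ahead of playback (and of any recording taken from this stream).
                long elapsed = System.currentTimeMillis() - windowStart;
                if (elapsed < 1000) {
                    Thread.sleep(1000 - elapsed);
                }
            }
        }
    }

In practice you would leave a few seconds of margin (write slightly faster than real time) so MediaPlayer's own buffer never runs dry.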
I have an Android app in the Play Store which plays audio asynchronously from a CDN URL. When it starts playing an audio file, it shows buffering progress and seek progress.
It performs fine with short audio files, but for long audio files it stops buffering at a certain point. Where buffering stops varies from device to device: I observed that if a given audio file buffers up to 30 minutes on one device, on another it might be 20 minutes, and on any one device it always stops buffering at the same minute for that file. If I seek into the unbuffered area, the audio doesn't play and doesn't even start buffering again.
Here is an example:
File size: 37.02 MB, Duration: ~54 min, Buffered: 25 - 30 min
Device memory specification: RAM: 8 GB, Storage: 128 GB
I guess that when the audio is buffered and cached in local storage, the allocated storage or RAM gets exhausted. If this is the issue, is there any way to free the memory for the part that has already been played and re-initiate buffering for the rest?
It should be mentioned that I implemented the controller myself instead of using MediaController, and that I initialize and use the MediaPlayer object inside a Fragment rather than a Service.
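One thing that may help narrow down where buffering stops is to log MediaPlayer's own buffering callbacks. The listeners and info constants below are standard MediaPlayer API; the class name, log tags, and overall structure are just an illustrative sketch, not a fix for the problem.

    import android.media.MediaPlayer;
    import android.util.Log;

    import java.io.IOException;

    public class BufferingMonitor {

        // Prepares a MediaPlayer for the CDN URL and logs its buffering behaviour.
        public static MediaPlayer play(String cdnUrl) throws IOException {
            MediaPlayer player = new MediaPlayer();
            player.setDataSource(cdnUrl);

            // start() is called from onPrepared(), as described in the question.
            player.setOnPreparedListener(MediaPlayer::start);

            // Reports how far MediaPlayer has buffered, as a percentage of the file.
            player.setOnBufferingUpdateListener((mp, percent) ->
                    Log.d("AudioPlayer", "buffered " + percent + "%"));

            // Signals stalls: playback paused to refill the buffer, then resumed.
            player.setOnInfoListener((mp, what, extra) -> {
                if (what == MediaPlayer.MEDIA_INFO_BUFFERING_START) {
                    Log.d("AudioPlayer", "buffering stalled, waiting for data");
                } else if (what == MediaPlayer.MEDIA_INFO_BUFFERING_END) {
                    Log.d("AudioPlayer", "buffer refilled, playback resumed");
                }
                return true;
            });

            player.prepareAsync();
            return player;
        }
    }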
I am developing an Android video player that can play an RTSP stream. I use ffmpeg in the JNI part to fetch and decode the RTSP stream. For now, the player can play and then pause the video stream. The next step is to add a buffer to the player, so that when the user pauses the video, the player can still load the next several seconds of the stream.
Is there any good documentation on how to build a buffer for video streaming properly?
My plan is to create an array of packets. When the array is full, the player calls
av_read_pause();
to stop buffering. When the array has free space, the player calls
av_read_play();
to continue buffering. There is a read_thread that gets packets from the buffer and decodes them. The read_thread stops (resumes) when the user pauses (resumes) the video.
Can this plan work?
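Conceptually the plan is a bounded producer/consumer buffer. A rough sketch of that logic follows, written in Java only to illustrate the idea; in the actual player this would sit on the C/JNI side, where a full queue corresponds to calling av_read_pause() and space freeing up again corresponds to calling av_read_play().

    import java.util.concurrent.ArrayBlockingQueue;
    import java.util.concurrent.BlockingQueue;

    // Conceptual model of the plan: a bounded packet buffer shared between a network
    // reader (producer) and a decoder (consumer).
    public class PacketBuffer {

        // Stand-in for a demuxed packet (an AVPacket in the ffmpeg code).
        public static class Packet {
            public byte[] data;
        }

        // Bounded queue: put() blocks when full, take() blocks when empty.
        private final BlockingQueue<Packet> queue = new ArrayBlockingQueue<>(256);

        // Reader side: blocks when the buffer is full (the "stop buffering" case).
        public void onPacketRead(Packet p) throws InterruptedException {
            queue.put(p);
        }

        // Decoder side: blocks when the buffer is empty. Pausing playback just means
        // not calling this for a while; the reader keeps filling until the queue is full.
        public Packet nextPacketToDecode() throws InterruptedException {
            return queue.take();
        }
    }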
On an Android device I am playing a video URL which is an HLS video stream.
I am providing the path to the M3U8 file to the Android VideoView.
This M3U8 file contains different versions of the video, divided by bandwidth/bitrate (multiple bitrate variants).
It is VideoView's task to detect the current bandwidth between the device and the server and request the appropriate video stream, so that the video plays smoothly.
But VideoView is not doing that. For example:
If my device has bandwidth of around 30 KBPS, VideoView sends a request for the 90 KBPS video stream, and because of that the video does not play properly; and if my bandwidth is more than 1 MBPS, it still requests some seemingly random stream.
Can we improve VideoView's bandwidth detection and provide it with correct bandwidth values so that it requests the proper video stream for the current bandwidth? Does VideoView provide this type of API, or can we somehow hack it?
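VideoView itself exposes no API for supplying bandwidth values or choosing a variant. A common workaround (a different player, not a VideoView feature) is ExoPlayer, which measures throughput itself and switches HLS variants accordingly. The sketch below uses ExoPlayer 2.x class names; exact constructors differ between library versions, so treat it as an outline rather than drop-in code.

    import android.content.Context;
    import android.net.Uri;

    import com.google.android.exoplayer2.ExoPlayerFactory;
    import com.google.android.exoplayer2.SimpleExoPlayer;
    import com.google.android.exoplayer2.source.hls.HlsMediaSource;
    import com.google.android.exoplayer2.trackselection.AdaptiveTrackSelection;
    import com.google.android.exoplayer2.trackselection.DefaultTrackSelector;
    import com.google.android.exoplayer2.upstream.DefaultBandwidthMeter;
    import com.google.android.exoplayer2.upstream.DefaultDataSourceFactory;

    public class HlsPlayerSketch {

        public static SimpleExoPlayer play(Context context, String m3u8Url) {
            // Measures actual throughput and drives variant selection.
            DefaultBandwidthMeter bandwidthMeter = new DefaultBandwidthMeter();
            DefaultTrackSelector trackSelector =
                    new DefaultTrackSelector(new AdaptiveTrackSelection.Factory(bandwidthMeter));

            SimpleExoPlayer player =
                    ExoPlayerFactory.newSimpleInstance(context, trackSelector);

            DefaultDataSourceFactory dataSourceFactory =
                    new DefaultDataSourceFactory(context, "hls-demo", bandwidthMeter);
            HlsMediaSource mediaSource =
                    new HlsMediaSource.Factory(dataSourceFactory)
                            .createMediaSource(Uri.parse(m3u8Url));

            player.prepare(mediaSource);
            player.setPlayWhenReady(true);
            return player;
        }
    }

The player is then attached to a PlayerView (or SimpleExoPlayerView in older versions) instead of a VideoView.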
I am streaming an mp3 stream using MediaPlayer.
I set the data source, call prepareAsync(), and call start() in the onPrepared() callback.
Now I want to know the number of bytes downloaded by the MediaPlayer while it is streaming.
Is there a way to track the number of bytes?
It's not going to be one song but rather a continuous mp3 stream.
I understand you want to know the number of bytes of downloaded compressed data, not the number of bytes of uncompressed data inside MediaPlayer, which is proportional to the length of the audio played. (The compressed data may not be proportional, for example when VBR is used.)
To get the number of bytes of downloaded compressed data you need to count the bytes of the stream downloaded by MediaPlayer, but I believe the only way to access that stream is to implement a local stream proxy that feeds the MediaPlayer. A second solution is to save the stream to a local file and open that file with MediaPlayer, but the latter has some limitations, as MediaPlayer locks the local file and data cannot be appended to it.
For the first solution, see this answer on how to implement a proxy that feeds MediaPlayer: MediaPlayer stutters at start of mp3 playback
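To illustrate the first solution: once a local proxy relays the stream, counting is just a matter of adding up the bytes as they are copied to MediaPlayer. This is only a sketch of that counting loop, with the proxy's socket handling and HTTP headers assumed to be handled elsewhere; the class and method names are made up for the example.

    import java.io.InputStream;
    import java.io.OutputStream;
    import java.util.concurrent.atomic.AtomicLong;

    // Sketch of the counting part of a local stream proxy: every byte relayed to
    // MediaPlayer is also added to a running total of downloaded compressed data.
    public class CountingRelay {

        private final AtomicLong downloadedBytes = new AtomicLong();

        public long getDownloadedBytes() {
            return downloadedBytes.get();
        }

        // Copies the remote mp3 stream to MediaPlayer's connection, counting as it goes.
        public void relay(InputStream fromServer, OutputStream toMediaPlayer) throws Exception {
            byte[] buffer = new byte[8 * 1024];
            int read;
            while ((read = fromServer.read(buffer)) != -1) {
                toMediaPlayer.write(buffer, 0, read);
                downloadedBytes.addAndGet(read);
            }
            toMediaPlayer.flush();
        }
    }

Reading getDownloadedBytes() from elsewhere in the app then gives the running total of compressed bytes fetched so far.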
I'm playing video on Android using MediaPlayer over RTSP. The player takes about 12 s to buffer before it starts playing. Does anyone know how I can convince the player to buffer less? I have full control over the RTSP server and the SDP it returns.
As per usual, as soon as I decided to ask the question I worked out the answer. I have a line "b=AS:91" in my SDP. If I reduce that number, the amount of buffering decreases - b=AS:2 gives about 4 or 5 s of buffering.