I have an Android app on the Play Store which plays audio asynchronously from a CDN URL. When it starts playing an audio file, it shows buffering progress and seek progress.
It performs fine with short audio files, but for long audio files it stops buffering at a certain point. Where buffering stops varies from device to device: I observed that if an audio file buffers up to 30 minutes on one device, it might be 20 minutes on another, and it always stops buffering at the same minute for that audio on a given device. If I seek into the unbuffered area, the audio doesn't play and doesn't even start buffering again.
Here is an example:
File size: 37.02MB, Duration: ~54 min, Buffered: 25 - 30 min
Device's memory specification: RAM: 8GB, Storage: 128GB
My guess is that when the audio is buffered and cached in local storage, the allocated storage or RAM limit is exceeded. If this is the issue, is there any way to free the memory for the part that has already been played and re-initiate buffering for the rest?
To be mentioned, I implemented the controller myself instead of using MediaController, and I instantiated and used the MediaPlayer object inside the Fragment rather than in a Service.
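For reference, here is a minimal sketch of the kind of setup described above (a MediaPlayer created inside the Fragment, streaming a CDN URL and reporting buffered progress to a SeekBar). The URL and the SeekBar are placeholders, not code from the app:

    import android.media.AudioAttributes;
    import android.media.MediaPlayer;
    import android.widget.SeekBar;

    import java.io.IOException;

    public class StreamingPlayerHelper {
        // Minimal sketch: MediaPlayer streaming a remote URL without a Service.
        public static MediaPlayer startStreaming(String cdnUrl, SeekBar seekBar) throws IOException {
            MediaPlayer player = new MediaPlayer();
            player.setAudioAttributes(new AudioAttributes.Builder()
                    .setContentType(AudioAttributes.CONTENT_TYPE_MUSIC)
                    .setUsage(AudioAttributes.USAGE_MEDIA)
                    .build());
            player.setDataSource(cdnUrl);                       // the CDN URL
            player.setOnPreparedListener(MediaPlayer::start);   // start once prepared
            player.setOnBufferingUpdateListener((mp, percent) ->
                    seekBar.setSecondaryProgress(percent));     // buffered percentage shown on the seek bar
            player.prepareAsync();                              // buffer asynchronously
            return player;
        }
    }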
Related
I've tried to play an mp3 file stored on the device (not an asset) via the Android emulator with the just_audio and audio_players packages, and in both cases, on the first attempt to play the file, the first 5-10 seconds are typically super glitchy and stuttery, or the audio plays super fast; then after 10-20 seconds it plays normally, and subsequent playthroughs are perfect. The audio is only about 45 seconds long. I have tried increasing the playback buffer and making sure the audio player says it's ready before playing, and neither makes a difference.
I am developing an Android app which plays a video from a server where I store my video content. But it takes too much time to start playing a video; at 300-400 kb/s it takes almost 15-20 seconds.
I want to know what factors affect how fast a stream starts and how I can solve my problem.
It could be:
The video bitrate is too high for the connection speed
The player's initial buffering is too conservative
If you're saying your network is 300-400kbps, you probably want to try converting your video to an adaptive format like HLS or DASH, where the player will detect the user's bandwidth and download the best quality version that can start quickly.
From there it could be player configuration: adjusting the amount of buffered video the player waits for before starting playback (see the sketch below). But beware, reducing initial buffering can cause rebuffering later in playback.
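For example, if the player is ExoPlayer (an assumption; the question doesn't say which player is used), the startup buffering thresholds can be tuned through a LoadControl. A rough sketch, with illustrative values only:

    import android.content.Context;

    import com.google.android.exoplayer2.DefaultLoadControl;
    import com.google.android.exoplayer2.ExoPlayer;
    import com.google.android.exoplayer2.LoadControl;
    import com.google.android.exoplayer2.MediaItem;

    public class QuickStartPlayerFactory {
        // Sketch: trade a faster start for a higher risk of rebuffering by lowering
        // the "buffer for playback" thresholds. Values are illustrative, not recommendations.
        public static ExoPlayer create(Context context, String adaptiveManifestUrl) {
            LoadControl loadControl = new DefaultLoadControl.Builder()
                    .setBufferDurationsMs(
                            15_000,  // min buffer to keep during playback
                            50_000,  // max buffer
                            1_500,   // buffer required before starting playback
                            3_000)   // buffer required after a rebuffer
                    .build();
            ExoPlayer player = new ExoPlayer.Builder(context)
                    .setLoadControl(loadControl)
                    .build();
            player.setMediaItem(MediaItem.fromUri(adaptiveManifestUrl)); // HLS/DASH manifest URL
            player.prepare();
            return player;
        }
    }

With an HLS or DASH manifest as the media item, the player's adaptive track selection then picks a rendition that fits the measured bandwidth.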
Android MediaPlayer plays a growing file (downloading into local storage, /sdcard) treating its duration as whatever the duration was at start. For example, if it had downloaded up to 1 minute, this is considered the total duration, and even though the file continues to grow, MediaPlayer will stop (onCompletion) after 1 minute.
How can I make it continue playing?
I tried to call setVideoPath again, but then it repeats from the beginning. Any help?
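For illustration, this is roughly what "setVideoPath again" looks like when the current position is saved first and restored afterwards, assuming a VideoView (which wraps MediaPlayer) is being used; a sketch of the workaround being attempted, not a guaranteed fix, since the file still has to have grown past the saved position:

    import android.widget.VideoView;

    public class GrowingFileHelper {
        // Sketch: re-open the growing file and jump back to where playback stopped,
        // instead of repeating from the beginning.
        public static void reopenAndResume(VideoView videoView, String path) {
            int resumePosition = videoView.getCurrentPosition(); // record before re-setting the path
            videoView.setVideoPath(path);     // re-reads the file, picking up the new (longer) duration
            videoView.seekTo(resumePosition); // applied once the new source is prepared
            videoView.start();
        }
    }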
My question is about the relative latency of playing, pausing/stopping, and setting volume of audio in Android. Specifically: whether it's the same or lower latency to pause/stop an audio clip than to play it, and likewise whether it's the same or lower latency to set the volume of a clip (or of the system volume) than to play it.
For context, suppose the latency of playing an audio clip in Android is 150ms, i.e. SoundPool.play is executed at T=0ms and the end user hears the sound at T=150ms.
At T=200ms, the program executes SoundPool.pause. If the pause latency is also 150ms, that means the pause won't be heard by the end user until T=350ms, after they have heard 200ms of the clip. If, however, the pause latency is, say, 50ms, then the sound will stop at T=250ms, after only 100ms has been heard by the end user.
Obviously latency isn't constant, exact, or consistent across devices, so to be more precise, what I'm really asking is whether Android uses a separate pathway or technique to pause/stop/change volume of audio (either program-specific or system-wide volume) that is inherently lower-latency than the way audio is played.
Setting up play takes more time because the player has to initialize the whole pipeline; the following actions take place:
Find the MIME type of the media file; this requires parsing the media format and looking for specific headers.
Initialize the audio decoder (usually hardware); the OMX decoder has to be loaded into memory.
Set up the buffers, say allocate 10 buffers in the parser and 10 buffers in the decoder.
Set up the paths between the parser, the decoder, and the playback audio device (speaker).
Play happens at this step: data flows from parser buffers to decoder buffers; when the decoder buffers are filled, OMX (the decoder framework) notifies the player engine, and the engine passes the buffer data to AudioManager -> AudioTrack etc.
The decoder again processes data from the parser buffers, and this goes on until EOF or until the user presses pause/stop.
During pause the latency should be much lower than play, because only the data exchange is paused; the buffers are not released.
During stop the buffers are released and the player is also released, so the same setup has to be done again if the user wants to play again.
Volume up and down are simple calls to AudioManager to adjust the volume, so their latency should be lower than play/stop (see the sketch below).
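A small sketch of the three calls being compared, using the SoundPool API (the resource id is a placeholder): play has to push data through the pipeline described above, while pause and setVolume act on an already-running stream:

    import android.content.Context;
    import android.media.AudioAttributes;
    import android.media.SoundPool;

    public class LatencyDemo {
        // Sketch comparing the calls discussed above; rawResId is a placeholder sound resource.
        public static void demo(Context context, int rawResId) {
            SoundPool soundPool = new SoundPool.Builder()
                    .setMaxStreams(1)
                    .setAudioAttributes(new AudioAttributes.Builder()
                            .setUsage(AudioAttributes.USAGE_GAME)
                            .build())
                    .build();
            int soundId = soundPool.load(context, rawResId, 1);
            soundPool.setOnLoadCompleteListener((pool, id, status) -> {
                // play() routes the decoded clip through the audio pipeline and returns a stream id
                int streamId = pool.play(id, 1f, 1f, 1, 0, 1f);
                // setVolume() and pause() operate on that existing stream,
                // so no new parser/decoder/route setup is needed
                pool.setVolume(streamId, 0.5f, 0.5f);
                pool.pause(streamId);
            });
        }
    }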
I am trying to stream audio through a server. I have set up everything, and it works fine for recording and playing back static audio, but when I try to stream audio there is a delay on the playback side.
I did a Google search but couldn't find the proper way of doing this. I am using the AudioRecord & AudioTrack Android media APIs for sending & receiving audio data. Can anybody tell me how to handle this delay?
I have added my code on GOOGLE GROUP to give a clearer picture.
I have tried it this way: holding 5 chunks of audio data coming through the server in a buffer, playing back once 5 chunks have been collected, then getting the next 5 chunks and filling the buffer again, and so on until 1024 bytes of data (which is written to the AudioTrack, and the play method is called). This too has a delay; any other solutions?
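For reference, a minimal sketch of the kind of chunked playback described above, with an AudioTrack in streaming mode. The sample rate, channel configuration, and the 1024-byte threshold are assumptions taken from the description:

    import android.media.AudioFormat;
    import android.media.AudioManager;
    import android.media.AudioTrack;

    import java.io.ByteArrayOutputStream;
    import java.util.concurrent.BlockingQueue;

    public class ChunkedPlayback {
        private static final int SAMPLE_RATE = 8000;       // assumption
        private static final int THRESHOLD_BYTES = 1024;   // accumulate this much before writing

        // Sketch: collect incoming chunks until the threshold is reached, then write
        // them to the AudioTrack. Every byte held back here adds to the end-to-end delay.
        public static void play(BlockingQueue<byte[]> incomingChunks) throws InterruptedException {
            int minBuffer = AudioTrack.getMinBufferSize(SAMPLE_RATE,
                    AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT);
            AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, SAMPLE_RATE,
                    AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT,
                    Math.max(minBuffer, THRESHOLD_BYTES), AudioTrack.MODE_STREAM);
            ByteArrayOutputStream pending = new ByteArrayOutputStream();
            boolean started = false;
            while (true) {
                byte[] chunk = incomingChunks.take();       // chunk received from the server
                pending.write(chunk, 0, chunk.length);
                if (pending.size() >= THRESHOLD_BYTES) {
                    byte[] data = pending.toByteArray();
                    track.write(data, 0, data.length);      // blocks if the track buffer is full
                    if (!started) {
                        track.play();                       // start playback after the first batch
                        started = true;
                    }
                    pending.reset();
                }
            }
        }
    }

Shrinking the threshold (and the AudioTrack buffer) reduces the delay at the cost of more frequent underruns.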
If you're really trying to do this unbuffered, make sure whatever playback tool you're using is actually trying to play it back without a buffer; you will be hard-pressed to avoid a delay. Nothing on TV, radio, etc. is really 'live'; there is always some kind of delay. With internet streams you're sending a large amount of data constantly, and besides the time it takes to travel, all this data has to be kept in a particular order, and nobody wants choppy playback while the end user's computer attempts playback. I've had Flash players for major networks keep massive cache files on my computer while handling playback, but their players do not skip or wait to buffer. (If you load up something and notice a few hundred MBs of extra memory being used, maybe even more during playback, that's what that is.)
You might be able to get away with a very small buffer (the standard in the past used to be 30-60 seconds, and a lot of players still default to this) using VLC. I have been able to set its buffer very low, but only on incredibly low quality streams/videos. The big problem, I'd guess, is that your playback side is setting the buffer: if your playback is set to a 60-second buffer, it doesn't matter what you do server-side; the client end will wait until it has that much of a chunk and then begin playback.