How to loop a video seamlessly in Android Studio

I'm currently creating a fitness app in Android Studio, and for every exercise there is a video loop playing in a VideoView.
My problem is that the loop is not seamless: there is a short break after every play. The videos can't be local, because the app would get too big for the Play Store.
My code in Kotlin:
private fun setExerciseVideo() {
    vvVideo.visibility = View.VISIBLE
    val uri: Uri = Uri.parse(exerciseList!![currentExercisePosition + 1].getVideo())
    vvVideo.setVideoURI(uri)
    vvVideo.start()
    vvVideo.setOnPreparedListener {
        mMediaPlayer = it
        mMediaPlayer!!.isLooping = true
    }
}

The code above looks good, but the issue might be the order in which the code is executed.
What happens if you move .setVideoURI(uri) and .start() after the setOnPreparedListener?
If the suggestion above does not work, you can also try this:
val uri: Uri = Uri.parse(exerciseList!![currentExercisePosition + 1].getVideo())
vvVideo.setVideoURI(uri)
vvVideo.setOnPreparedListener {
    mMediaPlayer = it
    mMediaPlayer!!.isLooping = true
    mMediaPlayer!!.start()
}
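If VideoView still leaves a gap on some devices, a commonly used alternative (not part of the answer above, so treat it as a sketch) is ExoPlayer with its repeat mode. This assumes the ExoPlayer 2.x dependency and a PlayerView named playerView in the layout; exerciseList and currentExercisePosition are the same fields as in the question:

// Sketch only: ExoPlayer loops the item itself instead of restarting a VideoView.
val exoPlayer = SimpleExoPlayer.Builder(context).build()
playerView.player = exoPlayer

exoPlayer.setMediaItem(MediaItem.fromUri(exerciseList!![currentExercisePosition + 1].getVideo()))
exoPlayer.repeatMode = Player.REPEAT_MODE_ONE  // loop the single item without tearing the player down
exoPlayer.prepare()
exoPlayer.playWhenReady = true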

Related

How to loop background audio without gap?

The app plays some background audio (WAV) in an endless loop. Audio playback was flawless using the just_audio package, but after switching to audioplayers, all looped audio now contains a short gap of silence before it restarts at the beginning. The question now is: how to get rid of the gap?
The code to load and play the files is quite simple:
Future<void> loadAmbient() async {
  _audioAmbient = AudioPlayer();
  await _audioAmbient!.setSource(AssetSource('audio/ambient.wav'));
  await _audioAmbient!.setReleaseMode(ReleaseMode.loop);
}

void playAmbient() async {
  if (PlayerState.playing != _audioAmbient?.state) {
    await _audioAmbient?.seek(Duration.zero);
    _audioAmbient?.resume();
  }
}
According to the audioplayers docs, this seems to be a known issue:
The Getting Started docs state:
Note: there are caveats when looping audio without gaps. Depending on the file format and platform, when audioplayers uses the native implementation of the "looping" feature, there will be gaps between plays, which might not be noticeable for non-continuous SFX but will definitely be noticeable for looping songs. Please check out the Gapless Loop section on our Troubleshooting Guide for more details.
Following the link to the [Troubleshooting Guide](https://github.com/bluefireteam/audioplayers/blob/main/troubleshooting.md) reveals:
Gapless Looping
Depending on the file format and platform, when audioplayers uses the native implementation of the "looping" feature, there will be gaps between plays, which might not be noticeable for non-continuous SFX but will definitely be noticeable for looping songs.
TODO(luan): break down alternatives here, low latency mode, audio pool, gapless_audioplayer, ocarina, etc
The interesting part is the "Depending on the file format and platform, ..." bit, which sounds as if gapless loops could somehow be doable. The question is: how?
Any ideas on how to get audio to loop flawlessly (i.e. without a gap) are very much appreciated. Thank you.
You can pass this parameter:
_audioAmbient = AudioPlayer(mode: PlayerMode.LOW_LATENCY);
If that doesn't solve it, you can try something new like https://pub.dev/packages/ocarina
I fixed it this way:
I created two players, set the asset path on both of them, and wrote a method that starts the second player just before the first one completes.
import 'dart:async';
import 'dart:developer';

import 'package:just_audio/just_audio.dart' as audio;

audio.AudioPlayer player1 = audio.AudioPlayer();
audio.AudioPlayer player2 = audio.AudioPlayer();
bool isFirstPlayerActive = true;
StreamSubscription? subscription;

audio.AudioPlayer getActivePlayer() {
  return isFirstPlayerActive ? player1 : player2;
}

void setPlayerAsset(String asset) async {
  if (isFirstPlayerActive) {
    await player1.setAsset("assets/$asset");
    await player2.setAsset("assets/$asset");
  } else {
    await player2.setAsset("assets/$asset");
    await player1.setAsset("assets/$asset");
  }
  subscription?.cancel();
  _loopPlayer(getActivePlayer(), isFirstPlayerActive ? player2 : player1);
}

void _loopPlayer(audio.AudioPlayer player1, audio.AudioPlayer player2) async {
  Duration? duration = await player1.durationFuture;
  subscription = player1.positionStream.listen((event) {
    // Start the second player roughly 110 ms before the first one reaches its end.
    if (duration != null &&
        event.inMilliseconds >= duration.inMilliseconds - 110) {
      log('finished');
      player2.play();
      isFirstPlayerActive = !isFirstPlayerActive;
      // Once the first player actually completes, rewind and pause it so it is
      // ready to take over again on the next pass.
      StreamSubscription? tempSubscription;
      tempSubscription = player1.playerStateStream.listen((event) {
        if (event.processingState == audio.ProcessingState.completed) {
          log('completed');
          player1.seek(const Duration(seconds: 0));
          player1.pause();
          tempSubscription?.cancel();
        }
      });
      subscription?.cancel();
      _loopPlayer(player2, player1);
    }
  });
}

How can I catch a stream from Webview and change it to VOICE_CALL Stream with AEC?

I am using Agora, and it has some issues. One of them is that the speaker's voice comes out on the media stream.
In the browser the media volume can't be controlled, so I created an app to handle this. In the app, I dispatch the volume up/down buttons to control the media volume.
However, this method created a howling issue. So I'd like to send the sound to STREAM_VOICE_CALL and use the AEC (Acoustic Echo Cancellation) API on Android, so that the sound comes out on the right stream and the echo problem can be handled.
What I wrote:
private fun enableVoiceCallMode() {
    with(audioManager) {
        volumeControlStream = AudioManager.STREAM_VOICE_CALL
        setStreamVolume(
            AudioManager.STREAM_VOICE_CALL,
            audioManager.getStreamVolume(AudioManager.STREAM_VOICE_CALL),
            0
        )
    }
}
But this didn't work.
I also tried to apply AEC like this:
private fun enableEchoCanceler() {
    if (AcousticEchoCanceler.isAvailable() && aec == null) {
        aec = AcousticEchoCanceler.create(audioManager.generateAudioSessionId())
        aec?.enabled = true
    } else {
        aec!!.enabled = false
        aec!!.release()
        aec = null
    }
}

private fun releaseEchoCanceler() {
    aec!!.enabled = false
    aec?.release()
    aec = null
}
However, I don't know whether AcousticEchoCanceler.create(audioManager.generateAudioSessionId()) is the correct way or not.
Please help me out.
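No answer was posted for this one. As a hedged aside (not from the thread): AcousticEchoCanceler is normally attached to the audio session id of an existing capture, for example an AudioRecord opened with the VOICE_COMMUNICATION source, rather than to a freshly generated session id. A minimal sketch, assuming the app owns the capture path (which a WebView-based Agora call may not expose):

import android.media.AudioFormat
import android.media.AudioRecord
import android.media.MediaRecorder
import android.media.audiofx.AcousticEchoCanceler

// Illustrative only: binds the canceler to the session of an AudioRecord the app creates.
// Requires the RECORD_AUDIO permission.
fun attachAecToCapture(): Pair<AudioRecord, AcousticEchoCanceler?> {
    val sampleRate = 16000
    val minBuf = AudioRecord.getMinBufferSize(
        sampleRate, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT
    )
    val record = AudioRecord(
        MediaRecorder.AudioSource.VOICE_COMMUNICATION, // capture source tuned for calls
        sampleRate, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, minBuf
    )
    // Attach the effect to the session that actually carries the captured audio.
    val aec = if (AcousticEchoCanceler.isAvailable()) {
        AcousticEchoCanceler.create(record.audioSessionId)?.also { it.enabled = true }
    } else {
        null
    }
    return record to aec
}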

Glide slow loading after ExoPlayer Cache enabled

First time writing here on Stack Overflow.
(You bet I'm a noob in Android development.)
I've been experimenting with a quasi-Spotify clone app that has a RecyclerView showing song thumbnails and loads MP3s from a Firestore DB via URL. The app displays images using Glide and plays MP3s using ExoPlayer.
Everything works fine, except that loading the images (about 12 of them currently) got a bit slower after I enabled ExoPlayer to play using a cache. Before implementing the cache for ExoPlayer, Glide displayed the images immediately upon launch (less than 1 second), but after using the cache for ExoPlayer it takes about 3-4 seconds to display 6-7 rows of the RecyclerView.
ExoPlayer prep BEFORE using CacheDataSource:
private fun prepPlayerNormal(url: String?) { // NORMAL
    val uri = Uri.parse(url)
    val localMp3Uri = RawResourceDataSource.buildRawResourceUri(R.raw.rocounty_demo)
    val mediaItem = MediaItem.fromUri(uri)
    val mediaSource = ProgressiveMediaSource.Factory(
        DefaultDataSourceFactory(receivedContext, dataSourceFactory),
        DefaultExtractorsFactory()
    ).createMediaSource(mediaItem)
    exoPlayer.setMediaSource(mediaSource)
    exoPlayer.prepare()
    exoPlayer.playWhenReady = true
}
ExoPlayer prep AFTER using CacheDataSource:
private fun prepPlayerWithCache(url: String?) {
    val mp3Uri = Uri.parse(url)
    val mediaItem = MediaItem.fromUri(mp3Uri)
    val mediaSource = ProgressiveMediaSource.Factory(cacheDataSourceFactory).createMediaSource(mediaItem)
    exoPlayer.setMediaSource(mediaSource, true)
    exoPlayer.prepare()
    exoPlayer.playWhenReady = true
}
And this is my caching helper class (called from MainActivity inside onCreate()):
class MyCacher(
    private val receivedContext: Context,
    private val cacheDir: File,
    private val mpInstanceReceived: MyMediaPlayer
) {
    companion object {
        var simpleCache: SimpleCache? = null
        var leastRecentlyUsedCacheEvictor: LeastRecentlyUsedCacheEvictor? = null
        var exoDatabaseProvider: ExoDatabaseProvider? = null
        var exoPlayerCacheSize: Long = 90 * 1024 * 256
    }

    fun initCacheVariables() {
        if (leastRecentlyUsedCacheEvictor == null) {
            leastRecentlyUsedCacheEvictor = LeastRecentlyUsedCacheEvictor(exoPlayerCacheSize)
            Log.d(TAG, "initCacheVariables: inside leastRecentlyUsed....")
        }
        if (exoDatabaseProvider == null) {
            exoDatabaseProvider = ExoDatabaseProvider(receivedContext)
            Log.d(TAG, "initCacheVariables: inside exoDatabaseProvider ... ")
        }
        if (simpleCache == null) {
            simpleCache = SimpleCache(cacheDir, leastRecentlyUsedCacheEvictor!!, exoDatabaseProvider!!)
            Log.d(TAG, "initCacheVariables: inside simpleCache..")
        }
        mpInstanceReceived.initExoPlayerWithCache()
    }
}
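The question doesn't show how cacheDataSourceFactory is constructed. A typical way to build it on top of the SimpleCache created above (a sketch assuming ExoPlayer 2.13+; the upstream factory and names are illustrative, not taken from the question):

// Wires the shared SimpleCache in front of an HTTP upstream; cache misses fall through to the network.
val cacheDataSourceFactory = CacheDataSource.Factory()
    .setCache(MyCacher.simpleCache!!)
    .setUpstreamDataSourceFactory(DefaultHttpDataSource.Factory())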
As far as I understand, ExoPlayer's caching uses RAM while Glide reads from a cache written to disk. How the two affect each other is a mystery to me.
I've searched the forums but found no related topics so far.
SUMMARY: After having ExoPlayer stream MP3s with CacheDataSource, Glide's loading got way slower (a 2-3 second delay to display thumbnail-size images in 6 to 7 rows of the RecyclerView).
QUESTION 1: How does ExoPlayer's playing from cache affect my Glide loading speed?
QUESTION 2: Is this normal? What should I do to get Glide back to its previous loading speed?
Note: The image files on Firestore range from 200 KB to 1.0 MB (some files are intentionally large for testing purposes).
Android Studio 4.2 / Kotlin
Test emulator: Nexus 5X, API 25
Thanks so much in advance!
After struggling (and searching excessively) for about a week, I've found a solution to this. It turns out that ExoPlayer and Glide were sharing the same cache folder, and SimpleCache's constructor was deleting Glide's cache files because they were unrecognized files.
You can solve this issue by using a different location for your ExoPlayer cache (or adding a subfolder).
Source link: https://github.com/bumptech/glide/issues/4429#issuecomment-737870711
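Concretely, the fix boils down to pointing SimpleCache at a directory that only ExoPlayer uses, for example a subfolder of the app's cache dir. A sketch reusing the fields from MyCacher above (the folder name "exoplayer" is just an example):

// Give ExoPlayer its own subfolder so SimpleCache doesn't delete Glide's files as "unrecognized".
val exoCacheDir = File(receivedContext.cacheDir, "exoplayer").apply { mkdirs() }
simpleCache = SimpleCache(
    exoCacheDir,
    LeastRecentlyUsedCacheEvictor(exoPlayerCacheSize),
    ExoDatabaseProvider(receivedContext)
)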

How to set sound playback from external speaker?

There is a kind of weird issue. I am using the oboe library (https://github.com/google/oboe) for sound playback. Of course, you can choose the sound playback output according to the Android settings:
https://developer.android.com/reference/android/media/AudioDeviceInfo
So, if I need to set an exact output channel, I need to pass it to the oboe library.
The output channel that I need is TYPE_BUILTIN_SPEAKER, but on some devices (sometimes, not constantly) I hear the sound from TYPE_BUILTIN_EARPIECE.
This is how I do it. I have a method to get the needed channel id:
fun findAudioDevice(
    app: Application,
    deviceFlag: Int,
    deviceType: Int
): AudioDeviceInfo? {
    var result: AudioDeviceInfo? = null
    val manager = app.getSystemService(Context.AUDIO_SERVICE) as AudioManager
    val adis = manager.getDevices(deviceFlag)
    for (adi in adis) {
        if (adi.type == deviceType) {
            result = adi
            break
        }
    }
    return result
}
How I use it
val id = getAudioDeviceInfoId(getBuildInSpeakerInfo())

private fun getBuildInSpeakerInfo(): AudioDeviceInfo? {
    return com.tetavi.ar.basedomain.utils.Utils.findAudioDevice(
        getApplication<Application>(),
        AudioManager.GET_DEVICES_OUTPUTS,
        AudioDeviceInfo.TYPE_BUILTIN_SPEAKER
    )
}

private fun getAudioDeviceInfoId(info: AudioDeviceInfo?): Int {
    var result = -1
    if (info != null) {
        result = info.id
    }
    return result
}
Eventually I need to pass this id to the oboe library. Oboe is a native library, so I pass the id over JNI and set it:
oboe::Result oboe_engine::createPlaybackStream()
{
    oboe::AudioStreamBuilder builder;

    const oboe::SharingMode sharingMode = oboe::SharingMode::Exclusive;
    const int32_t sampleRate = mBackingTrack->getSampleRate();
    const oboe::AudioFormat audioFormat = oboe::AudioFormat::Float;
    const oboe::PerformanceMode performanceMode = oboe::PerformanceMode::PowerSaving;

    builder.setSharingMode(sharingMode)
            ->setPerformanceMode(performanceMode)
            ->setFormat(audioFormat)
            ->setCallback(this)
            ->setSampleRate(sampleRate);

    if (m_output_playback_chanel_id != EMPTY_NUM)
    {
        // set output playback channel (like internal or external speaker)
        builder.setDeviceId(m_output_playback_chanel_id); // <------------- THIS LINE
    }

    return builder.openStream(&mAudioStream);
}
So the issue is that on some devices (sometimes, not constantly) the sound still comes out of the internal speaker (TYPE_BUILTIN_EARPIECE), despite the fact that I explicitly set TYPE_BUILTIN_SPEAKER.
I checked the flow a few times, from the moment I get this id (it is actually 3) up to the moment I pass it as a parameter to the oboe library, but I still sometimes hear the sound from the internal speaker.
So the question is: am I missing something here? Should some trick be implemented, or something else?
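No answer was posted here either. One hedged, Kotlin-side workaround that is sometimes used when audio ends up on the earpiece instead of the loudspeaker is forcing speakerphone routing through AudioManager; whether it applies to an oboe stream depends on the stream's usage, so treat this as a sketch rather than a fix:

// Illustrative workaround, not from the thread: force loudspeaker routing at the platform level.
fun forceLoudspeaker(audioManager: AudioManager) {
    audioManager.mode = AudioManager.MODE_IN_COMMUNICATION
    audioManager.isSpeakerphoneOn = true
}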

PlatformChannel Already open error when I try to cast a second video

I have a receiver app (V2) that works fine when you show the first video, but when you go to show a second video I get this:
[cast.receiver.platform.WebSocket] PlatformChannel Already open
I am unloading and loading the player each time. I can't see any way to explicitly ask the PlatformChannel to close. Here's the relevant code from the function that starts play:
this.receiverManager.start()
this.host = new cast.player.api.Host({'mediaElement': this.refs.video, 'url': source})

this.host.onError = function(errorCode) {
    console.log("Fatal Error - " + errorCode)
    if (window.player) {
        window.player.unload()
        window.player = null
    }
}

this.host.updateSegmentRequestInfo = function(requestInfo) {
    requestInfo.withCredentials = false;
}

if (!window.player) {
    window.player = new cast.player.api.Player(this.host)
}

this.receiverManager.setApplicationState('Ready To Cast');

this.protocol = cast.player.api.CreateDashStreamingProtocol(this.host)
window.player.load(this.protocol, 0)
We highly recommend that you move to a CAF receiver. Also, CAF has a new queueing API that will handle a playlist of videos.
