How to play a .mov format video file in an Android application

I am developing an Android application that plays videos from our server, uploaded by both iPhone and Android users. The problem is that the videos recorded on iPhone are in .mov format, which is not supported by Android. I searched a lot but couldn't find any solution. Help me out. Thanks in advance :)

I recommend converting the iPhone uploads to a format compatible with both platforms (an open format would be even better), because keeping two different video formats on the server will sooner or later become a headache.
If you can, re-encode the videos with ffmpeg and store them all in the same format on the server.
If you can't, or it's really hard to achieve, you could try the ExoPlayer component from Google.
I have worked on an app like yours, where devices (iPhone and Android) record videos and upload them to the server.
Re-encoding all those videos on the server side was almost impossible, so in the end we decided to apply a client-side solution for legacy videos.
https://google.github.io/ExoPlayer/
The setup for ExoPlayer is larger than for VideoView, but it's simple to do.
private var player: ExoPlayer = initPlayer()

private fun initPlayer(): ExoPlayer {
    val bandwidthMeter = DefaultBandwidthMeter()
    val videoTrackSelectionFactory = AdaptiveTrackSelection.Factory(bandwidthMeter)
    val trackSelector = DefaultTrackSelector(videoTrackSelectionFactory)
    return ExoPlayerFactory.newSimpleInstance(context, trackSelector)
}

fun setup() {
    videoExo.setPlayer(player)
    videoExo.useController = false

    val dataSourceFactory = DefaultDataSourceFactory(
        context,
        Util.getUserAgent(context, context?.packageName), DefaultBandwidthMeter()
    )
    val videoSource = ExtractorMediaSource.Factory(dataSourceFactory).createMediaSource(videoUri)

    player.addListener(object : Player.EventListener {
        override fun onPlayerStateChanged(playWhenReady: Boolean, playbackState: Int) {
            if (playbackState == Player.STATE_READY) {
                startCallback()
            }
        }
        override fun onPlaybackParametersChanged(playbackParameters: PlaybackParameters?) {}
        override fun onSeekProcessed() {}
        override fun onTracksChanged(trackGroups: TrackGroupArray?, trackSelections: TrackSelectionArray?) {}
        override fun onPlayerError(error: ExoPlaybackException?) {}
        override fun onLoadingChanged(isLoading: Boolean) {}
        override fun onPositionDiscontinuity(reason: Int) {}
        override fun onRepeatModeChanged(repeatMode: Int) {}
        override fun onShuffleModeEnabledChanged(shuffleModeEnabled: Boolean) {}
        override fun onTimelineChanged(timeline: Timeline?, manifest: Any?, reason: Int) {}
    })

    player.prepare(videoSource)
    player.playWhenReady = true
    player.repeatMode = REPEAT_MODE_ALL
}
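The snippet above targets the older ExoPlayer 2.x API (ExoPlayerFactory, ExtractorMediaSource). As a rough sketch only, the same setup with the current androidx.media3 artifacts would look something like the following; the names videoExo and videoUri are carried over from the code above, and the rest is an assumption, not a drop-in replacement:

// Minimal sketch of the equivalent setup with androidx.media3.
import android.content.Context
import android.net.Uri
import androidx.media3.common.MediaItem
import androidx.media3.common.Player
import androidx.media3.exoplayer.ExoPlayer
import androidx.media3.ui.PlayerView

fun setupWithMedia3(context: Context, videoExo: PlayerView, videoUri: Uri): ExoPlayer {
    val player = ExoPlayer.Builder(context).build()
    videoExo.player = player
    videoExo.useController = false

    player.setMediaItem(MediaItem.fromUri(videoUri))
    player.repeatMode = Player.REPEAT_MODE_ALL
    player.playWhenReady = true

    player.addListener(object : Player.Listener {
        override fun onPlaybackStateChanged(playbackState: Int) {
            if (playbackState == Player.STATE_READY) {
                // equivalent of startCallback() in the original setup()
            }
        }
    })

    player.prepare()
    return player
}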
Hope this helps

Related

Is there a way to continuously record audio in the background with a service on Android with Kotlin?

I'm building an app that translates audio live in the background. I display an overlay over other apps and I want to record the audio and translate it in real time. But on Android 13 the audio recording stops after a few seconds of recording.
I guess the problem is there for the user's security, but I'm looking for a second opinion.
I'm using the SpeechRecognizer library, but I'm open to recommendations.
I've added a piece of code below.
AndroidManifest.xml
<uses-permission android:name="android.permission.RECORD_AUDIO" />

<service
    android:name=".feature.overlay.OverlayService"
    android:exported="false"
    android:foregroundServiceType="microphone" />
OverlayService
override fun onCreate() {
    super.onCreate()
    windowManager = getSystemService(Context.WINDOW_SERVICE) as WindowManager
    configureSpeechToText()
    windowManager.addView(layoutText, params)
}

fun configureSpeechToText() {
    speechRecognizer = SpeechRecognizer.createSpeechRecognizer(this)
    speechRecognizerIntent.putExtra(
        RecognizerIntent.EXTRA_LANGUAGE_MODEL,
        RecognizerIntent.LANGUAGE_MODEL_FREE_FORM
    )
    speechRecognizerIntent.putExtra(RecognizerIntent.EXTRA_LANGUAGE, Locale.getDefault())
    speechRecognizer?.setRecognitionListener(object : RecognitionListener {
        override fun onReadyForSpeech(bundle: Bundle) {}
        override fun onBeginningOfSpeech() {}
        override fun onRmsChanged(v: Float) {}
        override fun onBufferReceived(bytes: ByteArray) {}
        override fun onEndOfSpeech() {}
        override fun onError(i: Int) {
            val errorMessage = getErrorText(i)
            Log.d("Error", "FAILED: $errorMessage")
        }
        override fun onResults(bundle: Bundle) {
            val data = bundle.getStringArrayList(SpeechRecognizer.RESULTS_RECOGNITION)
            dataVoiceText = data!![0]
            run(dataVoiceText) // The function that translates the text
        }
        override fun onPartialResults(bundle: Bundle) {}
        override fun onEvent(i: Int, bundle: Bundle) {}
    })
    handler.postDelayed(Runnable { // Run this block every few seconds to translate in chunks
        handler.postDelayed(runnable!!, delay.toLong())
        speechRecognizer?.stopListening()
        speechRecognizer?.startListening(speechRecognizerIntent)
    }.also { runnable = it }, delay.toLong())
}
I'm trying to record audio in the background with a service, and I expect it to record continuously, but right now it stops.
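One thing worth checking, offered only as a hedged suggestion: the manifest above declares foregroundServiceType="microphone", but background microphone access also requires the service to actually promote itself with startForeground() using that type (plus the FOREGROUND_SERVICE permission). A minimal sketch; the channel id, notification text, icon and notification id are placeholders, not taken from the question:

// Hypothetical sketch: promote the service to a microphone foreground service
// so recording is allowed to continue while the app is in the background.
// Uses android.app.NotificationChannel/NotificationManager, android.content.pm.ServiceInfo,
// android.os.Build and androidx.core.app.NotificationCompat.
private fun promoteToForeground() {
    val channelId = "overlay_channel" // placeholder channel id
    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.O) {
        val channel = NotificationChannel(
            channelId, "Live translation", NotificationManager.IMPORTANCE_LOW
        )
        getSystemService(NotificationManager::class.java).createNotificationChannel(channel)
    }
    val notification = NotificationCompat.Builder(this, channelId)
        .setContentTitle("Translating audio")
        .setSmallIcon(android.R.drawable.ic_btn_speak_now)
        .build()
    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.Q) {
        startForeground(1, notification, ServiceInfo.FOREGROUND_SERVICE_TYPE_MICROPHONE)
    } else {
        startForeground(1, notification)
    }
}

Calling this from onCreate() before starting recognition would be the obvious place, but whether it resolves the Android 13 behaviour in your case is not something I can confirm.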

Is ExoPlayer’s audio offload mode supported while using MediaSessionService from Media3?

I am working on a native music player app for Android using ExoPlayer and MediaSessionService from Media3. Now I want to make playback more energy efficient while the screen is off by using experimentalSetOffloadSchedulingEnabled, but it seems like I'm not able to get the offloading to work.
From the main activity of the app I send ACTION_START_AUDIO_OFFLOAD in the onStop() method to my service (the relevant parts of the service are shown below), and ACTION_STOP_AUDIO_OFFLOAD in the onStart() method. In this way I have been able to get correct true/false responses from the onExperimentalOffloadSchedulingEnabledChanged listener, but I do not get any responses from the onExperimentalOffloadedPlayback or onExperimentalSleepingForOffloadChanged listeners, so it seems like the player never enters power saving mode.
My tests were made with Media3 version 1.0.0-beta03 on Android 13 (emulator) and Android 10 (phone) using MP3 files. I am aware that Media3 is in beta and that the offload scheduling method is experimental, but I'm not sure if that is the limitation or if I have done something wrong. Any ideas what could be the issue?
@androidx.media3.common.util.UnstableApi
class PlaybackService : MediaSessionService(), MediaSession.Callback {

    private val listener = object : ExoPlayer.AudioOffloadListener {
        override fun onExperimentalOffloadSchedulingEnabledChanged(offloadSchedulingEnabled: Boolean) {
            Log.d("PlaybackService", "offloadSchedulingEnabled: $offloadSchedulingEnabled")
            super.onExperimentalOffloadSchedulingEnabledChanged(offloadSchedulingEnabled)
        }

        override fun onExperimentalOffloadedPlayback(offloadedPlayback: Boolean) {
            Log.d("PlaybackService", "offloadedPlayback: $offloadedPlayback")
            super.onExperimentalOffloadedPlayback(offloadedPlayback)
        }

        override fun onExperimentalSleepingForOffloadChanged(sleepingForOffload: Boolean) {
            Log.d("PlaybackService", "sleepingForOffload: $sleepingForOffload")
            super.onExperimentalSleepingForOffloadChanged(sleepingForOffload)
        }
    }

    private lateinit var player: ExoPlayer
    private var mediaSession: MediaSession? = null

    override fun onCreate() {
        super.onCreate()
        player = ExoPlayer.Builder(
            this,
            DefaultRenderersFactory(this)
                .setEnableAudioOffload(true)
        )
            .setAudioAttributes(AudioAttributes.DEFAULT, /* handleAudioFocus = */ true)
            .setHandleAudioBecomingNoisy(true)
            .setSeekBackIncrementMs(10_000)
            .setSeekForwardIncrementMs(10_000)
            .setWakeMode(C.WAKE_MODE_LOCAL)
            .build()
        player.addAudioOffloadListener(listener)
        mediaSession = MediaSession
            .Builder(this, player)
            .setCallback(this)
            .build()
    }

    override fun onGetSession(controllerInfo: MediaSession.ControllerInfo): MediaSession? =
        mediaSession

    override fun onStartCommand(intent: Intent?, flags: Int, startId: Int): Int {
        when (intent?.action) {
            ACTION_START_AUDIO_OFFLOAD -> startAudioOffload()
            ACTION_STOP_AUDIO_OFFLOAD -> stopAudioOffload()
        }
        return super.onStartCommand(intent, flags, startId)
    }

    private fun startAudioOffload() {
        player.experimentalSetOffloadSchedulingEnabled(true)
    }

    private fun stopAudioOffload() {
        player.experimentalSetOffloadSchedulingEnabled(false)
    }

    override fun onDestroy() {
        mediaSession?.run {
            player.release()
            release()
            mediaSession = null
        }
        super.onDestroy()
    }

    companion object {
        const val ACTION_START_AUDIO_OFFLOAD = "ACTION_START_AUDIO_OFFLOAD"
        const val ACTION_STOP_AUDIO_OFFLOAD = "ACTION_STOP_AUDIO_OFFLOAD"
    }
}
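For reference, the activity side described above (sending the actions from onStop()/onStart()) is not shown in the question; a minimal sketch of that wiring, assuming the service class is the PlaybackService above, could be:

// Sketch of the activity side: toggle offload scheduling when the UI
// goes to the background and comes back.
override fun onStop() {
    startService(
        Intent(this, PlaybackService::class.java)
            .setAction(PlaybackService.ACTION_START_AUDIO_OFFLOAD)
    )
    super.onStop()
}

override fun onStart() {
    super.onStart()
    startService(
        Intent(this, PlaybackService::class.java)
            .setAction(PlaybackService.ACTION_STOP_AUDIO_OFFLOAD)
    )
}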

Android YouTube API Code only loads one video

private lateinit var youtubePLayerInit: YouTubePlayer.OnInitializedListener

private fun openTrailer(videoId: String) {
    val dialog = Dialog(this)
    val trailerDialogBinding = DialogTrailerBinding.inflate(layoutInflater)
    dialog.setContentView(trailerDialogBinding.root)
    dialog.setCanceledOnTouchOutside(false)
    val youtubeApiKey = "xxxxx"
    youtubePLayerInit = object : YouTubePlayer.OnInitializedListener {
        override fun onInitializationSuccess(
            p0: YouTubePlayer.Provider?,
            p1: YouTubePlayer?,
            p2: Boolean
        ) {
            p1?.loadVideo(videoId)
        }

        override fun onInitializationFailure(
            p0: YouTubePlayer.Provider?,
            p1: YouTubeInitializationResult?
        ) {
            Toast.makeText(applicationContext, "Video Failed to Load", Toast.LENGTH_SHORT).show()
        }
    }
    trailerDialogBinding.vvMovieTrailer.initialize(youtubeApiKey, youtubePLayerInit)
    dialog.show()
    trailerDialogBinding.btnExit.setOnClickListener {
        dialog.dismiss()
    }
}
The thing is that the video only loads the first time. After I dismiss the dialog and reopen it, the video doesn't load, and after about 3 or 4 attempts the app crashes with these errors:
E/YouTubeAndroidPlayerAPI: Embed config is not supported in RemoteEmbeddedPlayer.
E/YouTubeAndroidPlayerAPI: Error screen presenter should be present
I open this video about 3 times, but it only loads once. How can I fix this?
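Not a confirmed fix, but a pattern that often helps with the old YouTube Android Player API is to keep a reference to the YouTubePlayer handed to onInitializationSuccess, skip reloading when the player was restored, and release() it when the dialog is dismissed rather than re-initializing the view on top of a live player. A rough sketch of the changed parts; the field name youTubePlayer is an assumption:

// Hypothetical sketch: hold the player instance, avoid reloading on restore,
// and release it when the dialog is dismissed.
private var youTubePlayer: YouTubePlayer? = null

youtubePLayerInit = object : YouTubePlayer.OnInitializedListener {
    override fun onInitializationSuccess(
        p0: YouTubePlayer.Provider?, p1: YouTubePlayer?, wasRestored: Boolean
    ) {
        youTubePlayer = p1
        if (!wasRestored) p1?.loadVideo(videoId)
    }

    override fun onInitializationFailure(
        p0: YouTubePlayer.Provider?, p1: YouTubeInitializationResult?
    ) {
        Toast.makeText(applicationContext, "Video Failed to Load", Toast.LENGTH_SHORT).show()
    }
}

trailerDialogBinding.btnExit.setOnClickListener {
    youTubePlayer?.release()
    youTubePlayer = null
    dialog.dismiss()
}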

How to detect whether Android speech recognition is running offline or online

I followed this question to set up offline speech recognition on Android.
I downloaded the language pack in Google voice input and it works offline.
The problem is that I want to know whether recognition is currently running offline or online (just like Apple's speech-to-text, which has an API to check for that), so I can display the speech stream in my app correctly.
Is there any way to do that?
Here is my code:
val intentSpeech = Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH)
intentSpeech.putExtra(RecognizerIntent.EXTRA_LANGUAGE, "en-US")
intentSpeech.putExtra(RecognizerIntent.EXTRA_PARTIAL_RESULTS, true)
intentSpeech.putExtra(
    RecognizerIntent.EXTRA_LANGUAGE_MODEL,
    RecognizerIntent.LANGUAGE_MODEL_FREE_FORM
)
intentSpeech.putExtra(
    RecognizerIntent.EXTRA_CALLING_PACKAGE,
    packageName
)

val recognizer = SpeechRecognizer.createSpeechRecognizer(this)
recognizer.setRecognitionListener(this)
P.S.: I can see that the Read Along app by Google works perfectly in both offline and online modes.
I'm trying to do the same with the Android speech API. Is it possible?
For offline speech to text, you can use Google's default STT model but it seems to be non-continuous.
private fun startSpeechToText() {
    val speechRecognizer = SpeechRecognizer.createSpeechRecognizer(this)
    val speechRecognizerIntent = Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH)
    speechRecognizerIntent.putExtra(
        RecognizerIntent.EXTRA_LANGUAGE_MODEL,
        RecognizerIntent.LANGUAGE_MODEL_FREE_FORM
    )
    speechRecognizerIntent.putExtra(RecognizerIntent.EXTRA_LANGUAGE, Locale.getDefault())
    speechRecognizer.setRecognitionListener(object : RecognitionListener {
        override fun onReadyForSpeech(bundle: Bundle?) {}
        override fun onBeginningOfSpeech() {}
        override fun onRmsChanged(v: Float) {}
        override fun onBufferReceived(bytes: ByteArray?) {}
        override fun onEndOfSpeech() {}
        override fun onError(i: Int) {}
        override fun onResults(bundle: Bundle) {
            val result = bundle.getStringArrayList(SpeechRecognizer.RESULTS_RECOGNITION)
            if (result != null) {
                // result[0] will give the output of speech
            }
        }
        override fun onPartialResults(bundle: Bundle) {}
        override fun onEvent(i: Int, bundle: Bundle?) {}
    })
    // starts listening ...
    speechRecognizer.startListening(speechRecognizerIntent)
}
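As far as I know there is no public callback that reports whether the recognizer ended up running offline or online. If forcing offline recognition (and thereby knowing which mode is in use) is acceptable, a hedged option on API 23+ is to request it explicitly on the intent above:

// Sketch: ask for offline-only recognition. This requires the offline language pack;
// if it is missing, the recognizer is expected to report an error rather than fall back online.
speechRecognizerIntent.putExtra(RecognizerIntent.EXTRA_PREFER_OFFLINE, true)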
If you don't want to use Google, since it requires downloading the offline model for speech to text, the other option for offline STT is the Vosk API, which has pretrained models in English and other languages for live STT.
https://github.com/alphacep/vosk-android-demo
Reference: https://www.geeksforgeeks.org/offline-speech-to-text-without-any-popup-dialog-in-android/

Attach artwork to mp3 Uri for ExoPlayer

I'm using the more-or-less default ExoPlayer 2.1 SimpleExoPlayer to stream from either a url Uri or a local file Uri. The Uri is loaded into the MediaSource,
MediaSource mediaSource = new ExtractorMediaSource(url,
dataSourceFactory, extractorsFactory, null, null);
The source is sometimes a video (mp4 instead of mp3), so in the Activity I attach the player to the com.google.android.exoplayer2.ui.SimpleExoPlayerView:
mVideoView = (SimpleExoPlayerView) findViewById(R.id.media_player);
mVideoView.setPlayer(player);
mVideoView.requestFocus();
I've read in an article:
SimpleExoPlayerView has also been updated to look at ID3 metadata, and will automatically display embedded ID3 album art during audio playbacks. If not desired, this functionality can be disabled using SimpleExoPlayerView’s setUseArtwork method.
I've seen it answered for Files, How to get and set (change) ID3 tag (metadata) of audio files?,
But I'm hoping to set the ID3 metadata for a Uri derived from a url String. Is it possible? Otherwise, is it possible to set the artwork for an ExoPlayerView without editing the ID3 metafiles? Or possible to change the ID3 meta for a File without a dependency?
Edit
So I've found an Issue which says this is solved, and the issuer linked to an override of the exo_simple_player_view here
I've found in the blog post
When a SimpleExoPlayerView is instantiated it inflates its layout from the layout file exo_simple_player_view.xml. PlaybackControlView inflates its layout from exo_playback_control_view.xml. To customize these layouts, an application can define layout files with the same names in its own res/layout* directories. These layout files override the ones provided by the ExoPlayer library.
So I have to override the simpleview somehow.
You can use this function and pass mediaUri and thumbnailUri to it:
private fun PlayerView.loadArtWorkIfMp3(mediaUri: Uri, thumbnailUri: Uri) {
    try {
        val imageView = this.findViewById<ImageView>(R.id.exo_artwork)
        if (mediaUri.lastPathSegment!!.contains("mp3")) {
            this.useArtwork = true
            imageView.scaleType = ImageView.ScaleType.CENTER_INSIDE
            imageView.loadImage(thumbnailUri) {
                this.defaultArtwork = it
            }
        }
    } catch (e: Exception) {
        Log.d("artwork", "exo_artwork not found")
    }
}
and you will use this loadImage function:
@BindingAdapter(value = ["imageUri", "successCallback"], requireAll = false)
fun ImageView.loadImage(imgUrl: Uri?, onLoadSuccess: (resource: Drawable) -> Unit = {}) {
    val requestOption = RequestOptions()
        .placeholder(R.drawable.ic_music)
        .error(R.drawable.ic_music)
    Glide.with(this.context)
        .load(imgUrl)
        .transition(DrawableTransitionOptions.withCrossFade())
        .apply(requestOption)
        .centerInside()
        .diskCacheStrategy(DiskCacheStrategy.AUTOMATIC)
        .listener(GlideImageRequestListener(object : GlideImageRequestListener.Callback {
            override fun onFailure(message: String?) {
                Log.d("loadImage", "onFailure:-> $message")
            }

            override fun onSuccess(dataSource: String, resource: Drawable) {
                Log.d("loadImage", "onSuccess:-> load from $dataSource")
                onLoadSuccess(resource)
            }
        }))
        .into(this)
}
And the last one:
class GlideImageRequestListener(private val callback: Callback? = null) : RequestListener<Drawable> {

    interface Callback {
        fun onFailure(message: String?)
        fun onSuccess(dataSource: String, resource: Drawable)
    }

    override fun onLoadFailed(
        e: GlideException?,
        model: Any?,
        target: Target<Drawable>?,
        isFirstResource: Boolean
    ): Boolean {
        callback?.onFailure(e?.message)
        return false
    }

    override fun onResourceReady(
        resource: Drawable?,
        model: Any?,
        target: Target<Drawable>?,
        dataSource: DataSource?,
        isFirstResource: Boolean
    ): Boolean {
        resource?.let {
            target?.onResourceReady(
                it,
                DrawableCrossFadeTransition(1000, isFirstResource)
            )
        }
        callback?.onSuccess(dataSource.toString(), resource!!)
        return true
    }
}
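A minimal usage sketch, assuming the PlayerView from your layout plus the media and thumbnail Uris are already in scope (the names playerView, mediaUri and thumbnailUri are placeholders):

// Sketch: bind the player to the view first, then attach artwork when the source is an mp3.
playerView.player = player
playerView.loadArtWorkIfMp3(mediaUri, thumbnailUri)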
