Android Media3 Session & Controller - Playback with merging media source

I'm trying to use a media session in my app so that background playback, notifications and all of that work. The problem is that the documentation isn't very good and there are missing features. The main one causing me issues is not being able to set media items on the media controller. I found another post asking about the same problem, Android Media3 Session & Controller - Playback not starting, and that solution does work. However, it only covers a single URI. I need to be able to play audio and video at the same time somehow (a merging media source). Can I make that work somehow, or am I just doomed to have a media app that can't play audio?
I would've asked in a comment on the solution, but there's a requirement for 50 rep... which is stupid.
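For context, one direction that might work (on top of the callback fix from the linked question, which restores the URI from the item's request metadata) is to keep sending plain MediaItems from the MediaController and have the service resolve them into a MergingMediaSource via a custom MediaSource.Factory. A rough, untested sketch follows; the PlaybackService name, the factory class and the "extra_audio_uri" extras key are placeholders rather than Media3 API:

```kotlin
import android.content.Context
import android.net.Uri
import androidx.media3.common.MediaItem
import androidx.media3.common.util.UnstableApi
import androidx.media3.datasource.DefaultDataSource
import androidx.media3.exoplayer.ExoPlayer
import androidx.media3.exoplayer.drm.DrmSessionManagerProvider
import androidx.media3.exoplayer.source.DefaultMediaSourceFactory
import androidx.media3.exoplayer.source.MediaSource
import androidx.media3.exoplayer.source.MergingMediaSource
import androidx.media3.exoplayer.source.ProgressiveMediaSource
import androidx.media3.exoplayer.upstream.LoadErrorHandlingPolicy
import androidx.media3.session.MediaSession
import androidx.media3.session.MediaSessionService

@UnstableApi // the MediaSource APIs are still marked unstable in Media3
class PlaybackService : MediaSessionService() {
    private var mediaSession: MediaSession? = null

    override fun onCreate() {
        super.onCreate()
        val player = ExoPlayer.Builder(this)
            // Every MediaItem handed over by a MediaController is resolved here,
            // so the merging happens entirely on the service side.
            .setMediaSourceFactory(MergingMediaSourceFactory(this))
            .build()
        mediaSession = MediaSession.Builder(this, player).build()
    }

    override fun onGetSession(controllerInfo: MediaSession.ControllerInfo) = mediaSession

    override fun onDestroy() {
        mediaSession?.run {
            player.release()
            release()
        }
        mediaSession = null
        super.onDestroy()
    }
}

/** Wraps the default factory and merges in a separate audio URI when one is supplied. */
@UnstableApi
class MergingMediaSourceFactory(context: Context) : MediaSource.Factory {
    private val dataSourceFactory = DefaultDataSource.Factory(context)
    private val defaultFactory = DefaultMediaSourceFactory(dataSourceFactory)

    override fun createMediaSource(mediaItem: MediaItem): MediaSource {
        val videoSource = defaultFactory.createMediaSource(mediaItem)
        // "extra_audio_uri" is an app-defined key carried in the item's request
        // metadata extras by the controller; it is not a Media3 constant.
        val audioUri = mediaItem.requestMetadata.extras?.getString("extra_audio_uri")
            ?: return videoSource
        val audioSource = ProgressiveMediaSource.Factory(dataSourceFactory)
            .createMediaSource(MediaItem.fromUri(Uri.parse(audioUri)))
        return MergingMediaSource(videoSource, audioSource)
    }

    override fun getSupportedTypes() = defaultFactory.supportedTypes

    override fun setDrmSessionManagerProvider(provider: DrmSessionManagerProvider) =
        apply { defaultFactory.setDrmSessionManagerProvider(provider) }

    override fun setLoadErrorHandlingPolicy(policy: LoadErrorHandlingPolicy) =
        apply { defaultFactory.setLoadErrorHandlingPolicy(policy) }
}
```

The controller side would then put the side-loaded audio URI into the item's RequestMetadata extras under the same key, since (as far as I understand) request metadata, unlike the local configuration, is transmitted to the session.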

Related

Xamarin Forms allow music from another app on Android while playing Video

I am facing the issue that my app blocks background music when using a MediaElement for video.
The bottom line is that I want to play a silent video while the user can still listen to Spotify etc. in the background.
On iOS the procedure to allow background music is pretty straightforward; on Android I cannot manage to get it working.
I have already tried different Xamarin.Forms libraries where I set the video to mute or the volume to zero, but nothing seemed to work. I tried, for example, Octane, MediaManager and several other video players.
For this reason, I would now like to try to implement it in native Android code, but I haven't found anything about it in various forums.
Xamarin.Forms version is 4.8.
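For the native attempt, my current understanding is that what pauses other apps' music on Android is audio focus handling, so a muted player that never requests audio focus should leave Spotify alone. A rough, untested sketch with Media3 ExoPlayer (the helper function, the view and the URL are placeholders):

```kotlin
import android.content.Context
import androidx.media3.common.AudioAttributes
import androidx.media3.common.MediaItem
import androidx.media3.exoplayer.ExoPlayer
import androidx.media3.ui.PlayerView

// Builds a muted player that does NOT take audio focus, so background music
// from other apps keeps playing. Untested sketch; the URL is a placeholder.
fun startSilentVideo(context: Context, playerView: PlayerView): ExoPlayer {
    val player = ExoPlayer.Builder(context).build()
    // handleAudioFocus = false tells ExoPlayer not to request audio focus at all.
    player.setAudioAttributes(AudioAttributes.DEFAULT, /* handleAudioFocus = */ false)
    player.volume = 0f // mute our own output as well
    playerView.player = player
    player.setMediaItem(MediaItem.fromUri("https://example.com/silent-video.mp4"))
    player.prepare()
    player.play()
    return player
}
```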
You could use a service to stream audio so playback continues while your application is not visible or the device is under the lock screen.
You could check this blog post about it and download the source from GitHub: https://devblogs.microsoft.com/xamarin/background-audio-streaming-with-xamarin-android/
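The blog and its sample are Xamarin/C#, but the native shape of the idea is roughly a started service that owns the MediaPlayer. A sketch only; the service name and stream URL are placeholders, not taken from the blog:

```kotlin
import android.app.Service
import android.content.Intent
import android.media.AudioAttributes
import android.media.MediaPlayer
import android.os.IBinder

// A minimal started service that keeps streaming audio while the UI is not visible.
// A production version would also promote itself to a foreground service so the
// system does not kill it.
class AudioStreamService : Service() {

    private var player: MediaPlayer? = null

    override fun onStartCommand(intent: Intent?, flags: Int, startId: Int): Int {
        if (player == null) {
            player = MediaPlayer().apply {
                setAudioAttributes(
                    AudioAttributes.Builder()
                        .setUsage(AudioAttributes.USAGE_MEDIA)
                        .setContentType(AudioAttributes.CONTENT_TYPE_MUSIC)
                        .build()
                )
                setDataSource("https://example.com/live-stream.mp3") // placeholder
                setOnPreparedListener { it.start() }
                prepareAsync()
            }
        }
        return START_STICKY
    }

    override fun onDestroy() {
        player?.release()
        player = null
        super.onDestroy()
    }

    override fun onBind(intent: Intent?): IBinder? = null
}
```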
I test with the sample in the thread i done before to play the audio. libVLCsharp.forms can not play Android resource video file while iOS can
When i make the app which play music in the background, the music would play as well.

ExoPlayer playing currently recording media files

Let me rephrase my question; I wrote it in a hurry.
Current situation:
I have set up a digital video recorder to record broadcasts provided via DVB-C. It runs on a Raspberry Pi 3B using TVHeadend and jetty/cling to provide UPnP and other ways to access media files. For watching recordings, I wrote an Android player app using IJKPlayer, which runs on smartphones, FireTV and AndroidTV.
One hassle when playing media files that are currently being recorded is that IJKPlayer does not support timeshifting. That means when I start playing a file that is still being recorded, I can only watch the length known to the player at that moment; anything recorded afterwards cannot be played, and I need to exit the player activity and start it again. I have worked around that issue by "simulating" a completed recording with a custom servlet implementation. Since the complete length of the recording is already known, I can use ffmpeg to accomplish this.
Future situation:
I plan to move away from IJKPlayer to ExoPlayer, because it supports hardware playback and is much faster when playing H.264 media. I can of course use the same workaround as above, but as far as I have found out so far, ExoPlayer can support media files that are currently being recorded by using the Timeline class. However, I can find neither useful documentation nor any good example, so I would appreciate any help with the Timeline object.
Regards
Harry
Looks like my approach won't work; at least, I didn't find a solution. The problem is that the server reports the stream size as it was at player start time, and I didn't find a method to update the media duration for "regular" files.
However, I can solve the problem by changing the server side. Instead of serving the regular file, I convert it to m3u8 in real time using ffmpeg. I then hand the m3u8 URI to the player, and it updates the duration of the stream while playing, without the need for any additional code on the client side.
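For reference, the client side of that change is tiny. Once the server exposes the realtime m3u8 playlist, pointing the player at it looks roughly like this (shown with the Media3 flavour of ExoPlayer; the URL is a placeholder, and the media3-exoplayer-hls module has to be on the classpath):

```kotlin
import android.content.Context
import androidx.media3.common.MediaItem
import androidx.media3.common.MimeTypes
import androidx.media3.exoplayer.ExoPlayer

// Point the player at the realtime m3u8 playlist produced by ffmpeg on the server.
// ExoPlayer treats the HLS playlist as a live/growing window, so the reported
// duration keeps updating while the recording is still running.
fun playGrowingRecording(context: Context): ExoPlayer {
    val player = ExoPlayer.Builder(context).build()
    val item = MediaItem.Builder()
        .setUri("http://recorder.local:8080/recordings/12345.m3u8") // placeholder
        .setMimeType(MimeTypes.APPLICATION_M3U8) // make the HLS source explicit
        .build()
    player.setMediaItem(item)
    player.prepare()
    player.play()
    return player
}
```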

Android Service to detect video playback on a third party Video Player

I need to develop an Android service (API level > 21) that can detect video playback in a third-party video player, so that it can disable notification pop-ups while a video is playing.
On searching, I found several posts talking about detecting the play-states of videos in a VideoView, such as the following:
How to detect when VideoView starts playing (Android)?
But I couldn't find anything that helps detect video playback in a video player that is installed as a third-party app and is not part of my own application/service.
So, the challenges are:
1) The Android service needs to detect a video playback.
2) Registering package names of video player apps with the service is out of the question, since the service should detect playback even when a new video player is installed from the Play Store.
3) The idea needs to extend to online video streaming as well.
Any help in this regard would be really appreciated.
Thanks in advance.
There is no way to implement 100% reliable video playback detection :(
Just an idea: try listening to audio focus changes... this might help with detecting video playback, assuming the target player respects the audio output stream usage policy recommended by Google (a rough sketch of that idea follows below). You might also want to get notified about system UI visibility changes, to detect fullscreen playback.
Audio focus:
https://developer.android.com/guide/topics/media-apps/volume-and-earphones.html
System UI visibility changes:
https://developer.android.com/reference/android/view/View.OnSystemUiVisibilityChangeListener.html
Also, you are correct that a package-name-based detector is not an option, since detecting the active/running package is no longer possible as of Android 7.
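A rough sketch of the audio-focus heuristic mentioned above, using the pre-O focus API since the question targets API level > 21 (everything here is best-effort guessing, and the class is just an illustration):

```kotlin
import android.content.Context
import android.media.AudioManager

// Heuristic only: hold audio focus ourselves and treat a focus loss as
// "some other app probably started playing media".
class PlaybackHeuristic(context: Context) {

    private val audioManager =
        context.getSystemService(Context.AUDIO_SERVICE) as AudioManager

    private val focusListener = AudioManager.OnAudioFocusChangeListener { change ->
        when (change) {
            AudioManager.AUDIOFOCUS_LOSS,
            AudioManager.AUDIOFOCUS_LOSS_TRANSIENT,
            AudioManager.AUDIOFOCUS_LOSS_TRANSIENT_CAN_DUCK ->
                onPossiblePlaybackStarted() // another app took the music stream
            AudioManager.AUDIOFOCUS_GAIN ->
                onPossiblePlaybackStopped() // focus came back to us
        }
    }

    @Suppress("DEPRECATION") // pre-O API, deprecated from API 26
    fun start() {
        audioManager.requestAudioFocus(
            focusListener,
            AudioManager.STREAM_MUSIC,
            AudioManager.AUDIOFOCUS_GAIN
        )
    }

    @Suppress("DEPRECATION")
    fun stop() {
        audioManager.abandonAudioFocus(focusListener)
    }

    private fun onPossiblePlaybackStarted() { /* suppress notification pop-ups here */ }
    private fun onPossiblePlaybackStopped() { /* re-enable notification pop-ups here */ }
}
```

Note that requesting focus will itself interrupt whatever is currently playing, and a focus loss could equally be caused by audio-only playback, so this is only a coarse signal.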

What are the differences between MediaPlayer, MediaSessionCompat and RemotePlaybackClient

I have been searching for days, but I am still not able to understand how exactly these differ in functionality and what role each of them plays in media playback. It would be very helpful if someone could explain the difference.
how exactly these differ in functionality
That is akin to asking how a shovel, a hammer, and a piece of rope differ in functionality. While all can be considered tools, they are not really replacements for one another in most use cases.
what role does each of them play in Media Playback?
MediaPlayer plays media on the Android device (audio and video, from local or streaming sources).
RemotePlaybackClient directs some other piece of hardware to play media. The classic example of this is using RemotePlaybackClient to tell a Chromecast to play a video.
While I have not dealt with MediaSession (or MediaSessionCompat), it appears to tie your media playback logic to media controllers that live outside your app, such as a Notification.MediaStyle notification (to control media playback from the Android 5.0+ lock screen), Android Auto, etc.
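To make the first of those concrete, a minimal MediaPlayer usage looks roughly like this (the stream URL is a placeholder):

```kotlin
import android.media.AudioAttributes
import android.media.MediaPlayer

// Minimal MediaPlayer usage: play a remote audio stream on the device itself.
fun playOnDevice(): MediaPlayer =
    MediaPlayer().apply {
        setAudioAttributes(
            AudioAttributes.Builder()
                .setUsage(AudioAttributes.USAGE_MEDIA)
                .setContentType(AudioAttributes.CONTENT_TYPE_MUSIC)
                .build()
        )
        setDataSource("https://example.com/track.mp3") // placeholder
        setOnPreparedListener { it.start() } // start once buffered
        prepareAsync() // do not block the main thread
    }
```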

Simple ongoing notification to allow audio to continue playing in the background

I've decided to try to make a little Android application that streams live audio from my server, using an HTML5 audio tag for the audio. But on some devices, when the user switches to the browser or to their messages, the music stops playing. Would I need to create an ongoing notification so the audio would continue to play, or would something else be better? I'm using PhoneGap too; I don't know if there's a plug-in out there that I've missed that could solve this. I'm trying to learn all of this on my own, but I'm just confused at this point.
I think it ended up being a problem with the app manager just closing the app.
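For what it's worth, the native mechanism behind an "ongoing notification that keeps audio alive" is a foreground service: the ongoing notification is what stops the system from killing the process, and a PhoneGap plug-in would have to wrap something like the sketch below (the service name, channel id, text and icon are placeholders):

```kotlin
import android.app.NotificationChannel
import android.app.NotificationManager
import android.app.Service
import android.content.Intent
import android.os.Build
import android.os.IBinder
import androidx.core.app.NotificationCompat

// Promotes a playback service to the foreground with an ongoing notification,
// which is what keeps background audio alive when the user leaves the app.
// From API 28 the FOREGROUND_SERVICE permission (and, on Android 14+, a
// mediaPlayback foregroundServiceType) must also be declared in the manifest.
class StreamingService : Service() {

    override fun onStartCommand(intent: Intent?, flags: Int, startId: Int): Int {
        if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.O) {
            val channel = NotificationChannel(
                CHANNEL_ID, "Playback", NotificationManager.IMPORTANCE_LOW
            )
            getSystemService(NotificationManager::class.java)
                .createNotificationChannel(channel)
        }
        val notification = NotificationCompat.Builder(this, CHANNEL_ID)
            .setContentTitle("Live audio")
            .setContentText("Streaming")
            .setSmallIcon(android.R.drawable.ic_media_play)
            .setOngoing(true) // the "ongoing" part of the question
            .build()
        startForeground(NOTIFICATION_ID, notification)
        // ... start the actual audio playback here (MediaPlayer/ExoPlayer) ...
        return START_STICKY
    }

    override fun onBind(intent: Intent?): IBinder? = null

    companion object {
        private const val CHANNEL_ID = "playback"
        private const val NOTIFICATION_ID = 1
    }
}
```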
