I have an HLS livestream represented by a URI pointing to the manifest file.
The manifest additionally defines the subtitles.
When using ExoPlayer I can attach a TrackSelector to the player, which gives me the option to show the available subtitles to the user (and change them through the TrackSelector).
I want to do the same with just the CastPlayer. Here is the simplest possible Activity I can imagine:
public class PlayerActivity extends AppCompatActivity implements SessionAvailabilityListener {
private MediaRouteButton castButton;
private CastPlayer castPlayer;
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.player_activity);
castButton = findViewById(R.id.castButton);
CastButtonFactory.setUpMediaRouteButton(this, castButton);
CastContext castContext = CastContext.getSharedInstance(this);
castPlayer = new CastPlayer(castContext);
castPlayer.setSessionAvailabilityListener(this);
}
@Override
public void onCastSessionAvailable() {
MediaMetadata movieMetadata = new MediaMetadata(MediaMetadata.MEDIA_TYPE_MOVIE);
MediaInfo mediaInfo = new MediaInfo.Builder("https://myuri.ch/video.m3u8")
.setStreamType(MediaInfo.STREAM_TYPE_BUFFERED)
.setMetadata(movieMetadata)
.build();
MediaQueueItem mediaItem = new MediaQueueItem.Builder(mediaInfo).build();
castPlayer.loadItems(new MediaQueueItem[]{mediaItem}, 0, 0, Player.REPEAT_MODE_OFF);
}
@Override
public void onCastSessionUnavailable() {
}
}
The layout looks like this:
<androidx.mediarouter.app.MediaRouteButton
android:id="@+id/castButton"
android:layout_alignParentEnd="true"
android:layout_alignParentTop="true"
android:layout_width="wrap_content"
android:layout_height="wrap_content">
</androidx.mediarouter.app.MediaRouteButton>
This activity starts up, and when I hit the castButton it streams the movie behind "https://myuri.ch/video.m3u8" to the Cast device. Works perfectly :)
But I can't figure out how to let the user choose between subtitles from my app. How should something like this be implemented?
Points I already found out:
I cannot attach something like a TrackSelector to the CastPlayer
The documentation states that we can provide MediaTracks to the MediaInfo object. But I don't have this info; it is hidden in the m3u8 file.
As additional info, my CastOptions look like this:
public class CastOptionsProvider implements OptionsProvider {
@Override
public CastOptions getCastOptions(Context appContext) {
return new CastOptions.Builder()
.setReceiverApplicationId(CastMediaControlIntent.DEFAULT_MEDIA_RECEIVER_APPLICATION_ID)
.build();
}
@Override
public List<SessionProvider> getAdditionalSessionProviders(Context context) {
return null;
}
}
Here is what I found out:
The TrackSelector cannot be used with the CastPlayer.
All the available subtitles have to be known before you load MediaItems into the CastPlayer
The DefaultMediaReceiver needs a publicly available link to a *.vtt file. Anything else does not work (neither .webvtt nor .m3u8)
The easiest way to update the subtitles on the ExoPlayer seems to be this:
DefaultTrackSelector.ParametersBuilder parametersBuilder = trackSelector.getParameters()
.buildUpon()
.setRendererDisabled(getRendererIndex(C.TRACK_TYPE_TEXT), !hasSubtitles);
if (hasSubtitles) {
parametersBuilder.setPreferredTextLanguage("en");
}
trackSelector.setParameters(parametersBuilder.build());
Where getRendererIndex is this:
public int getRendererIndex(final int trackType) {
for (int t = 0; t < exoPlayer.getRendererCount(); t++) {
if (exoPlayer.getRendererType(t) == trackType) {
return t;
}
}
return -1;
}
To select a subtitle on the CastPlayer you can use the RemoteMediaClient:
remoteMediaClient.setActiveMediaTracks(selectedSubtitleTracks);
So my solution is this:
Before starting any stream (local or Cast) I get all the information I need for the MediaItem (URL, subtitles, ...)
Depending on the availability of a Cast session I load the MediaItems into the CastPlayer or the ExoPlayer
A custom subtitle dialog gives the user the option to select a subtitle
When the user selects a subtitle, I use the methods described above to select it on the correct player.
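The "gather everything up front" step can be sketched like this. The subtitle URL, track id, and language are hypothetical, and this assumes the subtitles are available as a sideloaded, publicly reachable WebVTT file as described above:

```java
// Hypothetical subtitle URL and track id; the CastPlayer is loaded with
// the tracks declared up front, since they cannot be changed afterwards.
MediaTrack subtitleTrack = new MediaTrack.Builder(1 /* track id */, MediaTrack.TYPE_TEXT)
        .setContentId("https://myuri.ch/subtitles_en.vtt")
        .setContentType("text/vtt")
        .setSubtype(MediaTrack.SUBTYPE_SUBTITLES)
        .setLanguage("en")
        .setName("English")
        .build();

MediaInfo mediaInfo = new MediaInfo.Builder("https://myuri.ch/video.m3u8")
        .setStreamType(MediaInfo.STREAM_TYPE_BUFFERED)
        .setMediaTracks(Collections.singletonList(subtitleTrack))
        .build();

// Later, when the user picks this subtitle in the custom dialog:
remoteMediaClient.setActiveMediaTracks(new long[]{1});
```

Passing an empty long[] to setActiveMediaTracks turns subtitles off again.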
I am working on a radio app. I use ExoPlayer as the player in my project and I want to add an audio visualizer to my PlayerActivity, but I couldn't find any tutorial or sample for it. I made a custom RenderersFactory and added it to my player, but I don't know what I should do next. Any help or suggestions? I should mention that my player works fine right now; I just need a solution for adding a visualizer to it.
RenderFactory class:
public class RenderFactory extends DefaultRenderersFactory {
private TeeAudioProcessor.AudioBufferSink listener;
private AudioProcessor[] aMProcessors;
private TeeAudioProcessor teeAudioProcessor;
private AudioRendererEventListener eventListener;
public RenderFactory(Context context, TeeAudioProcessor.AudioBufferSink myListener) {
super(context);
this.listener = myListener;
teeAudioProcessor = new TeeAudioProcessor(this.listener);
}
@Override
protected void buildAudioRenderers(Context context, int extensionRendererMode, MediaCodecSelector mediaCodecSelector, @Nullable DrmSessionManager<FrameworkMediaCrypto> drmSessionManager, boolean playClearSamplesWithoutKeys, boolean enableDecoderFallback, AudioProcessor[] audioProcessors, Handler eventHandler, AudioRendererEventListener eventListener, ArrayList<Renderer> out) {
aMProcessors = new AudioProcessor[]{teeAudioProcessor};
super.buildAudioRenderers(context, extensionRendererMode, mediaCodecSelector, drmSessionManager, playClearSamplesWithoutKeys, enableDecoderFallback, aMProcessors, eventHandler, eventListener, out);
}
}
In my PlayerActivity I added this code and set the RenderFactory on my player, but nothing happened.
RenderFactory renderFactory = new RenderFactory(this, new TeeAudioProcessor.AudioBufferSink() {
@Override
public void flush(int sampleRateHz, int channelCount, int encoding) {
// what should I add here?
}
@Override
public void handleBuffer(ByteBuffer buffer) {
// what should I add here?
}
});
I tried to follow this tutorial but wasn't successful.
To directly answer your question:
@Override
public void flush(int sampleRateHz, int channelCount, int encoding) {
// you don't have to do anything here
}
@Override
public void handleBuffer(ByteBuffer buffer) {
// This gives you the bytes of the audio that is about to be played.
// Here you apply an FFT so the audio moves to the frequency domain instead of the time domain.
}
You can learn more about the FFT here.
The FFT is a well-known algorithm, so you will find it implemented on plenty of sites. You can use this library if you want, but there are standalone implementations around as well.
Once you have the FFT array, you can draw the values in a view.
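To make the handleBuffer step concrete, here is a minimal, illustrative sketch (my own helper, not ExoPlayer API, and the class name is hypothetical). It converts 16-bit little-endian PCM bytes to float samples and computes a magnitude spectrum with a naive DFT; for real-time drawing you would swap the O(n²) loop for a proper FFT library:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class SpectrumHelper {

    // Convert 16-bit little-endian PCM bytes (as delivered to handleBuffer)
    // into float samples in the range [-1, 1].
    public static float[] toSamples(ByteBuffer buffer) {
        buffer.order(ByteOrder.LITTLE_ENDIAN);
        float[] samples = new float[buffer.remaining() / 2];
        for (int i = 0; i < samples.length; i++) {
            samples[i] = buffer.getShort() / 32768f;
        }
        return samples;
    }

    // Naive DFT magnitude spectrum: bin k holds the strength of frequency
    // k * sampleRate / n. Fine for small windows; use a real FFT in production.
    public static float[] magnitudes(float[] x) {
        int n = x.length;
        float[] mags = new float[n / 2];
        for (int k = 0; k < n / 2; k++) {
            double re = 0, im = 0;
            for (int t = 0; t < n; t++) {
                double angle = 2 * Math.PI * k * t / n;
                re += x[t] * Math.cos(angle);
                im -= x[t] * Math.sin(angle);
            }
            mags[k] = (float) Math.hypot(re, im);
        }
        return mags;
    }
}
```

A pure sine wave at bin k produces a single sharp peak at index k of the returned array; those magnitudes are what you hand to your visualizer view.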
You can use this guy's code as an example: https://github.com/dzolnai/ExoVisualizer
There's also a blog post he made about it: https://www.egeniq.com/blog/alternative-android-visualizer
I cannot figure out how to get a Mapbox map going in a custom view renderer on Android using Xamarin.Forms. It's driving me bonkers.
In my PCL, I have a map view.
public class MapView: View
{
public MapView() { }
}
For iOS, the "getting started" help was close enough to get it working, like so:
[assembly: ExportRenderer (typeof(Shared.Mobile.MapView), typeof(MapViewRenderer))]
namespace Clients.iOS
{
public class MapViewRenderer : ViewRenderer<Shared.Mobile.MapView, UIView>
{
protected override void OnElementChanged(ElementChangedEventArgs<Shared.Mobile.MapView> e)
{
base.OnElementChanged(e);
if (e.NewElement == null)
{
return;
}
var uiView = new UIView(new CoreGraphics.CGRect(0, 0, 500, 700));
SetNativeControl(uiView);
var mapView = new MapView(Control.Bounds);
mapView.SetCenterCoordinate(new CoreLocation.CLLocationCoordinate2D(40.81, -96.68), false);
mapView.SetZoomLevel(11, false);
mapView.AddAnnotation(new PointAnnotation
{
Coordinate = new CoreLocation.CLLocationCoordinate2D(40.81, -96.68),
Title = "Lincoln, NE",
Subtitle = "What-what"
});
uiView.AddSubview(mapView);
}
}
}
On the Android side, not so much (https://components.xamarin.com/gettingstarted/mapboxsdk). They're putting in XML and an Activity of sorts, but my mobile knowledge doesn't extend far beyond Xamarin.Forms at the moment, so I can't seem to bridge the gap between the two. My Android renderer looks like this:
public class MapViewRenderer : ViewRenderer<Shared.Mobile.MapView, Android.Views.View>
{
protected override void OnElementChanged(ElementChangedEventArgs<Shared.Mobile.MapView> e)
{
base.OnElementChanged(e);
var view = new Android.Views.View(Context);
SetNativeControl(view); // NullReferenceException will be thrown if the native control is not set
if (Control == null)
{
return;
}
var mapView = new MapView(Forms.Context, "thisismyaccesscode");
mapView.CenterCoordinate = new LatLng(41.885, -87.679);
mapView.ZoomLevel = 11;
mapView.SetMinimumHeight(250);
mapView.SetMinimumWidth(250);
mapView.AddMarker(new MarkerOptions().SetPosition(new LatLng(40.81, -96.68)).SetTitle("Lincoln, NE"));
view.AddSubview(mapView); // I wish this method existed
}
}
My final call to AddSubview(mapView) is not in fact a method of the Android View class as it is of UIView on iOS. Here's where I'm stuck: I cannot figure out how to display the MapView. Please help.
As you have already mentioned, you can't call AddSubview, as it is an iOS method.
On Android the equivalent is AddView.
However, you are attempting to do this on an Android View object and not on a ViewGroup, so it's not possible.
First, instead of doing:
var view = new Android.Views.View(Context);
try instantiating the MapView directly, like so:
var view = new MapView(Context, "thisismyaccesscode");
Your SetNativeControl call on the view is fine.
I haven't tried the Mapbox component, so I'm unclear on the exact parameter types it expects in the code above.
Should that not work, however, do something like the following:
var view = new Android.Widget.FrameLayout(Context);
var mapView = new MapView(Forms.Context, "thisismyaccesscode");
mapView.CenterCoordinate = new LatLng(41.885, -87.679);
mapView.ZoomLevel = 11;
mapView.SetMinimumHeight(250);
mapView.SetMinimumWidth(250);
mapView.AddMarker(new MarkerOptions().SetPosition(new LatLng(40.81, -96.68)).SetTitle("Lincoln, NE"));
view.AddView(mapView);
SetNativeControl(view);
You will also have to change your first line to the following:
public class MapViewRenderer : ViewRenderer<Shared.Mobile.MapView, Android.Widget.FrameLayout>
I have been searching today for how to make objects Parcelable so I can share them between activities through an Intent. All the examples I found use custom objects with basic data like int/String/ArrayList etc. Is there a way to make a MediaPlayer object Parcelable? So far I have this:
public class mpParcelable implements Parcelable {
private MediaPlayer mp;
public int describeContents() {
return 0;
}
public void writeToParcel(Parcel out, int flags) {
out.writeValue(mp);
}
public static final Parcelable.Creator<mpParcelable> CREATOR
= new Parcelable.Creator<mpParcelable>() {
public mpParcelable createFromParcel(Parcel in) {
return new mpParcelable(in);
}
public mpParcelable[] newArray(int size) {
return new mpParcelable[size];
}
};
private mpParcelable(Parcel in) {
mp = in.readValue();
}
}
The part I don't know what to do with is the setter, if I can call it that: at mp = in.readValue(); I don't know how to read back the MediaPlayer object.
Is there a way to make a parcelable of a MediaPlayer object ?
No, because that is not your class. Occasionally you can create a Parcelable wrapper around some other class, but I do not see how that would work with a MediaPlayer.
IMHO, what you want (sharing a MediaPlayer between components) is a code smell. Either each component should have its own MediaPlayer, or the MediaPlayer should be centrally managed (e.g., via a service).
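A minimal sketch of the "centrally managed" approach this answer suggests, assuming a bound Service (the class name and play method are hypothetical, not from the question): each Activity binds to the service and drives the single MediaPlayer through the Binder, instead of trying to parcel the player across Intents:

```java
// Hypothetical service owning the app's single MediaPlayer.
public class PlaybackService extends Service {

    private final IBinder binder = new LocalBinder();
    private MediaPlayer mediaPlayer;

    public class LocalBinder extends Binder {
        public PlaybackService getService() {
            return PlaybackService.this;
        }
    }

    @Override
    public IBinder onBind(Intent intent) {
        return binder;
    }

    // Activities call this instead of owning a MediaPlayer themselves.
    public void play(Uri uri) {
        if (mediaPlayer != null) {
            mediaPlayer.release();
        }
        mediaPlayer = MediaPlayer.create(this, uri);
        mediaPlayer.start();
    }

    @Override
    public void onDestroy() {
        if (mediaPlayer != null) {
            mediaPlayer.release();
            mediaPlayer = null;
        }
        super.onDestroy();
    }
}
```

Each Activity binds with bindService(...) and calls getService().play(uri); the MediaPlayer itself never leaves the service process, so nothing needs to be Parcelable.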
I have a VideoView instance and I need to get the video source path from it.
Is that possible? Can anybody help me?
My code from WebChromeClient class is:
@Override
public void onShowCustomView(final View view, final CustomViewCallback callback) {
super.onShowCustomView(view, callback);
if (view instanceof FrameLayout) {
final FrameLayout frame = (FrameLayout) view;
if (frame.getFocusedChild() instanceof VideoView) {
// get video view
video = (VideoView) frame.getFocusedChild();
}
}
}
How can I get the video source path from the video object?
VideoView doesn't have a getter for the video path/Uri. Your only chance is to use reflection. The Uri is stored in the private field mUri. To access it you can use:
Uri mUri = null;
try {
Field mUriField = VideoView.class.getDeclaredField("mUri");
mUriField.setAccessible(true);
mUri = (Uri)mUriField.get(video);
} catch(Exception e) {}
Just bear in mind that a private field might be subject to change in future Android releases.
If you do not like relying on private fields like this, you can override the setVideoURI method in a VideoView subclass:
public class MyVideoView extends VideoView
{
Uri uri;
@Override
public void setVideoURI (Uri uri)
{
super.setVideoURI(uri);
this.uri = uri;
}
}
Now you can access the URI of the VideoView as needed. Hope that helps.
Another alternative is to set the video Uri/path as the view's tag and retrieve it later.
When you play/start
videoView.setVideoPath(localPath);
videoView.setTag(localPath);
When you want to check what's playing
String pathOfCurrentVideoPlaying = (String)videoView.getTag();
Just remember to clear the tag if you use this in an adapter.
As per other questions (android-video-as-a-live-wallpaper), is decoding the video yourself the only way to play a video in a live wallpaper?
Just use MediaPlayer instead of VideoView, and use MediaPlayer.setSurface instead of MediaPlayer.setDisplay. If you use setDisplay, the MediaPlayer tries to tell the SurfaceHolder to keep the screen on, which isn't allowed for live wallpapers and will throw an error.
I use WebM/VP8 video, but this should work with whatever MediaPlayer supports (just put the video file in res/raw).
package com.justinbuser.nativecore;
import android.media.MediaPlayer;
import android.service.wallpaper.WallpaperService;
import android.view.SurfaceHolder;
import com.justinbuser.android.Log;
public class VideoWallpaperService extends WallpaperService
{
protected static int playheadTime = 0;
@Override
public Engine onCreateEngine()
{
return new VideoEngine();
}
class VideoEngine extends Engine
{
private final String TAG = getClass().getSimpleName();
private final MediaPlayer mediaPlayer;
public VideoEngine()
{
super();
Log.i( TAG, "( VideoEngine )");
mediaPlayer = MediaPlayer.create(getBaseContext(), R.raw.wallpapervideo);
mediaPlayer.setLooping(true);
}
@Override
public void onSurfaceCreated( SurfaceHolder holder )
{
Log.i( TAG, "onSurfaceCreated" );
mediaPlayer.setSurface(holder.getSurface());
mediaPlayer.start();
}
@Override
public void onSurfaceDestroyed( SurfaceHolder holder )
{
Log.i( TAG, "( INativeWallpaperEngine ): onSurfaceDestroyed" );
playheadTime = mediaPlayer.getCurrentPosition();
mediaPlayer.reset();
mediaPlayer.release();
}
}
}
The short answer is yes. The long answer is http://ikaruga2.wordpress.com/2011/06/15/video-live-wallpaper-part-1/
Just to think outside the box: is it possible to take a working video player and re-parent it under a Java window in Android? I have not done this on Linux or Android, but on Windows it is possible to get the window handle of a running application and make it a child of a Java frame, with the result that the other application's window looks like part of your Java application.
I have tried Justin Buser's solution and it does not work (tested on an API 16 device). I have also found similar code at https://github.com/thorikawa/AndroidExample/tree/master/MovieLiveWallpaper/; it does not work either.
The only solution seems to be to use FFmpeg with the NDK, e.g.: https://github.com/frankandrobot