I have a VideoView instance and I need to get the video source path from it. Is that possible? Can anybody help me?
My code from WebChromeClient class is:
@Override
public void onShowCustomView(final View view, final CustomViewCallback callback) {
super.onShowCustomView(view, callback);
if (view instanceof FrameLayout) {
final FrameLayout frame = (FrameLayout) view;
if (frame.getFocusedChild() instanceof VideoView) {
// get video view
video = (VideoView) frame.getFocusedChild();
}
}
}
How can I get the video source path from the video object?
VideoView doesn't have getters for the video path/Uri. Your only chance is to use reflection: the Uri is stored in a private field, private Uri mUri. To access it you can use:
Uri mUri = null;
try {
    Field mUriField = VideoView.class.getDeclaredField("mUri");
    mUriField.setAccessible(true);
    mUri = (Uri) mUriField.get(video);
} catch (Exception e) {
    // field not found or inaccessible; mUri stays null
}
Just bear in mind that a private field might be subject to change in future Android releases.
If you'd rather not rely on private fields like this, you can override the setVideoURI method in a VideoView subclass:
public class MyVideoView extends VideoView
{
    Uri uri;

    @Override
    public void setVideoURI(Uri uri)
    {
        super.setVideoURI(uri);
        this.uri = uri;
    }
}
Now you can read the uri from your MyVideoView whenever you need it. Hope that helps.
Another alternative is to set the video Uri/path as the view's tag and retrieve it later.
When you play/start
videoView.setVideoPath(localPath);
videoView.setTag(localPath);
When you want to check what's playing
String pathOfCurrentVideoPlaying = (String)videoView.getTag();
Just remember to clear out the tag if you use this in an adapter.
I am working on a radio app. I use ExoPlayer as the player in my project and I want to add an audio visualizer to my PlayerActivity, but I couldn't find any tutorial or sample for it. I made a custom RenderersFactory and set it on my player, but I don't know what to do next. Any help or suggestions? I should mention that my player works fine right now; I just need a way to add a visualizer to it.
RenderFactory class:
public class RenderFactory extends DefaultRenderersFactory {
    private TeeAudioProcessor.AudioBufferSink listener;
    private AudioProcessor[] aMProcessors;
    private TeeAudioProcessor teeAudioProcessor;
    private AudioRendererEventListener eventListener;

    public RenderFactory(Context context, TeeAudioProcessor.AudioBufferSink myListener) {
        super(context);
        this.listener = myListener;
        teeAudioProcessor = new TeeAudioProcessor(this.listener);
    }

    @Override
    protected void buildAudioRenderers(Context context, int extensionRendererMode, MediaCodecSelector mediaCodecSelector, @Nullable DrmSessionManager<FrameworkMediaCrypto> drmSessionManager, boolean playClearSamplesWithoutKeys, boolean enableDecoderFallback, AudioProcessor[] audioProcessors, Handler eventHandler, AudioRendererEventListener eventListener, ArrayList<Renderer> out) {
        aMProcessors = new AudioProcessor[]{teeAudioProcessor};
        super.buildAudioRenderers(context, extensionRendererMode, mediaCodecSelector, drmSessionManager, playClearSamplesWithoutKeys, enableDecoderFallback, aMProcessors, eventHandler, eventListener, out);
    }
}
In my PlayerActivity I added this code and set the RenderFactory on my player, but nothing happened.
RenderFactory renderFactory = new RenderFactory(this, new TeeAudioProcessor.AudioBufferSink() {
    @Override
    public void flush(int sampleRateHz, int channelCount, int encoding) {
        // what should I add here?
    }

    @Override
    public void handleBuffer(ByteBuffer buffer) {
        // what should I add here?
    }
});
I tried to follow this tutorial but I wasn't successful.
tutorial
To directly answer your question:
@Override
public void flush(int sampleRateHz, int channelCount, int encoding) {
    // you don't have to do anything here
}

@Override
public void handleBuffer(ByteBuffer buffer) {
    // This gives you the bytes of the audio that is about to be played.
    // Apply an FFT here so the audio moves to the frequency domain instead of the time domain.
}
You can learn more about FFT here
FFT is a well-known algorithm, so you will find it implemented on plenty of sites. You can use this library if you want, but there are standalone implementations around.
Once you have the FFT array, you can draw the values in a view.
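To make the frequency-domain step concrete, here is a minimal sketch (the class name SpectrumHelper is my own, and the naive O(n²) DFT is for illustration only; a real visualizer should use a proper FFT library). It converts the 16-bit little-endian PCM bytes delivered to handleBuffer into normalized samples and computes one magnitude per frequency bin, which you can draw as bars:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public final class SpectrumHelper {

    /** Convert 16-bit little-endian PCM bytes into samples in [-1, 1]. */
    public static double[] toSamples(ByteBuffer buffer) {
        ByteBuffer b = buffer.duplicate().order(ByteOrder.LITTLE_ENDIAN);
        double[] out = new double[b.remaining() / 2];
        for (int i = 0; i < out.length; i++) {
            out[i] = b.getShort() / 32768.0;
        }
        return out;
    }

    /** Naive DFT: magnitude of each frequency bin up to the Nyquist bin. */
    public static double[] magnitudes(double[] samples) {
        int n = samples.length;
        double[] mags = new double[n / 2];
        for (int k = 0; k < n / 2; k++) {
            double re = 0, im = 0;
            for (int t = 0; t < n; t++) {
                double angle = 2.0 * Math.PI * k * t / n;
                re += samples[t] * Math.cos(angle);
                im -= samples[t] * Math.sin(angle);
            }
            mags[k] = Math.sqrt(re * re + im * im);
        }
        return mags;
    }
}
```

Inside handleBuffer you would call toSamples(buffer), then magnitudes(...), and hand the result to a custom View for drawing (posted to the main thread). Bin k corresponds to the frequency k * sampleRateHz / n.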
You can use this guy's code as an example: https://github.com/dzolnai/ExoVisualizer
There's also a blog post he made about it: https://www.egeniq.com/blog/alternative-android-visualizer
I have an HLS livestream, represented by a URI pointing to the manifest file.
The manifest file does additionally define the subtitles.
When using ExoPlayer I can create a TrackSelector, attach it to the player, and thereby show the available subtitles to the user (and change them through the TrackSelector).
I want to do the same with just the CastPlayer. Here the simplest possible Activity I can imagine:
public class PlayerActivity extends AppCompatActivity implements SessionAvailabilityListener {
    private MediaRouteButton castButton;
    private CastPlayer castPlayer;

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.player_activity);
        castButton = findViewById(R.id.castButton);
        CastButtonFactory.setUpMediaRouteButton(this, castButton);
        CastContext castContext = CastContext.getSharedInstance(this);
        castPlayer = new CastPlayer(castContext);
        castPlayer.setSessionAvailabilityListener(this);
    }

    @Override
    public void onCastSessionAvailable() {
        MediaMetadata movieMetadata = new MediaMetadata(MediaMetadata.MEDIA_TYPE_MOVIE);
        MediaInfo mediaInfo = new MediaInfo.Builder("https://myuri.ch/video.m3u8")
                .setStreamType(MediaInfo.STREAM_TYPE_BUFFERED)
                .setMetadata(movieMetadata)
                .build();
        MediaQueueItem mediaItem = new MediaQueueItem.Builder(mediaInfo).build();
        castPlayer.loadItems(new MediaQueueItem[]{mediaItem}, 0, 0, Player.REPEAT_MODE_OFF);
    }

    @Override
    public void onCastSessionUnavailable() {
    }
}
The layout looks like this:
<androidx.mediarouter.app.MediaRouteButton
    android:id="@+id/castButton"
    android:layout_alignParentEnd="true"
    android:layout_alignParentTop="true"
    android:layout_width="wrap_content"
    android:layout_height="wrap_content">
</androidx.mediarouter.app.MediaRouteButton>
This activity starts up and when I hit the castButton it streams the movie behind "https://myuri.ch/video.m3u8" to the cast-device. Works perfectly :)
But I can't figure out how to let the user choose between subtitles from within my app. How are we supposed to implement something like this?
Points I already found out:
I cannot attach something like a TrackSelector to the CastPlayer
The documentation states that we can provide MediaTracks to the MediaInfo object. But I don't have this info; it is hidden in the m3u8 file.
As additional info, my CastOptions look like this:
public class CastOptionsProvider implements OptionsProvider {
    @Override
    public CastOptions getCastOptions(Context appContext) {
        return new CastOptions.Builder()
                .setReceiverApplicationId(CastMediaControlIntent.DEFAULT_MEDIA_RECEIVER_APPLICATION_ID)
                .build();
    }

    @Override
    public List<SessionProvider> getAdditionalSessionProviders(Context context) {
        return null;
    }
}
Here is what I found out:
The TrackSelector cannot be used for the CastPlayer.
All the available subtitles have to be known before you load MediaQueueItems into the CastPlayer.
The DefaultMediaReceiver needs a publicly available link to a *.vtt file. Anything else does not work (neither .webvtt nor .m3u8).
The easiest way to update the subtitles on the ExoPlayer seems to be this:
DefaultTrackSelector.ParametersBuilder parametersBuilder = trackSelector.getParameters()
        .buildUpon()
        .setRendererDisabled(getRendererIndex(C.TRACK_TYPE_TEXT), !hasSubtitles);
if (hasSubtitles) {
    parametersBuilder.setPreferredTextLanguage("en");
}
// don't forget to apply the parameters, otherwise nothing changes
trackSelector.setParameters(parametersBuilder.build());
Where getRendererIndex is this:
public int getRendererIndex(final int trackType) {
    for (int t = 0; t < exoPlayer.getRendererCount(); t++) {
        if (exoPlayer.getRendererType(t) == trackType) {
            return t;
        }
    }
    return -1;
}
To select a subtitle on the CastPlayer you can use the RemoteMediaClient:
remoteMediaClient.setActiveMediaTracks(selectedSubtitleTracks);
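setActiveMediaTracks expects the numeric ids of the MediaTracks you attached to the MediaInfo. As a sketch of that selection step (TrackInfo here is a hypothetical stand-in for the Cast SDK's MediaTrack, so this can run outside Android), you could collect the ids of the text tracks matching the language the user picked and pass the resulting array to remoteMediaClient.setActiveMediaTracks(...):

```java
import java.util.ArrayList;
import java.util.List;

public final class SubtitleSelection {

    /** Hypothetical stand-in for com.google.android.gms.cast.MediaTrack. */
    public static final class TrackInfo {
        final long id;
        final String type;     // e.g. "text", "audio", "video"
        final String language; // e.g. "en", "de"

        public TrackInfo(long id, String type, String language) {
            this.id = id;
            this.type = type;
            this.language = language;
        }
    }

    /** Ids of all text tracks in the requested language; an empty array disables subtitles. */
    public static long[] selectTextTracks(List<TrackInfo> tracks, String language) {
        List<Long> ids = new ArrayList<>();
        for (TrackInfo t : tracks) {
            if ("text".equals(t.type) && t.language.equals(language)) {
                ids.add(t.id);
            }
        }
        long[] result = new long[ids.size()];
        for (int i = 0; i < result.length; i++) {
            result[i] = ids.get(i);
        }
        return result;
    }
}
```

Passing an empty long[] to setActiveMediaTracks turns subtitles off on the receiver.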
So my solution is this:
Before starting any stream (local or cast) get all the information I need for the MediaItem (url, subtitles, ...)
Depending on the availability of a Cast-Session I load the MediaItems to the CastPlayer or the ExoPlayer
A custom subtitle dialog gives the user the option to select a subtitle
When the user selects a subtitle, I use the methods described above to select the subtitle on the correct player.
I need to pick multiple videos from the gallery, and I've implemented a custom gallery starting from https://github.com/luminousman/MultipleImagePick.
When I load thumbnails in the adapter, it throws an OutOfMemoryError after I start the gallery Activity twice in a row.
The code in the adapter:
VideoThumbnailImageLoader thumb = new VideoThumbnailImageLoader(
        thumbPath,
        MediaStore.Video.Thumbnails.MICRO_KIND);
holder.imgQueue.setImage(thumb, R.drawable.no_media);
and the VideoThumbnailImageLoader code:
public class VideoThumbnailImageLoader implements SmartImage {
    private String videoPath;
    private int thumbnailKind;

    public VideoThumbnailImageLoader(String videoPath, int thumbnailKind) {
        this.videoPath = videoPath;
        this.thumbnailKind = thumbnailKind;
    }

    @Override
    public Bitmap getBitmap(Context ctxt) {
        // use the kind passed to the constructor instead of hard-coding MICRO_KIND
        return ThumbnailUtils.createVideoThumbnail(videoPath, thumbnailKind);
    }
}
I'm using http://loopj.com/android-smart-image-view/ to load video thumbnail.
How can I avoid that?
Set android:largeHeap="true" on the <application> element in your AndroidManifest.xml.
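largeHeap only raises the ceiling, though; the more durable fix is to stop re-decoding the same thumbnails. Android ships android.util.LruCache for exactly this, and the idea can be sketched in plain Java with a LinkedHashMap (the class name and capacity here are illustrative, not from the original answer):

```java
import java.util.LinkedHashMap;
import java.util.Map;

/** Minimal LRU cache: evicts the least-recently-used entry once capacity is exceeded. */
public class ThumbnailCache<K, V> extends LinkedHashMap<K, V> {
    private final int capacity;

    public ThumbnailCache(int capacity) {
        super(16, 0.75f, true); // accessOrder = true gives LRU ordering
        this.capacity = capacity;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > capacity;
    }
}
```

In the adapter you would key the cache on the video path and only call ThumbnailUtils.createVideoThumbnail on a cache miss; on Android, android.util.LruCache sized by bitmap byte count is the drop-in equivalent.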
So, I'm making an app with Scrollable Tabs + Swipe navigation.
In every tab's page I want to play a different audio file.
Below is my fragment's onCreateView, which initializes the MediaPlayer and an AssetFileDescriptor and plays an audio file named a.mp3 from the assets folder.
@Override
public View onCreateView(LayoutInflater inflater, ViewGroup container,
        Bundle savedInstanceState) {
    View rootView = inflater.inflate(R.layout.fragment_main_dummy,
            container, false);

    // Playing sound from here on
    AssetFileDescriptor fda;
    MediaPlayer amp = new MediaPlayer();
    try {
        fda = getAssets().openFd("a.mp3"); // GIVES ERROR !
        amp.reset();
        // pass the offset and length too, otherwise compressed assets may not play correctly
        amp.setDataSource(fda.getFileDescriptor(), fda.getStartOffset(), fda.getLength());
        amp.prepare();
        amp.start();
    } catch (IllegalArgumentException e) {
        e.printStackTrace();
    } catch (IllegalStateException e) {
        e.printStackTrace();
    } catch (IOException e) {
        e.printStackTrace();
    }
    return rootView;
}
The getAssets() call gives the following error:
Cannot make a static reference to the non-static method getAssets() from the type ContextWrapper
This same piece of code, from declaring the FileDescriptor to the final catch statement, works perfectly in a normal blank activity's onCreate; it's just not working here.
Any solutions for this ?
Can I make the getAssets() method static somehow ?
Any other way to access the audio file from the Fragment ?
(Remember, my goal is to play a different audio file in each of the different tab's screens. I'll add more audio files later, Just trying to get at least this one to work first.)
Please help :)
Thank you !
You need to use a Context object, so in this example you could use:
rootView.getContext().getAssets().openFd("a.mp3");
That said, I would suggest moving this code later in the Fragment Lifecycle in onActivityCreated or onStart after the view hierarchy has been instantiated. Putting this code in onCreateView could delay/slow down showing the UI to the user.
From those later lifecycle methods, you can safely call:
getResources().getAssets().openFd("a.mp3");
Just replace getAssets() with context.getAssets() :)
You can define the following two methods in a utility class; they return the AssetManager for you:
public static AssetManager getMyAssets(Context context) {
    return context.getResources().getAssets();
}

public static AssetManager getMyAssets(View view) {
    return view.getContext().getResources().getAssets();
}
Now you can use them like this:
fda = myUtil.getMyAssets(rootView).openFd("a.mp3");
OR
fda = myUtil.getMyAssets(rootView.getContext()).openFd("a.mp3");
As per other questions (android-video-as-a-live-wallpaper): is decoding the video yourself really the only way to play a video in a live wallpaper?
Just use MediaPlayer instead of VideoView, and use MediaPlayer.setSurface instead of MediaPlayer.setDisplay. If you use setDisplay, the MediaPlayer tries to tell the SurfaceHolder to keep the screen on, which isn't allowed for live wallpapers and will throw an error.
I use WebM/VP8 video, but this should work with whatever MediaPlayer supports (just put the video file in res/raw).
package com.justinbuser.nativecore;

import android.media.MediaPlayer;
import android.service.wallpaper.WallpaperService;
import android.view.SurfaceHolder;

import com.justinbuser.android.Log;

public class VideoWallpaperService extends WallpaperService
{
    protected static int playheadTime = 0;

    @Override
    public Engine onCreateEngine()
    {
        return new VideoEngine();
    }

    class VideoEngine extends Engine
    {
        private final String TAG = getClass().getSimpleName();
        private final MediaPlayer mediaPlayer;

        public VideoEngine()
        {
            super();
            Log.i(TAG, "( VideoEngine )");
            mediaPlayer = MediaPlayer.create(getBaseContext(), R.raw.wallpapervideo);
            mediaPlayer.setLooping(true);
        }

        @Override
        public void onSurfaceCreated(SurfaceHolder holder)
        {
            Log.i(TAG, "onSurfaceCreated");
            mediaPlayer.setSurface(holder.getSurface());
            mediaPlayer.start();
        }

        @Override
        public void onSurfaceDestroyed(SurfaceHolder holder)
        {
            Log.i(TAG, "( INativeWallpaperEngine ): onSurfaceDestroyed");
            playheadTime = mediaPlayer.getCurrentPosition();
            mediaPlayer.reset();
            mediaPlayer.release();
        }
    }
}
Short answer is yes. Long answer is http://ikaruga2.wordpress.com/2011/06/15/video-live-wallpaper-part-1/
Just to think outside the box, is it possible to take a working video player and re-parent it under a java window in Android? I have not done this in Linux or Android, but under Windows it is possible to get the window handle of a running application and make it a child of a Java frame, with the result that the other application's window looks like its part of your Java application.
I have tried the Justin Buser solution and it does not work (tested on an API 16 device). I also found similar code at https://github.com/thorikawa/AndroidExample/tree/master/MovieLiveWallpaper/; it does not work either.
The only solution seems to be to use FFmpeg with the NDK, e.g. https://github.com/frankandrobot