I am working on a radio app. I use ExoPlayer as the player in my project and I want to add an audio visualizer to my PlayerActivity, but I couldn't find any tutorial or sample for it. I made a custom RenderersFactory and added it to my player, but I don't know what I should do next. Any help or suggestions? I should mention that my player works fine right now; I just need a way to add a visualizer to it.
RenderFactory class:
public class RenderFactory extends DefaultRenderersFactory {

    private TeeAudioProcessor.AudioBufferSink listener;
    private AudioProcessor[] aMProcessors;
    private TeeAudioProcessor teeAudioProcessor;
    private AudioRendererEventListener eventListener;

    public RenderFactory(Context context, TeeAudioProcessor.AudioBufferSink myListener) {
        super(context);
        this.listener = myListener;
        teeAudioProcessor = new TeeAudioProcessor(this.listener);
    }

    @Override
    protected void buildAudioRenderers(Context context, int extensionRendererMode, MediaCodecSelector mediaCodecSelector, @Nullable DrmSessionManager<FrameworkMediaCrypto> drmSessionManager, boolean playClearSamplesWithoutKeys, boolean enableDecoderFallback, AudioProcessor[] audioProcessors, Handler eventHandler, AudioRendererEventListener eventListener, ArrayList<Renderer> out) {
        aMProcessors = new AudioProcessor[]{teeAudioProcessor};
        super.buildAudioRenderers(context, extensionRendererMode, mediaCodecSelector, drmSessionManager, playClearSamplesWithoutKeys, enableDecoderFallback, aMProcessors, eventHandler, eventListener, out);
    }
}
In my PlayerActivity I added this code and set the RenderFactory on my player, but nothing happened:
RenderFactory renderFactory = new RenderFactory(this, new TeeAudioProcessor.AudioBufferSink() {
    @Override
    public void flush(int sampleRateHz, int channelCount, int encoding) {
        // what should I add here?
    }

    @Override
    public void handleBuffer(ByteBuffer buffer) {
        // what should I add here?
    }
});
I tried to follow this tutorial, but I wasn't successful.
To directly answer your question:
@Override
public void flush(int sampleRateHz, int channelCount, int encoding) {
    // you don't have to do anything here
}

@Override
public void handleBuffer(ByteBuffer buffer) {
    // This gives you the bytes of the audio that is about to be played.
    // Here you apply an FFT so the audio moves to the frequency domain instead of the time domain.
}
You can learn more about FFT here
FFT is a well-known algorithm, so you can find it implemented on plenty of sites. You can use this library if you want, but there are also standalone implementations around.
Once you have the FFT array, you can draw the values in a view.
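As a rough sketch of what handleBuffer could look like (this is not the ExoVisualizer code; computeFft and visualizerView are placeholders for whichever FFT implementation and custom View you use, and it assumes flush() reported 16-bit PCM):

@Override
public void handleBuffer(ByteBuffer buffer) {
    // 16-bit PCM samples arrive in little-endian byte order.
    buffer.order(ByteOrder.LITTLE_ENDIAN);
    int sampleCount = buffer.remaining() / 2;
    float[] samples = new float[sampleCount];
    for (int i = 0; i < sampleCount; i++) {
        // Normalize each 16-bit sample to the range -1..1.
        samples[i] = buffer.getShort() / 32768f;
    }
    float[] magnitudes = computeFft(samples); // placeholder: your FFT of choice
    visualizerView.setData(magnitudes);       // placeholder: a custom View that draws the bars
}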
You can use this guy's code as an example: https://github.com/dzolnai/ExoVisualizer
There's also a blog post he made about it: https://www.egeniq.com/blog/alternative-android-visualizer
Related
I have an HLS livestream which is represented by a URI pointing to the manifest file.
The manifest file additionally defines the subtitles.
When using the ExoPlayer I can use a TrackSelector, attach it to the ExoPlayer, and thereby show the available subtitles to the user (and change them through the TrackSelector).
I want to do the same with just the CastPlayer. Here is the simplest possible Activity I can imagine:
public class PlayerActivity extends AppCompatActivity implements SessionAvailabilityListener {

    private MediaRouteButton castButton;
    private CastPlayer castPlayer;

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.player_activity);

        castButton = findViewById(R.id.castButton);
        CastButtonFactory.setUpMediaRouteButton(this, castButton);

        CastContext castContext = CastContext.getSharedInstance(this);
        castPlayer = new CastPlayer(castContext);
        castPlayer.setSessionAvailabilityListener(this);
    }

    @Override
    public void onCastSessionAvailable() {
        MediaMetadata movieMetadata = new MediaMetadata(MediaMetadata.MEDIA_TYPE_MOVIE);
        MediaInfo mediaInfo = new MediaInfo.Builder("https://myuri.ch/video.m3u8")
                .setStreamType(MediaInfo.STREAM_TYPE_BUFFERED)
                .setMetadata(movieMetadata)
                .build();
        MediaQueueItem mediaItem = new MediaQueueItem.Builder(mediaInfo).build();
        castPlayer.loadItems(new MediaQueueItem[]{mediaItem}, 0, 0, Player.REPEAT_MODE_OFF);
    }

    @Override
    public void onCastSessionUnavailable() {
    }
}
The layout looks like this:
<androidx.mediarouter.app.MediaRouteButton
    android:id="@+id/castButton"
    android:layout_alignParentEnd="true"
    android:layout_alignParentTop="true"
    android:layout_width="wrap_content"
    android:layout_height="wrap_content">
</androidx.mediarouter.app.MediaRouteButton>
This activity starts up, and when I hit the castButton it streams the movie behind "https://myuri.ch/video.m3u8" to the cast device. Works perfectly :)
But I can't figure out how to let the user choose between subtitles from my app. How should something like this be implemented?
Points I already found out:
I cannot attach something like a TrackSelector to the CastPlayer
The documentation states that we can provide MediaTracks to the MediaInfo object, but I don't have this information; it is hidden inside the m3u8 file.
As additional info, my CastOptions look like this:
public class CastOptionsProvider implements OptionsProvider {

    @Override
    public CastOptions getCastOptions(Context appContext) {
        return new CastOptions.Builder()
                .setReceiverApplicationId(CastMediaControlIntent.DEFAULT_MEDIA_RECEIVER_APPLICATION_ID)
                .build();
    }

    @Override
    public List<SessionProvider> getAdditionalSessionProviders(Context context) {
        return null;
    }
}
Here is what I found out:
The TrackSelector cannot be used for the CastPlayer.
All the available subtitles have to be known before you load MediaItems into the CastPlayer.
The DefaultMediaReceiver needs a publicly available link to a *.vtt file. Anything else does not work (neither .webvtt nor .m3u8).
The easiest way to update the subtitles on the ExoPlayer seems to be this:
DefaultTrackSelector.ParametersBuilder parametersBuilder = trackSelector.getParameters()
        .buildUpon()
        .setRendererDisabled(getRendererIndex(C.TRACK_TYPE_TEXT), !hasSubtitles);
if (hasSubtitles) {
    parametersBuilder.setPreferredTextLanguage("en");
}
trackSelector.setParameters(parametersBuilder.build()); // apply the updated parameters
Where getRendererIndex is this:
public int getRendererIndex(final int trackType) {
    for (int t = 0; t < exoPlayer.getRendererCount(); t++) {
        if (exoPlayer.getRendererType(t) == trackType) {
            return t;
        }
    }
    return -1;
}
To select a subtitle on the CastPlayer you can use the RemoteMediaClient:
remoteMediaClient.setActiveMediaTracks(selectedSubtitleTracks);
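As a rough sketch of how the pieces fit together (the track id, URL, name and language below are made-up example values, not from my app): declare each subtitle as a MediaTrack on the MediaInfo before loading, then activate it by id through the RemoteMediaClient.

// Hypothetical example values; use your own track ids, URLs and languages.
MediaTrack englishSubtitle = new MediaTrack.Builder(1 /* track id */, MediaTrack.TYPE_TEXT)
        .setSubtype(MediaTrack.SUBTYPE_SUBTITLES)
        .setContentId("https://example.com/subtitles_en.vtt") // must be a publicly reachable .vtt
        .setName("English")
        .setLanguage("en")
        .build();

MediaInfo mediaInfo = new MediaInfo.Builder("https://myuri.ch/video.m3u8")
        .setStreamType(MediaInfo.STREAM_TYPE_BUFFERED)
        .setMediaTracks(Collections.singletonList(englishSubtitle))
        .build();

// Later, when the user picks "English" in the custom subtitle dialog:
remoteMediaClient.setActiveMediaTracks(new long[]{1});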
So my solution is this:
Before starting any stream (local or cast), gather all the information I need for the MediaItem (URL, subtitles, ...)
Depending on the availability of a Cast session, I load the MediaItems into the CastPlayer or the ExoPlayer
A custom subtitle dialog gives the user the option to select a subtitle
When the user selects a subtitle, I use the methods described above to select the subtitle on the correct player.
I am implementing a music application from this tutorial.
There is a BaseAdapter class used to display the track list, and a MusicPlayer class to play the music. Both are fields of my main activity class.
public class MainActivity extends Activity implements MediaPlayerControl {
    private MediaPlayer musicSrv;
    private BaseAdapter songAdt;
    ...
The MusicPlayer plays the next track when the current one finishes. What is the best way to send a message to the BaseAdapter to update the display for each newly playing track (like changing the color of the current track)?
EDIT
According to the comments, it seems that using an interface would be a good option. Could someone write a detailed answer that explains how to do it? Thanks.
Thanks to the comments, I managed to implement a solution with an interface.
This is my main activity class, which refreshes the BaseAdapter each time the song changes:
public class MainActivity extends Activity implements MusicService.SongChangedListener {
    private MediaPlayer musicSrv;
    private BaseAdapter songAdt;
    ...

    @Override
    public void songChanged(Song currentSong) {
        songAdt.notifyDataSetChanged(); // refresh the view
    }
And my MusicService class:
public class MusicService extends Service implements MediaPlayer.OnPreparedListener {
    ...
    private MainActivity activity;
    ...

    public void setActivity(MainActivity act) {
        // set the activity
        activity = act;
    }

    public interface SongChangedListener {
        void songChanged(Song song);
    }
    ...

    public void playSong() {
        // this method is called each time a new song is played
        activity.songChanged(currentSong); // pass the Song that is now playing (field not shown here)
        ...
    }
}
Maintain a flag in your music track model class that indicates whether that song is currently playing.
Check that value in your getView() and adjust the view accordingly:
if (model.is_playing) {
    // styling for the currently playing song
} else {
    // styling for the songs that are not playing
}
Now whenever you change songs, manually or automatically, update that is_playing value: unset it on the previous track and set it on the currently playing track. A minimal sketch of this idea is shown below.
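Here is what a getView() implementing that flag could look like (SongAdapter, Song with its isPlaying field, the songs list, and the layout/view ids are all illustrative names, not from the tutorial):

// Inside a hypothetical SongAdapter extends BaseAdapter
@Override
public View getView(int position, View convertView, ViewGroup parent) {
    if (convertView == null) {
        convertView = LayoutInflater.from(parent.getContext())
                .inflate(R.layout.song_row, parent, false); // hypothetical row layout
    }
    Song song = songs.get(position); // songs is the adapter's backing list
    TextView title = (TextView) convertView.findViewById(R.id.song_title); // hypothetical id
    title.setText(song.getTitle());
    // Highlight the row of the song that is currently playing.
    title.setTextColor(song.isPlaying() ? Color.RED : Color.BLACK);
    return convertView;
}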
The BaseAdapter's getView() method provides you with the view; change the color for the current track based on a variable in your list, and reset the color to the default when that variable is not set.
if (this is the currently playing track) {
    // Set the highlight color on the view.
} else {
    // Set the color back to default.
}
If you have implemented this logic correctly, then whenever you change the current track (and update the variable in your list that tracks the currently playing media), a simple songAdt.notifyDataSetChanged() will cause the BaseAdapter's getView() to be called again and the views will be rebuilt from the new data. For a more in-depth understanding of ListView you can refer to this talk. It will help.
Preferably, consider learning RecyclerView; it is the current approach, while ListView is largely a thing of the past.
public class MainActivity extends Activity implements SongChangedListener {
    ...
    private PlayerManager pManager;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        ...
        pManager = new PlayerManager();
    }

    @Override
    protected void onResume() {
        super.onResume();
        pManager.setListener(this);
    }

    @Override
    protected void onPause() {
        super.onPause();
        pManager.setListener(null);
    }

    @Override
    public void songChanged(MediaId idOfSong) {
        if (isDestroyed()) // guard against a callback arriving after the activity was killed
            return;
        // Mark the previous song as not playing in your list (for your adapter).
        // Mark idOfSong as currently playing in your list (for your adapter).
        // currentSong = idOfSong;
        // Notify that the data in the list has changed (songAdt.notifyDataSetChanged()).
    }
}
And in your PlayerManager you can create the interface (or maybe a separate class for the interface; it doesn't matter how you expose the interface instance).
public class PlayerManager {
    ...
    private SongChangedListener mListener;
    ...

    public PlayerManager() {
    }

    public void setListener(SongChangedListener listener) {
        mListener = listener;
    }

    public interface SongChangedListener {
        void songChanged(MediaId idOfSong);
    }
    ...

    public void playSong() {
        ...
        if (mListener != null) {
            mListener.songChanged(idOfNextSong);
        }
        ...
    }
}
In your answer you are passing an Activity into your Service, which feels wrong in many ways. If you want communication between an Activity and a Service, there are many other ways to do it. Usually I use a Messenger in conjunction with a Handler; I would provide more details, but it would be more beneficial if you explore it in the documentation and in other answers. It is easy to implement once you understand how Messengers work. A rough sketch of the pattern follows.
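Just to illustrate the shape of the Messenger approach (a minimal sketch, not the question's code; MSG_REGISTER_CLIENT, MSG_SONG_CHANGED and the field names are made up): the Activity hands the Service a Messenger wrapping its own Handler via Message.replyTo, and the Service sends "song changed" messages back through it instead of holding an Activity reference.

public class MusicService extends Service {
    public static final int MSG_REGISTER_CLIENT = 1;
    public static final int MSG_SONG_CHANGED = 2;

    private Messenger client; // set when the Activity registers itself

    // Messenger the Activity binds to; it receives the registration message.
    private final Messenger incoming = new Messenger(new Handler(Looper.getMainLooper()) {
        @Override
        public void handleMessage(Message msg) {
            if (msg.what == MSG_REGISTER_CLIENT) {
                client = msg.replyTo; // remember how to reach the Activity
            }
        }
    });

    @Override
    public IBinder onBind(Intent intent) {
        return incoming.getBinder();
    }

    // Call this whenever the track changes.
    private void notifySongChanged() {
        if (client == null) return;
        try {
            client.send(Message.obtain(null, MSG_SONG_CHANGED));
        } catch (RemoteException e) {
            client = null; // the Activity side is gone
        }
    }
}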
Also, if you are looking to build a full-fledged media player application, your implementation will require a lot more boilerplate code, and you will have to handle media button clicks (if someone presses play/pause on their Bluetooth headphones or on their watch). MediaSessionCompat is a better fit for that. You can also refer to the following open-source media player, which implements all the minimum required functionality pretty nicely: android-UniversalMusicPlayer.
You don't need to implement your own callback interface. MediaPlayer already has an OnCompletionListener that fires when the currently playing sound finishes, so you just need to refresh your adapter in the onCompletion method:
public class MainActivity extends Activity implements MediaPlayerControl, MediaPlayer.OnCompletionListener {
    private MediaPlayer musicSrv;
    private BaseAdapter songAdt;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // musicSrv must be created/assigned before this call
        musicSrv.setOnCompletionListener(this);
    }

    @Override
    public void onCompletion(MediaPlayer musicPlayer) {
        songAdt.notifyDataSetChanged();
    }
}
I have been researching how to make a class Parcelable so I can share objects between activities through an Intent. All the examples I found use custom objects with basic data like int/String/ArrayList etc. Is there a way to make a MediaPlayer object Parcelable? So far I have this:
public class mpParcelable implements Parcelable {

    private MediaPlayer mp;

    public int describeContents() {
        return 0;
    }

    public void writeToParcel(Parcel out, int flags) {
        out.writeValue(mp);
    }

    public static final Parcelable.Creator<mpParcelable> CREATOR
            = new Parcelable.Creator<mpParcelable>() {
        public mpParcelable createFromParcel(Parcel in) {
            return new mpParcelable(in);
        }

        public mpParcelable[] newArray(int size) {
            return new mpParcelable[size];
        }
    };

    private mpParcelable(Parcel in) {
        mp = in.readValue();
    }
}
The part where I don't know what to do is the constructor that reads from the Parcel: at mp = in.readValue(); I don't know how to read the MediaPlayer object back.
Is there a way to make a parcelable of a MediaPlayer object ?
No, because that is not your class. Occasionally, you can create a wrapper around some other class and the wrapper can be Parcelable, but I do not see how this would work with a MediaPlayer.
IMHO, what you want (sharing a MediaPlayer between components) is a code smell. Either each component should have its own MediaPlayer, or the MediaPlayer should be centrally managed (e.g., via a service).
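To make the "centrally managed" idea concrete, here is a minimal bound-service sketch (PlaybackService and its members are illustrative names, not from the question): the Service owns the MediaPlayer, and each Activity binds to it instead of trying to pass the player through an Intent.

import android.app.Service;
import android.content.Intent;
import android.media.MediaPlayer;
import android.os.Binder;
import android.os.IBinder;

public class PlaybackService extends Service {

    private final IBinder binder = new LocalBinder();
    private MediaPlayer mediaPlayer;

    public class LocalBinder extends Binder {
        public PlaybackService getService() {
            return PlaybackService.this;
        }
    }

    @Override
    public void onCreate() {
        super.onCreate();
        mediaPlayer = new MediaPlayer(); // one player owned by the service
    }

    @Override
    public IBinder onBind(Intent intent) {
        return binder;
    }

    public MediaPlayer getMediaPlayer() {
        return mediaPlayer;
    }

    @Override
    public void onDestroy() {
        mediaPlayer.release();
        super.onDestroy();
    }
}

Each Activity then binds with bindService() and talks to the same player, so nothing needs to be parceled at all.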
I added AdMob to my libGDX project without any problem, but how can I disable AdMob in-game? I have two screens (MainMenu and PlayScreen) and I want ads to be shown only on the MainMenu.
I found an article about controlling ads in libGDX, but the problem is that the article is for desktop, not Android.
https://code.google.com/p/libgdx/wiki/AdMobInLibgdx (Note: the question arises in part from using a deprecated document; a newer version is available at https://github.com/libgdx/libgdx/wiki/Admob-in-libgdx)
Take a look at the control section of the new wiki. There are two final constants inside your Android project:
public class HelloWorldAndroid extends AndroidApplication {

    private final int SHOW_ADS = 1;
    private final int HIDE_ADS = 0;

    protected Handler handler = new Handler() {
        @Override
        public void handleMessage(Message msg) {
            switch (msg.what) {
                case SHOW_ADS: {
                    adView.setVisibility(View.VISIBLE); // make the banner visible
                    break;
                }
                case HIDE_ADS: {
                    adView.setVisibility(View.GONE); // hide the banner
                    // you should also disable the ad fetching here!
                    break;
                }
            }
        }
    };
So if you call the following method (which is passed as an interface to the core project):
public interface IActivityRequestHandler {
    public void showAds(boolean show);
}
public class HelloWorldAndroid extends AndroidApplication implements IActivityRequestHandler {
    ...

    // This is the callback that posts a message to the handler
    @Override
    public void showAds(boolean show) {
        handler.sendEmptyMessage(show ? SHOW_ADS : HIDE_ADS);
    }
it sends a message to the handler, which shows or hides the AdMob view. The interface for showAds is passed to the core project, so you can keep a reference to it there and use it. To see how this works, take a look at the article on interfacing with platform-specific code.
Just to show this here:
View gameView = initializeForView(new HelloWorld(this), false);
// "this" is the main class of the Android project, which implements the IActivityRequestHandler interface shown above.
// HelloWorld(this) is the core project's game class, where you can keep the IActivityRequestHandler as a reference and call showAds(boolean).
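A minimal sketch of the core-project side (MainMenuScreen, PlayScreen and startGame are placeholder names, not from the wiki): the game keeps the handler it was constructed with and toggles the ads when switching screens.

import com.badlogic.gdx.Game;

public class HelloWorld extends Game {

    private final IActivityRequestHandler adHandler;

    public HelloWorld(IActivityRequestHandler adHandler) {
        this.adHandler = adHandler;
    }

    @Override
    public void create() {
        adHandler.showAds(true);             // banner visible on the menu
        setScreen(new MainMenuScreen(this)); // placeholder screen class
    }

    // Call this from the menu when the player starts a game.
    public void startGame() {
        adHandler.showAds(false);            // hide ads during gameplay
        setScreen(new PlayScreen(this));     // placeholder screen class
    }
}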
But in the end, if you read the full article, all of this should become clear.
As per other questions such as android-video-as-a-live-wallpaper: is the only way to play a video in a live wallpaper to decode it yourself?
Just use MediaPlayer instead of VideoView, and use MediaPlayer.setSurface instead of MediaPlayer.setDisplay. If you use setDisplay, the MediaPlayer tries to tell the SurfaceHolder to keep the screen on, which isn't allowed for live wallpapers and will throw an error.
I use WebM/VP8 video, but this should work with whatever MediaPlayer supports (just put the video file in res/raw):
package com.justinbuser.nativecore;

import android.media.MediaPlayer;
import android.service.wallpaper.WallpaperService;
import android.view.SurfaceHolder;

import com.justinbuser.android.Log;

public class VideoWallpaperService extends WallpaperService {

    protected static int playheadTime = 0;

    @Override
    public Engine onCreateEngine() {
        return new VideoEngine();
    }

    class VideoEngine extends Engine {

        private final String TAG = getClass().getSimpleName();
        private final MediaPlayer mediaPlayer;

        public VideoEngine() {
            super();
            Log.i(TAG, "( VideoEngine )");
            mediaPlayer = MediaPlayer.create(getBaseContext(), R.raw.wallpapervideo);
            mediaPlayer.setLooping(true);
        }

        @Override
        public void onSurfaceCreated(SurfaceHolder holder) {
            Log.i(TAG, "onSurfaceCreated");
            mediaPlayer.setSurface(holder.getSurface());
            mediaPlayer.start();
        }

        @Override
        public void onSurfaceDestroyed(SurfaceHolder holder) {
            Log.i(TAG, "( INativeWallpaperEngine ): onSurfaceDestroyed");
            playheadTime = mediaPlayer.getCurrentPosition();
            mediaPlayer.reset();
            mediaPlayer.release();
        }
    }
}
The short answer is yes. The long answer is here: http://ikaruga2.wordpress.com/2011/06/15/video-live-wallpaper-part-1/
Just to think outside the box: is it possible to take a working video player and re-parent it under a Java window in Android? I have not done this on Linux or Android, but under Windows it is possible to get the window handle of a running application and make it a child of a Java frame, with the result that the other application's window looks like it's part of your Java application.
I tried the Justin Buser solution and it does not work (tested on an API 16 device). I also found similar code at https://github.com/thorikawa/AndroidExample/tree/master/MovieLiveWallpaper/; it does not work either.
The only solution seems to be to use FFmpeg with the NDK, e.g.: https://github.com/frankandrobot