SoundManager: how to stop a sound playing - Android

I use a simple SoundManager to play sounds from a SoundPool. My problem is that the sound does not stop playing when I exit the app, even though I call mySound.release().
Otherwise everything works as it should. Here is my code:
import android.content.Context;
import android.media.AudioManager;
import android.media.SoundPool;

public class SoundManager {
    private Context pContext;
    private SoundPool mySound;
    private float rate = 1.0f;
    private float masterVolume = 1.0f;
    private float leftVolume = 1.0f;
    private float rightVolume = 1.0f;
    private float balance = 0.5f;

    // Constructor: set up the SoundPool and store the app context
    public SoundManager(Context appContext)
    {
        mySound = new SoundPool(16, AudioManager.STREAM_MUSIC, 100);
        pContext = appContext;
    }

    // Load up a sound and return the id
    public int load(int sound_id)
    {
        return mySound.load(pContext, sound_id, 1);
    }

    // Play a sound
    public void play(int sound_id)
    {
        mySound.play(sound_id, leftVolume, rightVolume, 1, 0, rate);
    }

    // Set volume values based on existing balance value
    public void setVolume(float vol)
    {
        masterVolume = vol;
        if (balance < 1.0f)
        {
            leftVolume = masterVolume;
            rightVolume = masterVolume * balance;
        }
        else
        {
            rightVolume = masterVolume;
            leftVolume = masterVolume * (2.0f - balance);
        }
    }

    public void unloadAll()
    {
        mySound.release();
    }
}

The play() method returns an int stream ID. You need to keep this stream ID in a variable and pass it to the stop() method. For example:
Declare a stream ID variable:
private int mStreamId;
Then keep it when you start a new sound stream:
// Play a sound
public void play(int sound_id)
{
    mStreamId = mySound.play(sound_id, leftVolume, rightVolume, 1, 0, rate);
}
So you can stop it with:
mySound.stop(mStreamId);
Now you can stop your sound stream in the activity's onStop or onPause callbacks, or wherever you want.
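For instance, a minimal stop() method for the SoundManager above (a sketch; it assumes the mStreamId field just shown):
// Stop the most recently started stream
public void stop()
{
    mySound.stop(mStreamId);
}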
EDIT:
The best way to release your SoundPool resources is to call release() and set the reference to null after you stop playing the sound. For example:
@Override
protected void onPause() {
    super.onPause();
    mSoundManager.unloadAll();
}
On your SoundManager, change your unloadAll method to:
public void unloadAll()
{
    mySound.release();
    mySound = null;
}
Note the following:
- onPause – is always called when the activity is about to go into the background.
- release – releases all the resources used by the SoundPool; after that, the SoundPool object can no longer be used (like @DanXPrado said), so set it to null to help the garbage collector identify it as reclaimable.
For more details, see this tutorial about SoundPool.
Hope this helps.
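As a side note (not part of the original answer): on API 21+ the SoundPool(int, int, int) constructor used above is deprecated in favor of SoundPool.Builder, e.g.:
// Modern construction on API 21+; the usage/content-type values are one reasonable choice for game sounds
AudioAttributes attrs = new AudioAttributes.Builder()
        .setUsage(AudioAttributes.USAGE_GAME)
        .setContentType(AudioAttributes.CONTENT_TYPE_SONIFICATION)
        .build();
SoundPool mySound = new SoundPool.Builder()
        .setMaxStreams(16)
        .setAudioAttributes(attrs)
        .build();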

Related

During crossfading, MediaPlayer pauses for a while on some devices, mostly on Oreo and Pie

I want a crossfading effect when one song is about to end and the next one starts. I have used the code below for crossfading between audio tracks; it works fine on most devices, but it does not work on some Samsung devices running Oreo and on the OnePlus 6. There is a very small pause as soon as the second media player starts playing the next song. Thanks in advance.
private void crossFade() {
    fadeOut(musicPlayer, CROSSFADE_DURATION);
    fadeIn(musicPlayer2, CROSSFADE_DURATION);
}

public void fadeOut(final MediaPlayer _player, final int duration) {
    final float deviceVolume = getDeviceVolume();
    final Handler h = new Handler();
    h.postDelayed(new Runnable() {
        private float time = duration;
        private float volume = 0.0f;

        @Override
        public void run() {
            // can call h again after work!
            time -= 100;
            volume = (deviceVolume * time) / duration;
            _player.setVolume(volume, volume);
            if (time > 0)
                h.postDelayed(this, 100);
            else {
                _player.stop();
                _player.release();
            }
        }
    }, 100); // delay (takes millis)
}

public void fadeIn(final MediaPlayer _player, final int duration) {
    final float deviceVolume = getDeviceVolume();
    final Handler h = new Handler();
    h.postDelayed(new Runnable() {
        private float time = 0.0f;
        private float volume = 0.0f;

        @Override
        public void run() {
            if (!_player.isPlaying())
                _player.start();
            // can call h again after work!
            time += 100;
            volume = (deviceVolume * time) / duration;
            _player.setVolume(volume, volume);
            if (time < duration)
                h.postDelayed(this, 100);
        }
    }, 100); // delay (takes millis)
}
Finally, I solved this issue on my own. MediaPlayer actually has this bug; it was reported to Google long ago but is still not resolved (https://issuetracker.google.com/issues/36931073), so we can do nothing about it on that front. Instead, I used ExoPlayer for playing audio, and it works very smoothly without any pause.
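For reference, here is a rough sketch of what a two-player crossfade can look like with ExoPlayer. This is not the poster's code; it assumes ExoPlayer 2.12+ (SimpleExoPlayer.Builder, MediaItem), and the fade loop mirrors the Handler approach above:
private SimpleExoPlayer buildPlayer(Context context, Uri uri) {
    // One player per track; prepare() loads the media without starting playback
    SimpleExoPlayer player = new SimpleExoPlayer.Builder(context).build();
    player.setMediaItem(MediaItem.fromUri(uri));
    player.prepare();
    return player;
}

private void crossFade(final SimpleExoPlayer out, final SimpleExoPlayer in, final int duration) {
    in.setVolume(0f);
    in.setPlayWhenReady(true); // start the incoming track silently, then ramp the volumes
    final Handler h = new Handler(Looper.getMainLooper());
    h.post(new Runnable() {
        private int elapsed = 0;

        @Override
        public void run() {
            elapsed += 100;
            float t = Math.min(1f, elapsed / (float) duration);
            out.setVolume(1f - t); // fade the outgoing track down
            in.setVolume(t);       // fade the incoming track up
            if (t < 1f) {
                h.postDelayed(this, 100);
            } else {
                out.stop();
                out.release();
            }
        }
    });
}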

How to fit video in Live wallpaper, by center-crop and by fitting to width/height?

Background
I'm making a live wallpaper that can show a video. In the beginning I thought this was going to be very hard, and some people suggested OpenGL solutions or other very complex ones (such as this one).
Anyway, I found various places talking about it, and based on this GitHub library (which has some bugs), I finally got it to work.
The problem
While I've succeeded in showing a video, I can't find a way to control how it's shown relative to the screen resolution.
Currently it always gets stretched to the screen size; the sample video (taken from here) shows up distorted (before/after screenshots omitted here).
The reason is the different aspect ratio: 560x320 (video resolution) vs 1080x1920 (device resolution).
Note: I'm well aware of solutions for scaling videos that are available in various GitHub repositories (such as here), but I'm asking about a live wallpaper. As such, it doesn't have a View, so it's more limited in how it can do things. More specifically, a solution can't use any kind of layout, TextureView, SurfaceView, or any other kind of View.
What I've tried
I tried playing with various fields and functions of the SurfaceHolder, but with no luck so far. Examples:
setVideoScalingMode - it either crashes or doesn't do anything.
Changing surfaceFrame - same.
Here's the current code I've made (full project available here):
class MovieLiveWallpaperService : WallpaperService() {
override fun onCreateEngine(): WallpaperService.Engine {
return VideoLiveWallpaperEngine()
}
private enum class PlayerState {
NONE, PREPARING, READY, PLAYING
}
inner class VideoLiveWallpaperEngine : WallpaperService.Engine() {
private var mp: MediaPlayer? = null
private var playerState: PlayerState = PlayerState.NONE
override fun onSurfaceCreated(holder: SurfaceHolder) {
super.onSurfaceCreated(holder)
Log.d("AppLog", "onSurfaceCreated")
mp = MediaPlayer()
val mySurfaceHolder = MySurfaceHolder(holder)
mp!!.setDisplay(mySurfaceHolder)
mp!!.isLooping = true
mp!!.setVolume(0.0f, 0.0f)
mp!!.setOnPreparedListener { mp ->
playerState = PlayerState.READY
setPlay(true)
}
try {
//mp!!.setDataSource(this@MovieLiveWallpaperService, Uri.parse("http://techslides.com/demos/sample-videos/small.mp4"))
mp!!.setDataSource(this@MovieLiveWallpaperService, Uri.parse("android.resource://" + packageName + "/" + R.raw.small))
} catch (e: Exception) {
}
}
override fun onDestroy() {
super.onDestroy()
Log.d("AppLog", "onDestroy")
if (mp == null)
return
mp!!.stop()
mp!!.release()
playerState = PlayerState.NONE
}
private fun setPlay(play: Boolean) {
if (mp == null)
return
if (play == mp!!.isPlaying)
return
when {
!play -> {
mp!!.pause()
playerState = PlayerState.READY
}
mp!!.isPlaying -> return
playerState == PlayerState.READY -> {
Log.d("AppLog", "ready, so starting to play")
mp!!.start()
playerState = PlayerState.PLAYING
}
playerState == PlayerState.NONE -> {
Log.d("AppLog", "not ready, so preparing")
mp!!.prepareAsync()
playerState = PlayerState.PREPARING
}
}
}
override fun onVisibilityChanged(visible: Boolean) {
super.onVisibilityChanged(visible)
Log.d("AppLog", "onVisibilityChanged:" + visible + " " + playerState)
if (mp == null)
return
setPlay(visible)
}
}
class MySurfaceHolder(private val surfaceHolder: SurfaceHolder) : SurfaceHolder {
override fun addCallback(callback: SurfaceHolder.Callback) = surfaceHolder.addCallback(callback)
override fun getSurface() = surfaceHolder.surface!!
override fun getSurfaceFrame() = surfaceHolder.surfaceFrame
override fun isCreating(): Boolean = surfaceHolder.isCreating
override fun lockCanvas(): Canvas = surfaceHolder.lockCanvas()
override fun lockCanvas(dirty: Rect): Canvas = surfaceHolder.lockCanvas(dirty)
override fun removeCallback(callback: SurfaceHolder.Callback) = surfaceHolder.removeCallback(callback)
override fun setFixedSize(width: Int, height: Int) = surfaceHolder.setFixedSize(width, height)
override fun setFormat(format: Int) = surfaceHolder.setFormat(format)
override fun setKeepScreenOn(screenOn: Boolean) {}
override fun setSizeFromLayout() = surfaceHolder.setSizeFromLayout()
override fun setType(type: Int) = surfaceHolder.setType(type)
override fun unlockCanvasAndPost(canvas: Canvas) = surfaceHolder.unlockCanvasAndPost(canvas)
}
}
The questions
I'd like to know how to adjust the scaling of the content based on the options we have for ImageView, all while keeping the aspect ratio:
center-crop - fills 100% of the container (the screen in this case), cropping on the sides (top & bottom or left & right) when needed. Doesn't stretch anything. This means the content looks fine, but not all of it might be shown.
fit-center - scales to fit the width or height, so the whole video is visible.
center-inside - shown at its original size, centered, and scaled down to fit the width/height only if too large.
You can achieve this with a TextureView (a SurfaceView won't work for this either). I have found some code which will help you achieve it.
In this demo you can crop the video in three ways: center, top, and bottom.
TextureVideoView.java
public class TextureVideoView extends TextureView implements TextureView.SurfaceTextureListener {
// Indicate if logging is on
public static final boolean LOG_ON = true;
// Log tag
private static final String TAG = TextureVideoView.class.getName();
private MediaPlayer mMediaPlayer;
private float mVideoHeight;
private float mVideoWidth;
private boolean mIsDataSourceSet;
private boolean mIsViewAvailable;
private boolean mIsVideoPrepared;
private boolean mIsPlayCalled;
private ScaleType mScaleType;
private State mState;
public enum ScaleType {
CENTER_CROP, TOP, BOTTOM
}
public enum State {
UNINITIALIZED, PLAY, STOP, PAUSE, END
}
public TextureVideoView(Context context) {
super(context);
initView();
}
public TextureVideoView(Context context, AttributeSet attrs) {
super(context, attrs);
initView();
}
public TextureVideoView(Context context, AttributeSet attrs, int defStyle) {
super(context, attrs, defStyle);
initView();
}
private void initView() {
initPlayer();
setScaleType(ScaleType.CENTER_CROP);
setSurfaceTextureListener(this);
}
public void setScaleType(ScaleType scaleType) {
mScaleType = scaleType;
}
private void updateTextureViewSize() {
float viewWidth = getWidth();
float viewHeight = getHeight();
float scaleX = 1.0f;
float scaleY = 1.0f;
if (mVideoWidth > viewWidth && mVideoHeight > viewHeight) {
scaleX = mVideoWidth / viewWidth;
scaleY = mVideoHeight / viewHeight;
} else if (mVideoWidth < viewWidth && mVideoHeight < viewHeight) {
scaleY = viewWidth / mVideoWidth;
scaleX = viewHeight / mVideoHeight;
} else if (viewWidth > mVideoWidth) {
scaleY = (viewWidth / mVideoWidth) / (viewHeight / mVideoHeight);
} else if (viewHeight > mVideoHeight) {
scaleX = (viewHeight / mVideoHeight) / (viewWidth / mVideoWidth);
}
// Calculate pivot points, in our case crop from center
int pivotPointX;
int pivotPointY;
switch (mScaleType) {
case TOP:
pivotPointX = 0;
pivotPointY = 0;
break;
case BOTTOM:
pivotPointX = (int) (viewWidth);
pivotPointY = (int) (viewHeight);
break;
case CENTER_CROP:
pivotPointX = (int) (viewWidth / 2);
pivotPointY = (int) (viewHeight / 2);
break;
default:
pivotPointX = (int) (viewWidth / 2);
pivotPointY = (int) (viewHeight / 2);
break;
}
Matrix matrix = new Matrix();
matrix.setScale(scaleX, scaleY, pivotPointX, pivotPointY);
setTransform(matrix);
}
private void initPlayer() {
if (mMediaPlayer == null) {
mMediaPlayer = new MediaPlayer();
} else {
mMediaPlayer.reset();
}
mIsVideoPrepared = false;
mIsPlayCalled = false;
mState = State.UNINITIALIZED;
}
/**
* @see MediaPlayer#setDataSource(String)
*/
public void setDataSource(String path) {
initPlayer();
try {
mMediaPlayer.setDataSource(path);
mIsDataSourceSet = true;
prepare();
} catch (IOException e) {
Log.d(TAG, e.getMessage());
}
}
/**
* @see MediaPlayer#setDataSource(Context, Uri)
*/
public void setDataSource(Context context, Uri uri) {
initPlayer();
try {
mMediaPlayer.setDataSource(context, uri);
mIsDataSourceSet = true;
prepare();
} catch (IOException e) {
Log.d(TAG, e.getMessage());
}
}
/**
* @see MediaPlayer#setDataSource(java.io.FileDescriptor)
*/
public void setDataSource(AssetFileDescriptor afd) {
initPlayer();
try {
long startOffset = afd.getStartOffset();
long length = afd.getLength();
mMediaPlayer.setDataSource(afd.getFileDescriptor(), startOffset, length);
mIsDataSourceSet = true;
prepare();
} catch (IOException e) {
Log.d(TAG, e.getMessage());
}
}
private void prepare() {
try {
mMediaPlayer.setOnVideoSizeChangedListener(
new MediaPlayer.OnVideoSizeChangedListener() {
@Override
public void onVideoSizeChanged(MediaPlayer mp, int width, int height) {
mVideoWidth = width;
mVideoHeight = height;
updateTextureViewSize();
}
}
);
mMediaPlayer.setOnCompletionListener(new MediaPlayer.OnCompletionListener() {
@Override
public void onCompletion(MediaPlayer mp) {
mState = State.END;
log("Video has ended.");
if (mListener != null) {
mListener.onVideoEnd();
}
}
});
// don't forget to call MediaPlayer.prepareAsync() method when you use constructor for
// creating MediaPlayer
mMediaPlayer.prepareAsync();
// Play video when the media source is ready for playback.
mMediaPlayer.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
@Override
public void onPrepared(MediaPlayer mediaPlayer) {
mIsVideoPrepared = true;
if (mIsPlayCalled && mIsViewAvailable) {
log("Player is prepared and play() was called.");
play();
}
if (mListener != null) {
mListener.onVideoPrepared();
}
}
});
} catch (IllegalArgumentException e) {
Log.d(TAG, e.getMessage());
} catch (SecurityException e) {
Log.d(TAG, e.getMessage());
} catch (IllegalStateException e) {
Log.d(TAG, e.toString());
}
}
/**
* Play or resume video. Video will be played as soon as view is available and media player is
* prepared.
*
* If video is stopped or ended and play() method was called, video will start over.
*/
public void play() {
if (!mIsDataSourceSet) {
log("play() was called but data source was not set.");
return;
}
mIsPlayCalled = true;
if (!mIsVideoPrepared) {
log("play() was called but video is not prepared yet, waiting.");
return;
}
if (!mIsViewAvailable) {
log("play() was called but view is not available yet, waiting.");
return;
}
if (mState == State.PLAY) {
log("play() was called but video is already playing.");
return;
}
if (mState == State.PAUSE) {
log("play() was called but video is paused, resuming.");
mState = State.PLAY;
mMediaPlayer.start();
return;
}
if (mState == State.END || mState == State.STOP) {
log("play() was called but video already ended, starting over.");
mState = State.PLAY;
mMediaPlayer.seekTo(0);
mMediaPlayer.start();
return;
}
mState = State.PLAY;
mMediaPlayer.start();
}
/**
* Pause video. If video is already paused, stopped or ended nothing will happen.
*/
public void pause() {
if (mState == State.PAUSE) {
log("pause() was called but video already paused.");
return;
}
if (mState == State.STOP) {
log("pause() was called but video already stopped.");
return;
}
if (mState == State.END) {
log("pause() was called but video already ended.");
return;
}
mState = State.PAUSE;
if (mMediaPlayer.isPlaying()) {
mMediaPlayer.pause();
}
}
/**
* Stop video (pause and seek to beginning). If video is already stopped or ended nothing will
* happen.
*/
public void stop() {
if (mState == State.STOP) {
log("stop() was called but video already stopped.");
return;
}
if (mState == State.END) {
log("stop() was called but video already ended.");
return;
}
mState = State.STOP;
if (mMediaPlayer.isPlaying()) {
mMediaPlayer.pause();
mMediaPlayer.seekTo(0);
}
}
/**
* @see MediaPlayer#setLooping(boolean)
*/
public void setLooping(boolean looping) {
mMediaPlayer.setLooping(looping);
}
/**
* @see MediaPlayer#seekTo(int)
*/
public void seekTo(int milliseconds) {
mMediaPlayer.seekTo(milliseconds);
}
/**
* @see MediaPlayer#getDuration()
*/
public int getDuration() {
return mMediaPlayer.getDuration();
}
static void log(String message) {
if (LOG_ON) {
Log.d(TAG, message);
}
}
private MediaPlayerListener mListener;
/**
* Listener that triggers the 'onVideoPrepared' and 'onVideoEnd' events
*/
public void setListener(MediaPlayerListener listener) {
mListener = listener;
}
public interface MediaPlayerListener {
public void onVideoPrepared();
public void onVideoEnd();
}
@Override
public void onSurfaceTextureAvailable(SurfaceTexture surfaceTexture, int width, int height) {
Surface surface = new Surface(surfaceTexture);
mMediaPlayer.setSurface(surface);
mIsViewAvailable = true;
if (mIsDataSourceSet && mIsPlayCalled && mIsVideoPrepared) {
log("View is available and play() was called.");
play();
}
}
@Override
public void onSurfaceTextureSizeChanged(SurfaceTexture surface, int width, int height) {
}
@Override
public boolean onSurfaceTextureDestroyed(SurfaceTexture surface) {
return false;
}
@Override
public void onSurfaceTextureUpdated(SurfaceTexture surface) {
}
}
After that, use this class as in the code below in MainActivity.java:
public class MainActivity extends AppCompatActivity implements View.OnClickListener,
ActionBar.OnNavigationListener {
// Video file url
private static final String FILE_URL = "http://techslides.com/demos/sample-videos/small.mp4";
private TextureVideoView mTextureVideoView;
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
initView();
initActionBar();
if (!isWIFIOn(getBaseContext())) {
Toast.makeText(getBaseContext(), "You need internet connection to stream video",
Toast.LENGTH_LONG).show();
}
}
private void initActionBar() {
ActionBar actionBar = getSupportActionBar();
actionBar.setNavigationMode(ActionBar.NAVIGATION_MODE_LIST);
actionBar.setDisplayShowTitleEnabled(false);
SpinnerAdapter mSpinnerAdapter = ArrayAdapter.createFromResource(this, R.array.action_list,
android.R.layout.simple_spinner_dropdown_item);
actionBar.setListNavigationCallbacks(mSpinnerAdapter, this);
}
private void initView() {
mTextureVideoView = (TextureVideoView) findViewById(R.id.cropTextureView);
findViewById(R.id.btnPlay).setOnClickListener(this);
findViewById(R.id.btnPause).setOnClickListener(this);
findViewById(R.id.btnStop).setOnClickListener(this);
}
@Override
public void onClick(View v) {
switch (v.getId()) {
case R.id.btnPlay:
mTextureVideoView.play();
break;
case R.id.btnPause:
mTextureVideoView.pause();
break;
case R.id.btnStop:
mTextureVideoView.stop();
break;
}
}
final int indexCropCenter = 0;
final int indexCropTop = 1;
final int indexCropBottom = 2;
@Override
public boolean onNavigationItemSelected(int itemPosition, long itemId) {
switch (itemPosition) {
case indexCropCenter:
mTextureVideoView.stop();
mTextureVideoView.setScaleType(TextureVideoView.ScaleType.CENTER_CROP);
mTextureVideoView.setDataSource(FILE_URL);
mTextureVideoView.play();
break;
case indexCropTop:
mTextureVideoView.stop();
mTextureVideoView.setScaleType(TextureVideoView.ScaleType.TOP);
mTextureVideoView.setDataSource(FILE_URL);
mTextureVideoView.play();
break;
case indexCropBottom:
mTextureVideoView.stop();
mTextureVideoView.setScaleType(TextureVideoView.ScaleType.BOTTOM);
mTextureVideoView.setDataSource(FILE_URL);
mTextureVideoView.play();
break;
}
return true;
}
public static boolean isWIFIOn(Context context) {
ConnectivityManager connMgr =
(ConnectivityManager) context.getSystemService(Context.CONNECTIVITY_SERVICE);
NetworkInfo networkInfo = connMgr.getNetworkInfo(ConnectivityManager.TYPE_WIFI);
return (networkInfo != null && networkInfo.isConnected());
}
}
and the activity_main.xml layout file for it is below:
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
android:layout_width="fill_parent"
android:layout_height="fill_parent">
<com.example.videocropdemo.crop.TextureVideoView
android:id="#+id/cropTextureView"
android:layout_width="fill_parent"
android:layout_height="fill_parent"
android:layout_centerInParent="true" />
<LinearLayout
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:layout_alignParentBottom="true"
android:layout_margin="16dp"
android:orientation="horizontal">
<Button
android:id="#+id/btnPlay"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:text="Play" />
<Button
android:id="#+id/btnPause"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:text="Pause" />
<Button
android:id="#+id/btnStop"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:text="Stop" />
</LinearLayout>
</RelativeLayout>
The output of the code for center-crop looks like this (screenshot omitted).
So I wasn't yet able to get all the scale types you've asked for, but I've been able to get fit-xy and center-crop working fairly easily using ExoPlayer. The full code can be seen at https://github.com/yperess/StackOverflow/tree/50091878 and I'll update it as I get more. Eventually I'll also fill in the MainActivity to let you choose the scaling type in the settings (I'll do this with a simple PreferenceActivity) and read the shared-preferences value on the service side.
The overall idea is that deep down, MediaCodec already implements both fit-xy and center-crop, which are really the only two modes you would need if you had access to a view hierarchy. This is because fit-center, fit-top, and fit-bottom would all really just be fit-xy where the surface has a gravity and is scaled to match the video size times the minimum scale. To get these working, I believe we need to create an OpenGL context and provide a SurfaceTexture. This SurfaceTexture can be wrapped with a stub Surface which can be passed to ExoPlayer. Once the video is loaded we can set the size of these, since we created them. We also have a callback on SurfaceTexture to let us know when a frame is ready. At that point we should be able to modify the frame (hopefully just using a simple matrix scale and transform).
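A rough illustration of that idea (a sketch only: textureId, videoWidth/videoHeight, and player are assumed to come from the surrounding context):
// Wrap our own SurfaceTexture in a stub Surface and hand it to the player
SurfaceTexture surfaceTexture = new SurfaceTexture(textureId);
surfaceTexture.setDefaultBufferSize(videoWidth, videoHeight);
Surface stub = new Surface(surfaceTexture);
player.setVideoSurface(stub);
surfaceTexture.setOnFrameAvailableListener(st -> {
    // a new video frame is ready; redraw the quad using our own scale/translate matrix
});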
The key components here are creating the ExoPlayer:
private fun initExoMediaPlayer(): SimpleExoPlayer {
    val videoTrackSelectionFactory = AdaptiveTrackSelection.Factory(bandwidthMeter)
    val trackSelector = DefaultTrackSelector(videoTrackSelectionFactory)
    val player = ExoPlayerFactory.newSimpleInstance(this@MovieLiveWallpaperService,
            trackSelector)
    player.playWhenReady = true
    player.repeatMode = Player.REPEAT_MODE_ONE
    player.volume = 0f
    if (mode == Mode.CENTER_CROP) {
        player.videoScalingMode = C.VIDEO_SCALING_MODE_SCALE_TO_FIT_WITH_CROPPING
    } else {
        player.videoScalingMode = C.VIDEO_SCALING_MODE_SCALE_TO_FIT
    }
    if (mode == Mode.FIT_CENTER) {
        player.addVideoListener(this)
    }
    return player
}
Then loading the video:
override fun onSurfaceCreated(holder: SurfaceHolder) {
    super.onSurfaceCreated(holder)
    if (mode == Mode.FIT_CENTER) {
        // We need to somehow wrap the surface or set some scale factor on exo player here.
        // Most likely this will require creating a SurfaceTexture and attaching it to an
        // OpenGL context. Then for each frame, writing it to the original surface but with
        // an offset
        exoMediaPlayer.setVideoSurface(holder.surface)
    } else {
        exoMediaPlayer.setVideoSurfaceHolder(holder)
    }

    val videoUri = RawResourceDataSource.buildRawResourceUri(R.raw.small)
    val dataSourceFactory = DataSource.Factory { RawResourceDataSource(context) }
    val mediaSourceFactory = ExtractorMediaSource.Factory(dataSourceFactory)
    exoMediaPlayer.prepare(mediaSourceFactory.createMediaSource(videoUri))
}
UPDATE:
Got it working. I'll need tomorrow to clean it up before I post the code, but here's a sneak preview...
What I ended up doing is basically taking GLSurfaceView and ripping it apart. If you look at its source, the only thing making it impossible to use in a wallpaper is that it only starts the GLThread when attached to the window. So if you replicate the same code but allow the GLThread to be started manually, you can go ahead. After that you just need to keep track of how big your screen is vs. the video after scaling to the minimum scale that would fit, and shift the quad on which you draw.
Known issues with the code:
1. There's a small bug with the GLThread I haven't been able to fish out. It seems like a simple timing issue where, when the thread pauses, I get a call to signalAll() that's not actually waiting on anything.
2. I didn't bother dynamically modifying the mode in the renderer. It shouldn't be too hard: add a preference listener when creating the Engine, then update the renderer when scale_type changes.
UPDATE:
All issues have been resolved. signalAll() was throwing because I missed a check that we actually hold the lock. I also added a listener to update the scale type dynamically, so now all scale types use the GlEngine.
ENJOY!
I found this article: How to set video as live wallpaper and keep video aspect ratio (width and height).
The article above has a simple source; just click the "set wallpaper" button. If you want a full-featured app, see https://github.com/AlynxZhou/alynx-live-wallpaper
The key point is to use a GLSurfaceView instead of the WallpaperService's default SurfaceView, with a custom GLSurfaceView renderer. A GLSurfaceView can render with OpenGL, so the question becomes "how to use GLSurfaceView to play video", or "how to use OpenGL to play video".
How to use a GLSurfaceView instead of the WallpaperService's default SurfaceView:
public class GLWallpaperService extends WallpaperService {
    ...
    class GLWallpaperEngine extends Engine {
        ...
        private class GLWallpaperSurfaceView extends GLSurfaceView {
            @SuppressWarnings("unused")
            private static final String TAG = "GLWallpaperSurface";

            public GLWallpaperSurfaceView(Context context) {
                super(context);
            }

            /**
             * This is a hack. Because an Android live wallpaper only has a Surface,
             * we create a GLSurfaceView, and when it draws to its Surface,
             * we replace that with the WallpaperEngine's Surface.
             */
            @Override
            public SurfaceHolder getHolder() {
                return getSurfaceHolder();
            }

            void onDestroy() {
                super.onDetachedFromWindow();
            }
        }
        ...
    }
}
My solution is to use a GIF (same size and fps as the video) instead of a video in the live wallpaper.
See my answer: https://stackoverflow.com/a/60425717/6011193 - WallpaperService can fit a GIF best.
You can convert the video to a GIF on a computer with ffmpeg,
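for example, a typical invocation (the file names and the fps/scale values here are placeholders to adjust):
ffmpeg -i input.mp4 -vf "fps=15,scale=480:-1:flags=lanczos" -loop 0 output.gif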
or on Android, a video can be converted to a GIF in code; see https://stackoverflow.com/a/16749143/6011193
You can use Glide for GIF and image loading, and it gives you the scaling options you like; this is based on the documentation at https://bumptech.github.io/glide/doc/targets.html#sizes-and-dimensions and https://futurestud.io/tutorials/glide-image-resizing-scaling
Glide v4 requires Android Ice Cream Sandwich (API level 14) or higher.
Like:
public static void loadCircularImageGlide(String imagePath, ImageView view) {
    Glide.with(view.getContext())
            .load(imagePath)
            .asGif()
            .override(600, 200) // resizes the image to these dimensions (in pixels); override does not respect aspect ratio
            .error(R.drawable.create_timeline_placeholder)
            .fitCenter() // scaling options
            .transform(new CircularTransformation(view.getContext())) // you can even apply an image transformation
            .into(view);
}

Why does the beep sound make the app crash?

I'm trying to make a level indicator which beeps more quickly as the phone approaches a horizontal position (using the accelerometer). The beep sound plays after a touch on the screen and stops if the screen is touched again.
So I made the following code, and the problem I have is that since I added the "beep" part (with the myRunnable function called in the onTouch event), my app crashes after a few seconds (it works fine for a few seconds, and I have no error messages after the build).
I really have no clue what the problem could be. I'm stuck here and need some help, thanks!
public class MainActivity extends AppCompatActivity implements SensorEventListener, View.OnTouchListener {
Sensor mySensor;
SensorManager SM;
float ANGLEX, ANGLEY, ANGLEZ;
int aX, aY, aZ;
int roundANGLEX, roundANGLEY, roundANGLEZ;
int Xetal, Yetal;
double centre;
int Rcentre;
int Rcentre2;
boolean active;
int test;
int i=0;
private Handler myHandler;
private Runnable myRunnable = new Runnable() {
@Override
public void run() {
// Play a periodic beep which accelerates according to Rcentre2
ToneGenerator toneGen1 = new ToneGenerator(AudioManager.STREAM_MUSIC, 100);
toneGen1.startTone(ToneGenerator.TONE_PROP_BEEP,150);
myHandler.postDelayed(this,(long) Math.sqrt(Rcentre2*20)+50);
}
};
@RequiresApi(api = Build.VERSION_CODES.KITKAT)
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
// Create our sensor manager
SM = (SensorManager) getSystemService(SENSOR_SERVICE);
// Accelerometer
mySensor = SM.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
// Register the sensor listener
SM.registerListener(this, mySensor, 1000000, 1000000);
((ConstraintLayout) findViewById(R.id.layout_main)).setOnTouchListener((View.OnTouchListener) this);
}
@Override
public void onAccuracyChanged(Sensor sensor, int i) {
// Not used
}
@Override
public void onSensorChanged(SensorEvent sensorEvent) {
float z_stable = ((int) ((sensorEvent.values[2]) * 100)) / 100.0f;
ANGLEX = (float) (((float) (Math.acos(sensorEvent.values[0] / 9.807)) * (180 / Math.PI))); //I get the accelerometer's values
ANGLEY = (float) (((float) (Math.acos(sensorEvent.values[1] / 9.807)) * (180 / Math.PI)));
ANGLEZ = (float) (((float) (Math.acos(z_stable / 9.807)) * (180 / Math.PI)));
roundANGLEX = Math.round(ANGLEX);
roundANGLEY = Math.round(ANGLEY);
roundANGLEZ = Math.round(ANGLEZ);
aX = roundANGLEX;
aY = roundANGLEY;
aZ = roundANGLEZ;
Xetal = aX - 88; //Xetal and Yetal are 0 when phone is on plane surface
Yetal = aY - 90; //and go from -90 to +90
centre = Math.sqrt(Xetal * Xetal + Yetal * Yetal); //gives the "distance" from the "center => the smaller centre gets, the closer the phone approach horizontal
Rcentre = (int) Math.round(centre);
Rcentre2 = (int) Math.round(centre * 100);
}
public boolean onTouch(View view, MotionEvent motionEvent) {
if (active == true) {
active = false;
myHandler.removeCallbacks(myRunnable);
}
else if (active == false) {
active = true;
myHandler = new Handler();
myHandler.postDelayed(myRunnable,0);
}
return false;
}
}
Here is the logcat output; it seems to show some problems, but I don't know what they mean. (logcat screenshot omitted)
I found an answer that works for me at "ToneGenerator crashes in Android 6.0":
"It was just about releasing the created ToneGenerator objects, because rapidly creating ToneGenerator objects without releasing them will cause the application to crash."
ToneGenerator toneGen1 = new ToneGenerator(AudioManager.STREAM_MUSIC, 100);
toneGen1.startTone(ToneGenerator.TONE_PROP_BEEP,150);
toneGen1.release();
I added toneGen1.release(); and it now works fine.
Credits to Mahmoud Farahat.
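An alternative sketch (not from the original answer): create one ToneGenerator up front and release it once, instead of allocating one per beep:
private ToneGenerator toneGen; // created once, reused for every beep

@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_main);
    toneGen = new ToneGenerator(AudioManager.STREAM_MUSIC, 100);
}

private void beep() {
    toneGen.startTone(ToneGenerator.TONE_PROP_BEEP, 150);
}

@Override
protected void onDestroy() {
    super.onDestroy();
    toneGen.release(); // free the native tone generator exactly once
    toneGen = null;
}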
Alternatively, with a busy flag (tonebusy), this seems to work (Kotlin):
var tonebusy: Boolean = false

fun beep(i: Int) {
    if (tonebusy) {
        return
    } else {
        tonebusy = true
        val toneGen1 = ToneGenerator(AudioManager.STREAM_MUSIC, 100)
        toneGen1.startTone(ToneGenerator.TONE_CDMA_PIP, i)
        toneGen1.release()
    }
    tonebusy = false
}

Android: starting a service from a utility class other than an Activity

I know that services can be started from an Activity as below:
public class MainActivity extends Activity {
    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
    }

    // Method to start the service
    public void startService(View view) {
        startService(new Intent(getBaseContext(), MyService.class));
    }
}
Since the startService() method is on the Activity class, I'm thinking it's not possible to start a service from a Java class which does not extend Activity.
If there is any way to start the service from a normal/utility class, please let me know.
EDIT: I have tried the suggestion below, as follows:
package com.genedevelopers.shootthedevil;
import android.app.Activity;
import android.content.ComponentName;
import android.content.Context;
import android.content.Intent;
import android.content.ServiceConnection;
import android.graphics.Canvas;
import android.graphics.Rect;
import android.os.IBinder;
public class Devil {
// This are starting data.
public static final float initSpeed = 5;
public static final long initTimeBetweenDucks = 1800; // in milliseconds
public static Context dctx;
private boolean mIsBound = false;
// This is current speed that will be increased and current time that will be decreased.
public static float speed;
public static long timeBetweenDucks; // in milliseconds
public static long timeOfLastDuck;
public static boolean direction = true;
// Needed for speeding up the game
public static long timeBetweenSpeedups = 250; // in milliseconds
public static long timeOfLastSpeedup;
// Devil position on the screen.
public float x;
public float y;
// Speed and direction.
private float velocity;
//MusicService musicS;
//For background Music start
private MusicService2 mServ;
private ServiceConnection Scon = new ServiceConnection(){
public void onServiceConnected(ComponentName name, IBinder binder) {
mServ = ((MusicService2.ServiceBinder)binder).getService();
}
public void onServiceDisconnected(ComponentName name) {
mServ = null;
}
};
void doBindService(){
dctx.bindService(new Intent(dctx,MusicService2.class),
Scon, Context.BIND_AUTO_CREATE);
mIsBound = true;
}
void doUnbindService()
{
if(mIsBound)
{
dctx.unbindService(Scon);
mIsBound = false;
}
}
//For background Music end
public Devil(int y){
this.y = y;
if(Devil.direction){
this.x = Game.screenWidth;
velocity = speed * -1;
} else {
this.x = 0 - Game.duckImage.getWidth();
velocity = speed;
}
doBindService();
// We change direction for a next devil.
Devil.direction = !Devil.direction;
dctx=HighScore.ctx;
}
/**
* Move the devil.
*/
public void update(){
this.x += velocity;
}
/**
* Draw the devil to a screen.
*
* @param canvas Canvas to draw on.
*/
public void draw(Canvas canvas){
// musicS=new MainMenu().getMusicServiceInstance();
if(velocity < 0)
canvas.drawBitmap(Game.devilImage, x, y, null);
else
canvas.drawBitmap(Game.devilRightImage, x, y, null);
}
/**
* Checks if the devil was touched/shoot.
*
* @param touchX X coordinate of the touch.
* @param touchY Y coordinate of the touch.
*
* @return True if touch coordinates are in the coordinates of devil rectangle, false otherwise.
*/
public boolean wasItShoot(int touchX, int touchY){
Rect devilRect = new Rect((int)this.x, (int)this.y, (int)this.x + Game.devilImage.getWidth(), (int)this.y + Game.devilImage.getHeight());
if(devilRect.equals(true)){
Intent music = new Intent();
music.setClass(dctx,MusicService2.class);
dctx.startService(music);
}
return devilRect.contains(touchX, touchY);
}
}
but it is not working. Please help me...
You can start it if you pass the context to the class (e.g. in the constructor):
context.startService(intent);
In theory you can, but you need a Context to start a service. The Context usually is an Activity or a Service (see "What is 'Context' on Android?"). You can pass a reference to a Context into the utility class and start the service from there.
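For instance, a minimal sketch (the ServiceStarter class name is a placeholder; MusicService2 is the service from the question):
public class ServiceStarter {
    private final Context context;

    public ServiceStarter(Context context) {
        // Hold the application context to avoid leaking an Activity
        this.context = context.getApplicationContext();
    }

    public void startMusic() {
        context.startService(new Intent(context, MusicService2.class));
    }
}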
startService() is a method of Context, not Activity. As long as you have a Context, you can start a service with it.
You can do it as follows:
public class MyApp extends Application {
    public static MyApp instance;

    @Override
    public void onCreate() {
        super.onCreate();
        instance = this;
    }
}
Then from any place you can call MyApp.instance.startService(...).
If you do so, make sure you register your app class in the manifest.
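That registration is the android:name attribute on the <application> tag in AndroidManifest.xml, for example:
<application
    android:name=".MyApp"
    ... >
    ...
</application>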
Hi, thank you all for your replies. It's working now. Simple mistake: the if(devilRect.equals(true)){ } condition was never true, so it was not calling the service.

Can't hear sound in the Android emulator on a Mac

I can't hear any sound in the Android emulator on my Mac. Sometimes, though very rarely, I can hear it. I'm including the code of my SoundManager below, but I think it's a configuration problem.
I copied the code from a page where they said it works, and I have already tried about 5 different versions; none of them works. My sounds are in WAV format: game is 100k, menu is 800k.
Thanks
import java.util.HashMap;
import android.content.Context;
import android.media.AudioManager;
import android.media.SoundPool;
public class SoundManager {
static private SoundManager _instance;
private static SoundPool mSoundPool;
private static HashMap<Integer, Integer> mSoundPoolMap;
private static AudioManager mAudioManager;
private static Context mContext;
private SoundManager()
{
}
/**
* Requests the instance of the Sound Manager and creates it
* if it does not exist.
*
* @return Returns the single instance of the SoundManager
*/
static synchronized public SoundManager getInstance()
{
if (_instance == null)
_instance = new SoundManager();
return _instance;
}
/**
* Initialises the storage for the sounds
*
* @param theContext The Application context
*/
public static void initSounds(Context theContext)
{
mContext = theContext;
mSoundPool = new SoundPool(4, AudioManager.STREAM_MUSIC, 0);
mSoundPoolMap = new HashMap<Integer, Integer>();
mAudioManager = (AudioManager)mContext.getSystemService(Context.AUDIO_SERVICE);
}
/**
* Add a new Sound to the SoundPool
*
* @param Index - The Sound Index for Retrieval
* @param SoundID - The Android ID for the Sound asset.
*/
public static void addSound(int Index,int SoundID)
{
mSoundPoolMap.put(Index, mSoundPool.load(mContext, SoundID, 1));
}
/**
* Loads the various sound assets
* Currently hard coded but could easily be changed to be flexible.
*/
public static void loadSounds()
{
mSoundPoolMap.put(1, mSoundPool.load(mContext, R.raw.menu, 1));
mSoundPoolMap.put(2, mSoundPool.load(mContext, R.raw.game, 1));
}
/**
* Plays a Sound
*
* @param index - The Index of the Sound to be played
* @param speed - The speed to play at; not currently used but included for compatibility
*/
public static void playSound(int index,float speed)
{
float streamVolume = mAudioManager.getStreamVolume(AudioManager.STREAM_MUSIC);
streamVolume = streamVolume / mAudioManager.getStreamMaxVolume(AudioManager.STREAM_MUSIC);
mSoundPool.play(mSoundPoolMap.get(index), streamVolume, streamVolume, 1, 0, speed);
}
/**
* Stop a Sound
* @param index - index of the sound to be stopped
*/
public static void stopSound(int index)
{
mSoundPool.stop(mSoundPoolMap.get(index));
}
public static void cleanup()
{
mSoundPool.release();
mSoundPool = null;
mSoundPoolMap.clear();
mAudioManager.unloadSoundEffects();
_instance = null;
}
}
Basically, you have to set a flag in the config settings to tell the emulator what kind of audio to output, and to which device. It's a pain. I wrote about it here:
http://www.banane.com/2012/01/11/joy-of-android-playing-sound/
That worked on my old MacBook. I just upgraded to a MacBook Air and the solution has changed; I'm not sure how yet, but -audio is the flag now. I'll post here when I get it working.
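For what it's worth, the emulator of that era accepted the audio backend on the command line; the backend name below is an assumption for Mac (run emulator -help-audio to list the valid values on your install):
emulator -avd MyAVD -audio coreaudio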
I SOLVED IT:
just reinstall Eclipse and the SDK entirely. I tried on another computer and it works fine, with the same code.
