Send Android TextToSpeech to just one stereo channel

On Android, I want to play TextToSpeech output through only one sound channel (think Shoulder Angel). To do this, I am currently using tts.synthesizeToFile(), and then playing back the dynamically-created file using the MediaPlayer. I use mediaPlayer.setVolume(0.0f, 1.0f) to play the audio through only one channel.
My working code is below.
My question is: is there a more direct way of playing TTS output through a single channel?
Using TextToSpeech to synthesize the file is time-consuming, and using MediaPlayer to play it back uses more resources than strictly necessary. I want this to be responsive and to work on low-end devices, so being kind to the CPU is important.
MainActivity.java
package com.example.pantts;
import android.app.Activity;
import android.media.AudioManager;
import android.media.MediaPlayer;
import android.speech.tts.TextToSpeech;
import android.os.Bundle;
import android.speech.tts.UtteranceProgressListener;
import android.util.Log;
import java.io.File;
import java.io.FileDescriptor;
import java.io.FileInputStream;
import java.util.HashMap;
import java.util.Locale;
public class MainActivity extends Activity implements TextToSpeech.OnInitListener {
private TextToSpeech tts;
private String toSpeak = "Hello, right ear!";
private static final String FILE_ID = "file";
private HashMap<String, String> hashMap = new HashMap<String, String>();
private String filename;
private MediaPlayer mediaPlayer;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
filename = getFilesDir() + "/" + "tts.wav";
Log.d("LOG", "file: " + filename);
// /data/data/com.example.pantts/files/tts.wav
mediaPlayer = new MediaPlayer();
mediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
tts = new TextToSpeech(this, this);
tts.setOnUtteranceProgressListener(mProgressListener);
}
public void onInit(int status) {
if (status == TextToSpeech.SUCCESS) {
tts.setLanguage(Locale.UK);
hashMap.put(TextToSpeech.Engine.KEY_PARAM_UTTERANCE_ID, FILE_ID);
// Using deprecated call for API 20 and earlier
tts.synthesizeToFile(toSpeak, hashMap, filename);
Log.d("LOG", "synthesizeToFile queued");
}
}
private UtteranceProgressListener mProgressListener =
new UtteranceProgressListener() {
@Override
public void onStart(String utteranceId) {
Log.d("LOG", "synthesizeToFile onStart " + utteranceId);
}
@Override
public void onError(String utteranceId) {
Log.d("LOG", "synthesizeToFile onError " + utteranceId);
}
@Override
public void onDone(String utteranceId) {
if (utteranceId.equals(FILE_ID)) { // Thanks to Hoan Nguyen for correcting this
Log.d("LOG", "synthesizeToFile onDone " + utteranceId);
try {
File ttsFile = new File(filename);
FileInputStream inputStream = new FileInputStream(ttsFile);
FileDescriptor fileDescriptor = inputStream.getFD();
mediaPlayer.reset();
mediaPlayer.setDataSource(fileDescriptor);
inputStream.close();
mediaPlayer.prepare();
mediaPlayer.setVolume(0.0f, 1.0f); // right channel only
mediaPlayer.start();
} catch (Exception e) {
e.printStackTrace();
}
}
}
};
}

There is nothing wrong with the synthesis; it is the comparison that is wrong. It should be
if (utteranceId.equals(FILE_ID))
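As for the "more direct way" part of the question: a sketch worth testing is the engine's pan parameter, TextToSpeech.Engine.KEY_PARAM_PAN, which takes a value from -1.0 (hard left) to +1.0 (hard right) and skips the file round-trip entirely. Not every engine honors it, so treat this as an assumption to verify on your target devices; the HashMap form below matches the deprecated pre-API-21 API used in the question.

```java
// Pan TTS output hard right via KEY_PARAM_PAN (engine support varies).
HashMap<String, String> params = new HashMap<String, String>();
params.put(TextToSpeech.Engine.KEY_PARAM_UTTERANCE_ID, "rightEar");
params.put(TextToSpeech.Engine.KEY_PARAM_PAN, "1.0"); // -1.0 = left, 1.0 = right
tts.speak(toSpeak, TextToSpeech.QUEUE_FLUSH, params);
```

If this works on your engine, it avoids both the synthesis-to-file latency and the extra MediaPlayer instance.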

Related

Making application work normally when user turns off screen android

I'm working on an application which is a media player.
Question: How do I make the media player (application) work without issues when the user turns the screen off?
Question: loadInBackground() returns the URI, but onLoadFinished() is not called when the screen is off.
A few words to explain the trouble better:
The media player contains a Loader which loads the song and another Loader which loads related suggestions. I've also implemented a play_next() method which relies on the media player's on-finished listener (button in the upper-right corner).
The media player is initialized in a service class I've made so the user can search for new songs and prepare the next song with the button (and playback continues because I connect to the service each time the Activity is loaded, and I return the media player from the service so I can attach a listener for the onFinish method).
The thing that bothers me is that when the user turns off the screen, the activity goes to an idle state (status from Android Monitor logcat), and once in the idle state (i.e. with the screen off), when the song ends it should start a new intent, which is the media player, to initialize and auto-play the next song. This works when the screen is on, but not once the activity goes idle.
If I turn the screen on, the activity acts like this:
The little pink dot is a progress bar. So the activity tries to refresh itself?
In the onCreate() method I call start_loader, which initializes and does the work with the Loader.
I've seen some power-manager tools, and the negative comments about their battery usage, but I did try one and, according to logcat, the activity just went to the idle state again (if that matters).
Please help. Maybe I should override the Activity's onPause() and onResume()?
Also, I get the message from loadInBackground(), which is the URI of the song, and from there on it freezes and doesn't continue.
You need to create a Service for that, which runs in the background, so the song doesn't stop playing when you keep your screen off. Service in Android
The link above describes services perfectly.
An example of a service:
import android.app.Service;
import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;
import android.content.IntentFilter;
import android.content.SharedPreferences;
import android.content.res.Configuration;
import android.media.MediaPlayer;
import android.net.Uri;
import android.os.Build;
import android.os.IBinder;
import android.support.annotation.RequiresApi;
import android.util.Log;
import android.widget.Toast;
import com.example.divyesh.musiclite.Pojos.SongsList;
import java.io.File;
import java.io.IOException;
/**
* Created by Divyesh on 11/18/2017.
*/
public class MediaSongServiece extends Service {
SongsList s;
private static Boolean destroy = false;
private String TAG = "HELLO";
private MusicIntentReceiver reciever;
private SharedPreferences prefrence;
private static MediaPlayer player;
private int thisStartId = 1;
private String ss[];
SharedPreferences.Editor editor;
public IBinder onBind(Intent arg0) {
return null;
}
public static void requestPlayMedia() {
player.start();
}
public void requestPauseMedia() {
player.pause();
}
@Override
public void onCreate() {
super.onCreate();
reciever = new MusicIntentReceiver();
IntentFilter filter = new IntentFilter(Intent.ACTION_HEADSET_PLUG);
registerReceiver(reciever, filter);
Log.d("service", "onCreate");
}
@Override
public void onTaskRemoved(Intent rootIntent) {
super.onTaskRemoved(rootIntent);
stopSelf();
}
@Override
public boolean onUnbind(Intent intent) {
stopSelf(thisStartId);
return super.onUnbind(intent);
}
public void onStart(Intent intent, int startId) {
if (intent == null) { // intent.equals(null) would throw an NPE; use == for null checks
stopSelf();
return;
}
if (destroy == false) {
thisStartId = startId;
ss = intent.getExtras().getStringArray("getArray");
Log.e(TAG, "onStart: " + ss[0] + ss[1] + " path" + ss[5]);
s = new SongsList();
s.setAll(ss);
if (player != null) {
player.stop();
player.reset();
try {
player.setDataSource(getApplicationContext(), Uri.fromFile(new File(s.getPath())));
player.prepare();
} catch (IOException e) {
e.printStackTrace();
}
player.setLooping(true);
player.setVolume(100, 100);
player.start();
Log.e(TAG, "onStart: m= is not null" + player.isPlaying());
} else {
player = MediaPlayer.create(getApplicationContext(), Uri.fromFile(new File(s.getPath())));
player.setLooping(true);
player.setVolume(100, 100);
player.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
@Override
public void onPrepared(MediaPlayer mediaPlayer) {
mediaPlayer.start();
}
});
Log.e(TAG, "onStart: m= is null WE created new player");
}
} else {
Log.e(TAG, "onelse destroy ");
recover();
}
}
private void recover() {
destroy = false;
prefrence = getSharedPreferences("SongPrefrence", Context.MODE_PRIVATE);
for (int i = 0; i <= 5; i++) {
ss[i] = prefrence.getString("" + i, "");
}
String currentPose = prefrence.getString("current_pos", "");
Log.e(TAG, "recover: Shared Daata is" + ss[5] + "_______" + currentPose);
}
@Override
public void onDestroy() {
unregisterReceiver(reciever);
player.stop();
player.release();
stopSelf(thisStartId);
}
@Override
public void onLowMemory() {
}
@Override
public void onConfigurationChanged(Configuration newConfig) {
super.onConfigurationChanged(newConfig);
if (newConfig.orientation == Configuration.ORIENTATION_LANDSCAPE) {
prefrence = getSharedPreferences("SongPrefrence", Context.MODE_PRIVATE);
editor = prefrence.edit();
destroy = true;
}
if (newConfig.orientation == Configuration.ORIENTATION_PORTRAIT) {
prefrence = getSharedPreferences("SongPrefrence", Context.MODE_PRIVATE);
editor = prefrence.edit();
destroy = true;
}
}
@RequiresApi(api = Build.VERSION_CODES.M)
private void saveData() {
player.pause();
for (int i = 0; i < ss.length; i++) {
editor.putString("" + i, ss[i]);
}
editor.putString("current_pos", "" + player.getCurrentPosition());
editor.commit();
}
public class MusicIntentReceiver extends BroadcastReceiver {
public String TAG = "ss";
@Override
public void onReceive(Context context, Intent intent) {
if (intent.getAction().equals(Intent.ACTION_HEADSET_PLUG)) {
int state = intent.getIntExtra("state", -1);
switch (state) {
case 0:
Log.d(TAG, "Headset is unplugged");
Toast.makeText(context, " Headset is unpluged ", Toast.LENGTH_SHORT).show();
Log.e(TAG, "onReceive: " + " is play song " + player.isPlaying());
break;
case 1:
Log.d(TAG, "Headset is plugged");
break;
default:
Log.d(TAG, "I have no idea what the headset state is");
}
}
}
}
}
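For completeness, a sketch of how an Activity might hand song data to this service. The "getArray" key and the use of index 5 as the file path are taken from onStart() above; the array contents and path are hypothetical placeholders.

```java
// Hypothetical caller: start MediaSongServiece with the fields onStart() reads.
String[] songFields = new String[6];       // service logs ss[0], ss[1], ss[5]
songFields[5] = "/sdcard/Music/track.mp3"; // onStart() uses ss[5] as the path
Intent intent = new Intent(this, MediaSongServiece.class);
intent.putExtra("getArray", songFields);
startService(intent); // triggers onStart(), which prepares and plays the file
```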

Android AcousticEchoCanceler does not cancel out echo

I've been having a fairly annoying problem with a video chat app I'm developing, and that's the issue of audio echoing.
I am at best a rank amateur at this, but the project I'm working on requires at least fully functional audio communication. Video turned out to be a lot easier than I originally anticipated.
The intended structure is eventually a thread taking input and another playing output on the same phone; to develop this, I've made two small apps that take in mic input on one phone and send it via a DatagramSocket to the other. The phones in question are an LG Optimus L7-2 running Android 4.1.2 and an Alcatel Idol Mini (I think it's also advertised as Onetouch or some such) running Android 4.2.2.
The code that transfers audio works perfectly, with minimal background noise (I'm guessing thanks to my choice of input as well as the post processing), however, as long as the two phones are close enough, I get a rather alarming echo, which is only made worse if I dare attempt to put input/output in the same app at the same time.
After my initial attempts at somehow filtering it out failed (AcousticEchoCanceler seems to help less than NoiseSupressor, and AutomaticGainControl seems to do more damage than good), I've done a bit of reading but found nothing that could help.
I am at this point rather confused as I can't seem to shake the feeling that I'm missing something obvious, and that it shouldn't be THAT complicated to set up.
I'm in addition putting up the base code I'm using for the audio recording/playing.
The recorder segment
package com.example.audiotest;
import java.io.IOException;
import java.io.InputStream;
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.net.ServerSocket;
import java.net.Socket;
import java.net.UnknownHostException;
import android.app.Activity;
import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.AudioTrack;
import android.media.MediaRecorder;
import android.media.audiofx.AcousticEchoCanceler;
import android.media.audiofx.AutomaticGainControl;
import android.media.audiofx.NoiseSuppressor;
import android.os.Bundle;
import android.util.Log;
import android.view.View;
import android.view.View.OnClickListener;
import android.widget.Button;
public class MainActivity extends Activity {
private Button startButton,stopButton;
public byte[] buffer;
public static DatagramSocket socket;
private int port=50005;
AudioRecord recorder;
private int sampleRate = 22050;
private int channelConfig = AudioFormat.CHANNEL_IN_MONO;
private int audioFormat = AudioFormat.ENCODING_PCM_16BIT;
private int minBufSize = AudioRecord.getMinBufferSize(sampleRate, channelConfig, audioFormat);
private boolean status = true;
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
startButton = (Button) findViewById (R.id.start_button);
stopButton = (Button) findViewById (R.id.stop_button);
startButton.setOnClickListener (startListener);
stopButton.setOnClickListener (stopListener);
Log.v("AudioPlayerApp","minBufSize: " + minBufSize);
//minBufSize += 2048;
minBufSize = 4096;
System.out.println("minBufSize: " + minBufSize);
}
private final OnClickListener stopListener = new OnClickListener() {
@Override
public void onClick(View arg0) {
status = false;
recorder.release();
Log.d("VS","Recorder released");
}
};
private final OnClickListener startListener = new OnClickListener() {
@Override
public void onClick(View arg0) {
status = true;
startStreaming();
}
};
public void startStreaming() {
Thread streamThread = new Thread(new Runnable() {
@Override
public void run() {
try {
DatagramSocket socket = new DatagramSocket();
Log.d("AudioPlayerApp", "Socket Created");
minBufSize = 4096;
byte[] buffer = new byte[minBufSize];
Log.d("AudioPlayerApp","Buffer created of size " + minBufSize);
DatagramPacket packet;
final InetAddress destination = InetAddress.getByName("192.168.0.13");
recorder = new AudioRecord(MediaRecorder.AudioSource.VOICE_RECOGNITION,sampleRate,channelConfig,audioFormat,minBufSize);
AcousticEchoCanceler canceler = AcousticEchoCanceler.create(recorder.getAudioSessionId());
NoiseSuppressor ns = NoiseSuppressor.create(recorder.getAudioSessionId());
AutomaticGainControl agc = AutomaticGainControl.create(recorder.getAudioSessionId());
canceler.setEnabled(true);
ns.setEnabled(true);
//agc.setEnabled(true);
recorder.startRecording();
while(status == true) {
//reading data from MIC into buffer
minBufSize = recorder.read(buffer, 0, buffer.length);
//putting buffer in the packet
packet = new DatagramPacket (buffer,buffer.length,destination,port);
socket.send(packet);
}
} catch(UnknownHostException e) {
Log.e("AudioPlayerApp", "UnknownHostException");
} catch (IOException e) {
e.printStackTrace();
Log.e("AudioPlayerApp", "IOException");
}
}
});
streamThread.start();
}
}
And the player segment.
package com.test.playsound;
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.ServerSocket;
import java.net.Socket;
import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioTrack;
import android.os.Bundle;
import android.app.Activity;
import android.util.Log;
import android.view.Menu;
public class MainActivity extends Activity {
static int port = 50005;
static String address = "";
static int sampleRate = 22050;
private boolean running = true;
private AudioTrack audioTrack;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
Log.v("Player", "Init complete");
openPlaySocket();
}
private void openPlaySocket() {
// TODO Auto-generated method stub
Thread t = new Thread(new Runnable() {
@Override
public void run() {
// TODO Auto-generated method stub
try {
Log.v("AudioPlayerApp", "Opening socket");
DatagramSocket sSock = new DatagramSocket(port);
byte[] output = new byte[4096];
Log.v("AudioPlayerApp", "Generating AudioTrack");
int minBufferSize = AudioTrack.getMinBufferSize(sampleRate,
AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT);
audioTrack = new AudioTrack(AudioManager.STREAM_MUSIC,
sampleRate, AudioFormat.CHANNEL_OUT_MONO,
AudioFormat.ENCODING_PCM_16BIT, minBufferSize,
AudioTrack.MODE_STREAM);
DatagramPacket receivePacket = new DatagramPacket(output,
output.length);
//Log.v("AudioPlayerApp", "Playing AudioTrack");
audioTrack.play();
while (running) {
//Log.v("AudioPlayerApp", "Waiting Packet");
sSock.receive(receivePacket);
Log.v("AudioPlayerApp","Received packet");
//Log.v("AudioPlayerApp", "Packet recieved");
try {
//Log.v("AudioPlayerApp", "writing data to audioTrack");
audioTrack.write(receivePacket.getData(), 0,
receivePacket.getData().length);
} catch (Exception e) {
Log.v("AudioPlayerApp",
"Failed to write audio: " + e.getMessage());
}
}
/*Log.v("AudioPlayerApp","Opening socket");
ServerSocket sSock = new ServerSocket(port);
Socket sock = sSock.accept();
Log.v("AudioPlayerApp","Socket opened "+port);
*/
} catch (Exception e) {
// TODO: handle exception
Log.v("AudioPlayerApp", "Error: " + e.getMessage());
}
}
});
Log.v("Player", "Starting thread");
t.start();
}
@Override
public boolean onCreateOptionsMenu(Menu menu) {
// Inflate the menu; this adds items to the action bar if it is present.
getMenuInflater().inflate(R.menu.activity_main, menu);
return true;
}
}
I'm aware it contains bad practices (such as not checking whether the device in question supports certain features, or not releasing resources); however, this was in an effort to start testing and fixing the echo as fast as possible. I've confirmed that both phones have access to AcousticEchoCanceler, NoiseSuppressor, recording rights, and internet rights, and aside from the fact that MediaRecorder.AudioSource.VOICE_COMMUNICATION causes my AudioRecord to crash, I've had no other problems.
I'm looking for any ideas or advice on the subject, as I'm quite frankly stumped. What can be done to fix the issue of echoing while recording and playing voice?
The AcousticEchoCanceler class is for canceling or removing audio played by the speaker and captured by the microphone of the same device, where there is only a small delay between playback and capture.
The AcousticEchoCanceler class cannot remove the echo caused by placing two phones near each other, because of the long and variable delay of that echo path.
You need to leverage the built-in echo cancellation at the hardware level: check whether AcousticEchoCanceler.isAvailable() returns true.
Then you can try the following combinations from here on SO

Open gallery in specific video, without playing it

I have a camera application that can also record video. (I'm developing on a Samsung S3.)
I want to be able to open the gallery on the last recorded video.
I use this code:
Intent intent = new Intent(Intent.ACTION_VIEW);
intent.setDataAndType(Uri.parse(file.getAbsolutePath()), "video/3gpp");
startActivity(intent);
The problem with that code is that the video immediately starts, and when it ends
the gallery activity closes.
I want to be able to open the video without playing it, exactly like on my Samsung S3.
Thanks in advance!
To open a particular image we can use this; it worked for me.
Please check it against your requirement. Hope this helps.
import java.io.File;
import android.app.Activity;
import android.content.Intent;
import android.media.MediaScannerConnection;
import android.media.MediaScannerConnection.MediaScannerConnectionClient;
import android.net.Uri;
import android.os.Bundle;
import android.os.Environment;
import android.util.Log;
import android.view.View;
import android.view.View.OnClickListener;
import android.widget.Button;
public class SDCARD123Activity extends Activity implements MediaScannerConnectionClient{
public String[] allFiles;
private String SCAN_PATH ;
private static final String FILE_TYPE="image/*";
private MediaScannerConnection conn;
/** Called when the activity is first created. */
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.main);
File folder = new File("/sdcard/Photo/");
allFiles = folder.list();
// uriAllFiles= new Uri[allFiles.length];
for(int i=0;i<allFiles.length;i++)
{
Log.d("all file path"+i, allFiles[i]+allFiles.length);
}
// Uri uri= Uri.fromFile(new File(Environment.getExternalStorageDirectory().toString()+"/yourfoldername/"+allFiles[0]));
SCAN_PATH=Environment.getExternalStorageDirectory().toString()+"/Photo/"+allFiles[0];
System.out.println(" SCAN_PATH " +SCAN_PATH);
Log.d("SCAN PATH", "Scan Path " + SCAN_PATH);
Button scanBtn = (Button)findViewById(R.id.scanBtn);
scanBtn.setOnClickListener(new OnClickListener(){
@Override
public void onClick(View v) {
startScan();
}});
}
private void startScan()
{
Log.d("Connected","success"+conn);
if(conn!=null)
{
conn.disconnect();
}
conn = new MediaScannerConnection(this,this);
conn.connect();
}
@Override
public void onMediaScannerConnected() {
Log.d("onMediaScannerConnected","success"+conn);
conn.scanFile(SCAN_PATH, FILE_TYPE);
}
@Override
public void onScanCompleted(String path, Uri uri) {
try {
Log.d("onScanCompleted",uri + "success"+conn);
System.out.println("URI " + uri);
if (uri != null)
{
Intent intent = new Intent(Intent.ACTION_VIEW);
intent.setData(uri);
startActivity(intent);
}
} finally
{
conn.disconnect();
conn = null;
}
}
}

Exception determination: how to know what to catch

I do understand the basics of try and catch, in as much as you try some code, look for errors that occur, catch them, and then do something based on the error. I have code that, when run, looks for a complete video file on the SD card; if it exists, it plays the video, and if it is not complete, it downloads it to the SD card and then I want it to play.
Here is my code block:
import java.io.BufferedInputStream;
import java.io.File;
import java.io.FileOutputStream;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.URL;
import java.net.URLConnection;
import android.app.Activity;
import android.app.Dialog;
import android.app.ProgressDialog;
import android.content.Context;
import android.content.Intent;
import android.graphics.PixelFormat;
import android.net.Uri;
import android.os.AsyncTask;
import android.os.Bundle;
import android.os.Environment;
import android.util.Log;
import android.view.Window;
import android.widget.MediaController;
import android.widget.Toast;
import android.widget.VideoView;
public class VideoActivity extends Activity {
private static final String TAG = "MyActivity";
public static final int DIALOG_DOWNLOAD_PROGRESS = 0;
public static final Context ACTION_VIEW = null;
private ProgressDialog mProgressDialog;
public String url = "";
public String fName = "";
public String vidName = "";
public String path="";
//final VideoView videoView = (VideoView) findViewById(R.id.videoView1);
/** Called when the activity is first created. */
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
requestWindowFeature(Window.FEATURE_NO_TITLE);
final String[] myAPP_FILES = getResources().getStringArray(R.array.APP_FILES);
final String[] myAPP_FILENAMES = getResources().getStringArray(R.array.APP_FILENAMES);
final String[] myAPP_NAMES = getResources().getStringArray(R.array.APP_NAMES);
final int[] myAPP_SIZES = getResources().getIntArray(R.array.APP_SIZES);
setContentView(R.layout.video);
getWindow().setFormat(PixelFormat.TRANSLUCENT);
final VideoView videoView = (VideoView) findViewById(R.id.videoView1);
final MediaController mediaController = new MediaController(this);
mediaController.setAnchorView(videoView);
Bundle extras = getIntent().getExtras();
url = myAPP_FILES[extras.getInt("key") ];
fName = myAPP_FILENAMES[extras.getInt("key") ];
vidName = myAPP_NAMES[extras.getInt("key") ];
int fsize = (myAPP_SIZES[extras.getInt("key") ] -1 )*1000;
File file1 = new File(Environment.getExternalStorageDirectory(), fName );
if (file1.exists()) {
if(file1.length() < fsize) {
file1.delete();
}
}
loadMedia();
Toast.makeText(
getApplicationContext(),
"" + file1.length()+ " " + fsize,
Toast.LENGTH_LONG).show();
String pathfile = Environment.getExternalStorageDirectory() + "/" +fName;
try {
Uri video = Uri.parse(pathfile);
videoView.setMediaController(mediaController);
videoView.setVideoURI(video);
videoView.start();
} catch (Exception w) {}
}
private void loadMedia() {
//Check for media file download if not on sdcard
File file = new File(Environment.getExternalStorageDirectory(), fName );
if (!file.exists()) {
new DownloadFileAsync().execute(url);
}
}
@Override
protected Dialog onCreateDialog(int id) {
switch (id) {
case DIALOG_DOWNLOAD_PROGRESS:
mProgressDialog = new ProgressDialog(this);
mProgressDialog.setMessage("Downloading to SD: " + vidName + "\n...Please allow download to finish completely...");
mProgressDialog.setProgressStyle(ProgressDialog.STYLE_HORIZONTAL);
mProgressDialog.setCancelable(true);
mProgressDialog.show();
return mProgressDialog;
default:
return null;
}
}
class DownloadFileAsync extends AsyncTask<String, String, String> {
@Override
protected void onPreExecute() {
super.onPreExecute();
showDialog(DIALOG_DOWNLOAD_PROGRESS);
}
@Override
protected String doInBackground(String... aurl) {
int count;
try {
URL url = new URL(aurl[0]);
URLConnection conexion = url.openConnection();
conexion.connect();
int lengthOfFile = conexion.getContentLength();
Log.d("ANDRO_ASYNC", "Length of file: " + lengthOfFile);
InputStream input = new BufferedInputStream(url.openStream(), 1024);
OutputStream output = new FileOutputStream("/sdcard/" + fName);
byte data[] = new byte[1024];
long total = 0;
while ((count = input.read(data)) != -1) {
total += count;
publishProgress(""+(int)((total*100)/lengthOfFile));
output.write(data, 0, count);
}
output.flush();
output.close();
input.close();
} catch (Exception e) {}
return null;
}
protected void onProgressUpdate(String... progress) {
Log.d("ANDRO_ASYNC",progress[0]);
mProgressDialog.setProgress(Integer.parseInt(progress[0]));
}
@Override
protected void onPostExecute(String unused) {
dismissDialog(DIALOG_DOWNLOAD_PROGRESS);
}
}
}
In operation, a spinner is displayed and the user can select a video. The code then checks whether the file exists on the SD card; if it does not, it starts the download and a progress bar is displayed. Here is the problem: at that time a message pops up that says:
CANNOT PLAY VIDEO
Sorry, this video cannot be played.
The download is progressing in the background and I can see the progress bar, but it is dimmed.
I do not want this message to appear.
I want to catch this and do nothing, so the message will not appear.
In the code I run the loadMedia() function and then set the video to play. I need to try/catch this step, looking for this error, but I do not know what to look for.
in my logcat I see this
02-02 09:35:29.257: W/MediaPlayer(13311): info/warning (1, 26)
02-02 09:35:29.257: E/MediaPlayer(13311): error (1, -4)
02-02 09:35:29.277: I/MediaPlayer(13311): Info (1,26)
02-02 09:35:29.277: E/MediaPlayer(13311): Error (1,-4)
02-02 09:35:29.277: D/VideoView(13311): Error: 1,-4
I think the error is included here, but I don't know how to translate this into a valid try/catch routine.
Hopefully someone can help.
To catch an exception, an exception must be thrown. It looks like you're already catching the base-level Exception, which would catch any uncaught exception from the media-player layer; therefore, the framework isn't throwing any that it isn't catching itself.
So this isn't about catching exceptions at this point, but about registering to receive the error notifications the framework provides. Looking here shows a way to register a listener for errors. Whether that means the error won't show up as you've described is unknown; I suspect it'll still show, and you may have no control over that.
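A minimal sketch of such a listener on the VideoView from the question: returning true from onError tells the framework the error was handled, which is what normally suppresses the "Cannot play video" dialog (whether every OEM skin honors this is the open question above).

```java
// Returning true claims the error, suppressing VideoView's default dialog.
videoView.setOnErrorListener(new MediaPlayer.OnErrorListener() {
    @Override
    public boolean onError(MediaPlayer mp, int what, int extra) {
        Log.d(TAG, "VideoView error: " + what + "," + extra); // e.g. 1,-4 from the logcat above
        return true; // handled: do nothing, so no message appears
    }
});
```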

Playing audio is too slow in Android

I'm having a problem with Android's MediaPlayer in that it is too slow when calling the prepare method. I've tried to simply keep a Vector of the few MediaPlayer objects (with their preloaded data sources) but calling .start() multiple times results in weird issues.
The first issue is it will skip every other play, and sometimes the play will be half (or less) as loud.
The tones played are very very short but need to be played as quickly as possible. My source code is posted below.
Any help is greatly appreciated.
Kevin
package com.atClass.lemon;
import java.util.Vector;
import com.atClass.cardShoe.SettingTools.SETTING_PREF;
import com.atClass.cardShoe.SettingTools.SETTING_STUB;
import com.atClass.cardShoe.SettingTools.SETTING_VALUE;
import android.content.res.AssetFileDescriptor;
import android.media.MediaPlayer;
import android.media.MediaPlayer.OnCompletionListener;
import android.net.Uri;
import android.util.Config;
import android.util.Log;
public class MediaHandler {
public static int cRepeat;
public static float cVolume = Integer.valueOf(Prefs.cPrefsGet.getString(SETTING_PREF.annunciator_volume.name()+SETTING_STUB._int.name(), PrefDefaults.getDefault(SETTING_PREF.annunciator_volume,SETTING_STUB._int)));
public static boolean cVolumeEnabled = !(Prefs.cPrefsGet.getString(SETTING_PREF.annunciator_volume.name()+SETTING_STUB._value.name(),PrefDefaults.getDefault(SETTING_PREF.annunciator_volume)).equals(SETTING_VALUE.disabled.name()));
static Vector <MediaPlayer> cQuickMediaPlayerList = new Vector<MediaPlayer>();
public static enum AUDIO_CLIP {
app_boot_sound(R.raw.windows_hardware_insert),
app_results_sound(R.raw.windows_exclamation),
app_warning_sound(R.raw.windows_hardware_fail),
app_card_draw_sound(R.raw.fs_beep5),
app_lid_open_sound(R.raw.windows_hardware_fail),
app_top_tigger_overdraw_sound(R.raw.fs_beep6),
test(R.raw.fs_beep4);
private int enumResourceId;
AUDIO_CLIP(int input){ enumResourceId = input;}
int getItem(){return enumResourceId;}
}
public static int getAudioClipIndex(AUDIO_CLIP iAudioClip){
for (int i=0; i<AUDIO_CLIP.values().length; i++){
if (AUDIO_CLIP.values()[i] == iAudioClip){
return i;
}
}
return 0;
}
public static void setupQuickMediaPlayer(){
cQuickMediaPlayerList.clear();
for (int i=0; i<AUDIO_CLIP.values().length; i++){
MediaPlayer lMediaPlayer = new MediaPlayer();
final AssetFileDescriptor afd = Global.gContext.getResources().openRawResourceFd(AUDIO_CLIP.values()[i].getItem());
try{
lMediaPlayer.setDataSource(afd.getFileDescriptor(), afd.getStartOffset(), afd.getLength());
afd.close();
lMediaPlayer.prepare();
}catch(Exception e){}
lMediaPlayer.setVolume(cVolume,cVolume);
lMediaPlayer.setLooping(false);
lMediaPlayer.setOnCompletionListener(new OnCompletionListener(){
@Override
public void onCompletion(MediaPlayer lMediaPlayer) {
lMediaPlayer.release();
try{lMediaPlayer.prepare();}catch(Exception e){e.printStackTrace();}
}});
cQuickMediaPlayerList.add(lMediaPlayer);
}
}
public static void playAudio(AUDIO_CLIP iAudioClip){
float volume = cVolume;
volume++;
volume /= 10;
playAudio(iAudioClip,volume);
}
public static void playAudio(final AUDIO_CLIP iAudioClip, final float iVolume){
Thread lThread = new Thread(new Runnable(){
public void run() {
//int resourceId = iAudioClip.getItem();
Log.d(Global.TAG,"--> Playing audio clip: " + iAudioClip.name() + "," + iAudioClip.getItem() + "," + getAudioClipIndex(iAudioClip));
if (cVolumeEnabled == true){
//Log.d(Global.TAG,"--> Supplying volume: " + iVolume);
//Works but is too slow
// try {
// final MediaPlayer lMediaPlayer = new MediaPlayer();
// AssetFileDescriptor afd = Global.gContext.getResources().openRawResourceFd(iAudioClip.getItem());
// lMediaPlayer.setDataSource(afd.getFileDescriptor(), afd.getStartOffset(), afd.getLength());
// afd.close();
// lMediaPlayer.prepare();
// lMediaPlayer.setVolume(iVolume,iVolume);
// lMediaPlayer.setLooping(false);
// lMediaPlayer.setOnCompletionListener(new OnCompletionListener(){
// @Override
// public void onCompletion(MediaPlayer arg0) {
// lMediaPlayer.release();
// }});
// lMediaPlayer.start();
// }catch(Exception e){}
try{
//Works half the time
cQuickMediaPlayerList.get(getAudioClipIndex(iAudioClip)).start();
}catch(Exception e){}
}
}
});
lThread.setPriority(Thread.MAX_PRIORITY);
lThread.start();
}
}
You should use SoundPool instead: http://developer.android.com/reference/android/media/SoundPool.html
In your onCompletionListener, you call release(), followed by prepare(). This is illegal, and is probably why you're having problems starting them multiple times. If you want to call it again, don't use release(), because that frees all the resources for the MP, and should only be called when you are done with it. Use stop() instead.
However, this still won't speed up the prepare(). You might want to try seekTo(0) instead, but even then it might not be as fast as you want. It really depends on how fast you're talking about.
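A minimal SoundPool sketch of the suggestion above, using one of the question's own resource IDs. The constructor shown is the pre-API-21 form, matching the era of this code; the listener-based flow is an assumption about how you'd wire it into playAudio().

```java
// SoundPool keeps short clips decoded in memory, so play() has near-zero latency.
SoundPool pool = new SoundPool(4, AudioManager.STREAM_MUSIC, 0); // up to 4 overlapping streams
final int beepId = pool.load(context, R.raw.fs_beep5, 1);        // loads asynchronously
pool.setOnLoadCompleteListener(new SoundPool.OnLoadCompleteListener() {
    @Override
    public void onLoadComplete(SoundPool soundPool, int sampleId, int status) {
        if (status == 0) { // 0 = success; safe to play from now on
            soundPool.play(beepId, 1.0f, 1.0f, 1, 0, 1.0f); // Lvol, Rvol, priority, no loop, rate
        }
    }
});
```

Load all the AUDIO_CLIP entries once at startup (as setupQuickMediaPlayer() does now) and replace each MediaPlayer.start() with a pool.play() call; there is no prepare() step to repeat between plays.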
