Can I create an .srt file from an mp3 programmatically? Below is my code:
holder.llView.setOnClickListener(new View.OnClickListener() {
@TargetApi(Build.VERSION_CODES.JELLY_BEAN)
@Override
public void onClick(View v) {
MediaPlayer mediaPlayer = new MediaPlayer();
try {
mediaPlayer.setDataSource(filePathList.get(position));
mediaPlayer.prepare();
//mediaPlayer.addTimedTextSource(subTitleSrc, MediaPlayer.MEDIA_MIMETYPE_TEXT_SUBRIP);
//int textTrackIndex = findTrackIndexFor(
//MediaPlayer.TrackInfo.MEDIA_TRACK_TYPE_TIMEDTEXT,
//mediaPlayer.getTrackInfo());
//if (textTrackIndex >= 0) {
// mediaPlayer.selectTrack(textTrackIndex);
//} else {
// Log.w("test", "Cannot find text track!");
//}
// if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.JELLY_BEAN) {
// mediaPlayer.setOnTimedTextListener(new MediaPlayer.OnTimedTextListener() {
//@Override
//public void onTimedText(final MediaPlayer mediaPlayer, final TimedText timedText) {
//if (timedText != null) {
//Log.d("test", "subtitle: " + timedText.getText());
// }
// }
// });
// }
mediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
} catch (IOException e) {
e.printStackTrace();
}
mediaPlayer.start();
}
});
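For reference, the findTrackIndexFor helper referenced in the commented-out block above can be implemented roughly like this (a sketch of the usual approach, not part of the original code):
private int findTrackIndexFor(int mediaTrackType, MediaPlayer.TrackInfo[] trackInfo) {
    // Return the index of the first track of the requested type, or -1 if none exists.
    for (int i = 0; i < trackInfo.length; i++) {
        if (trackInfo[i].getTrackType() == mediaTrackType) {
            return i;
        }
    }
    return -1;
}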
Below is my code for the ffmpeg command:
String cmd= "ffmpeg -i"+ filePathList.get(position).substring(filePathList.get(position).lastIndexOf("/")+1);
FFmpeg ffmpeg = FFmpeg.getInstance(CountryListActivity.this);
try {
// to execute "ffmpeg -version" command you just need to pass "-version"
ffmpeg.execute(new String[]{cmd}, new ExecuteBinaryResponseHandler() {
@Override
public void onStart() {
Log.e("!!!!!!!!!",""+filePathList.get(position));
}
@Override
public void onProgress(String message) {
Log.e("message..............",""+message);
}
@Override
public void onFailure(String message) {
Log.e("messageeeeeeeee..............",""+message);
}
@Override
public void onSuccess(String message) {
Log.e("messageeeeeeewwwwwwwwwwwee..............",""+message);
}
@Override
public void onFinish() {
Log.e("messagewwwewreeeeeeee..............","");
}
});
} catch (FFmpegCommandAlreadyRunningException e) {
// Handle if FFmpeg is already running
e.printStackTrace();
}
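A side note on the command above: the FFmpeg wrapper expects the command split into individual arguments (one array element per flag or value), and ffmpeg can only extract a subtitle track that already exists in the container; it cannot transcribe speech from an mp3, which is what the answer below addresses. A rough sketch of how the call would typically look, assuming the input actually contains an embedded subtitle stream (the output path is a placeholder):
final String inputPath = filePathList.get(position);
final String outputSrt = inputPath.substring(0, inputPath.lastIndexOf('.')) + ".srt";
// Each flag and value is its own array element; "ffmpeg" itself is not passed.
String[] args = new String[]{"-i", inputPath, "-map", "0:s:0", outputSrt};
try {
    ffmpeg.execute(args, new ExecuteBinaryResponseHandler() {
        @Override
        public void onFailure(String message) {
            Log.e("ffmpeg", "Subtitle extraction failed: " + message);
        }
        @Override
        public void onSuccess(String message) {
            Log.d("ffmpeg", "Subtitle track written to " + outputSrt);
        }
    });
} catch (FFmpegCommandAlreadyRunningException e) {
    e.printStackTrace();
}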
What you are basically looking for is Speech to Text. As you've mentioned in your question, you already have recorded audio, so you can use Google's Speech-to-Text API to get text from the audio. Here is a sample from Google:
// Imports the Google Cloud client library
import com.google.cloud.speech.v1.RecognitionAudio;
import com.google.cloud.speech.v1.RecognitionConfig;
import com.google.cloud.speech.v1.RecognitionConfig.AudioEncoding;
import com.google.cloud.speech.v1.RecognizeResponse;
import com.google.cloud.speech.v1.SpeechClient;
import com.google.cloud.speech.v1.SpeechRecognitionAlternative;
import com.google.cloud.speech.v1.SpeechRecognitionResult;
import com.google.protobuf.ByteString;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.List;
public class QuickstartSample {
/**
* Demonstrates using the Speech API to transcribe an audio file.
*/
public static void main(String... args) throws Exception {
// Instantiates a client
try (SpeechClient speechClient = SpeechClient.create()) {
// The path to the audio file to transcribe
String fileName = "./resources/audio.raw";
// Reads the audio file into memory
Path path = Paths.get(fileName);
byte[] data = Files.readAllBytes(path);
ByteString audioBytes = ByteString.copyFrom(data);
// Builds the sync recognize request
RecognitionConfig config = RecognitionConfig.newBuilder()
.setEncoding(AudioEncoding.LINEAR16)
.setSampleRateHertz(16000)
.setLanguageCode("en-US")
.build();
RecognitionAudio audio = RecognitionAudio.newBuilder()
.setContent(audioBytes)
.build();
// Performs speech recognition on the audio file
RecognizeResponse response = speechClient.recognize(config, audio);
List<SpeechRecognitionResult> results = response.getResultsList();
for (SpeechRecognitionResult result : results) {
// There can be several alternative transcripts for a given chunk of speech. Just use the
// first (most likely) one here.
SpeechRecognitionAlternative alternative = result.getAlternativesList().get(0);
System.out.printf("Transcription: %s%n", alternative.getTranscript());
}
}
}
}
To add it to your Android project, you can follow the Google Cloud setup instructions or directly download the JAR.
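To get from a transcription to an actual .srt file, you can request word-level time offsets and format the results yourself. Below is a rough sketch (not production code), assuming the v1 client shown above with .setEnableWordTimeOffsets(true) added to the RecognitionConfig and com.google.cloud.speech.v1.WordInfo imported; grouping one cue per recognition result is a simplification:
// Formats word-level results into SRT cues, one cue per recognition result.
static String toSrt(List<SpeechRecognitionResult> results) {
    StringBuilder srt = new StringBuilder();
    int cue = 1;
    for (SpeechRecognitionResult result : results) {
        SpeechRecognitionAlternative alt = result.getAlternativesList().get(0);
        List<WordInfo> words = alt.getWordsList();
        if (words.isEmpty()) {
            continue;
        }
        srt.append(cue++).append('\n');
        srt.append(formatTime(words.get(0).getStartTime()))
           .append(" --> ")
           .append(formatTime(words.get(words.size() - 1).getEndTime()))
           .append('\n');
        srt.append(alt.getTranscript().trim()).append("\n\n");
    }
    return srt.toString();
}
// Converts a protobuf Duration offset into the SRT timestamp format HH:MM:SS,mmm.
static String formatTime(com.google.protobuf.Duration d) {
    long millis = d.getSeconds() * 1000 + d.getNanos() / 1000000;
    return String.format("%02d:%02d:%02d,%03d",
            millis / 3600000, (millis % 3600000) / 60000, (millis % 60000) / 1000, millis % 1000);
}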
While using MediaRecorder, we don't have pause/resume for API levels below 24.
So one way to do this is:
On pause event stop the recorder and create the recorded file.
And on resume start recording again and create another file and keep doing so until user presses stop.
And at last merge all files.
Many people have asked this question on SO, but I couldn't find any way to solve it. People talk about creating multiple media files by stopping recording on the pause action and restarting on resume. So my question is: how can we merge/join all the media files programmatically?
Note: in my case it is an MPEG-4 container - m4a for audio and mp4 for video.
I tried using SequenceInputStream to merge the InputStreams of the generated recording files, but the result always contains only the first file.
Code Snippet:
Enumeration<InputStream> enu = Collections.enumeration(inputStreams);
SequenceInputStream sqStream = new SequenceInputStream(enu);
while ((oneByte = sqStream.read(buffer)) != -1) {
fileOutputStream.write(buffer, 0, oneByte);
}
sqStream.close();
while (enu.hasMoreElements()) {
InputStream element = enu.nextElement();
element.close();
}
fileOutputStream.flush();
fileOutputStream.close();
I was able to solve this problem using the mp4parser library. Many thanks to the author of this library :)
Add below dependency in your gradle file:
compile 'com.googlecode.mp4parser:isoparser:1.0.2'
The solution is to stop the recorder when the user pauses and start it again on resume, as already mentioned in many other answers on Stack Overflow. Store all the generated audio/video files in an array and use the method below to merge them. The example is also taken from the mp4parser library and modified slightly for my needs.
public static boolean mergeMediaFiles(boolean isAudio, String sourceFiles[], String targetFile) {
try {
String mediaKey = isAudio ? "soun" : "vide";
List<Movie> listMovies = new ArrayList<>();
for (String filename : sourceFiles) {
listMovies.add(MovieCreator.build(filename));
}
List<Track> listTracks = new LinkedList<>();
for (Movie movie : listMovies) {
for (Track track : movie.getTracks()) {
if (track.getHandler().equals(mediaKey)) {
listTracks.add(track);
}
}
}
Movie outputMovie = new Movie();
if (!listTracks.isEmpty()) {
outputMovie.addTrack(new AppendTrack(listTracks.toArray(new Track[listTracks.size()])));
}
Container container = new DefaultMp4Builder().build(outputMovie);
FileChannel fileChannel = new RandomAccessFile(String.format(targetFile), "rw").getChannel();
container.writeContainer(fileChannel);
fileChannel.close();
return true;
}
catch (IOException e) {
Log.e(LOG_TAG, "Error merging media files. exception: "+e.getMessage());
return false;
}
}
Pass the isAudio flag as true for audio files and false for video files.
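A quick usage sketch (the file paths are placeholders):
String[] recordedParts = new String[]{
        "/sdcard/rec_part1.m4a",
        "/sdcard/rec_part2.m4a",
        "/sdcard/rec_part3.m4a"
};
boolean merged = mergeMediaFiles(true, recordedParts, "/sdcard/rec_merged.m4a");
if (!merged) {
    Log.e(LOG_TAG, "Could not merge the recorded segments");
}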
Another solution is to merge with FFmpeg.
Add this line to your app build.gradle
implementation 'com.writingminds:FFmpegAndroid:0.3.2'
And use the code below to merge videos.
String textFile = "";
try {
textFile = getTextFile().getAbsolutePath();
} catch (IOException e) {
e.printStackTrace();
}
String[] cmd = new String[]{
"-y",
"-f",
"concat",
"-safe",
"0",
"-i",
textFile,
"-c",
"copy",
"-preset",
"ultrafast",
getVideoFilePath()};
mergeVideos(cmd);
getTextFile()
private File getTextFile() throws IOException {
videoFiles = new String[]{firstPath, secondPath, thirdPatch};
File file = new File(getActivity().getExternalFilesDir(null), System.currentTimeMillis() + "inputFiles.txt");
FileOutputStream out = new FileOutputStream(file, false);
PrintWriter writer = new PrintWriter(out);
StringBuilder builder = new StringBuilder();
for (String path : videoFiles) {
if (path != null) {
builder.append("file ");
builder.append("\'");
builder.append(path);
builder.append("\'\n");
}
}
builder.deleteCharAt(builder.length() - 1);
String text = builder.toString();
writer.print(text);
writer.close();
out.close();
return file;
}
getVideoFilePath()
private String getVideoFilePath() {
final File dir = getActivity().getExternalFilesDir(null);
return (dir == null ? "" : (dir.getAbsolutePath() + "/"))
+ System.currentTimeMillis() + ".mp4";
}
mergeVideos()
private void mergeVideos(String[] cmd) {
FFmpeg ffmpeg = FFmpeg.getInstance(getActivity());
try {
ffmpeg.execute(cmd, new ExecuteBinaryResponseHandler() {
@Override
public void onStart() {
startTime = System.currentTimeMillis();
}
@Override
public void onProgress(String message) {
}
@Override
public void onFailure(String message) {
Toast.makeText(getActivity(), "Failed " + message, Toast.LENGTH_SHORT).show();
}
@Override
public void onSuccess(String message) {
}
@Override
public void onFinish() {
Toast.makeText(getActivity(), "Videos are merged", Toast.LENGTH_SHORT).show();
}
});
} catch (FFmpegCommandAlreadyRunningException e) {
// Handle if FFmpeg is already running
}
}
Run this code before merging
private void checkFfmpegSupport() {
FFmpeg ffmpeg = FFmpeg.getInstance(this);
try {
ffmpeg.loadBinary(new LoadBinaryResponseHandler() {
@Override
public void onStart() {
}
@Override
public void onFailure() {
Toast.makeText(VouchActivity.this, "FFmpeg not supported on this device :(", Toast.LENGTH_SHORT).show();
}
@Override
public void onSuccess() {
}
@Override
public void onFinish() {
}
});
} catch (FFmpegNotSupportedException e) {
// Handle if FFmpeg is not supported by device
}
}
I am using the Drive API to create a database file in the hidden app folder on Google Drive. The database file is called notes.db. I have been able to successfully upload the database file to Google Drive, but I have no idea how to download it back to the user's device. This is what I'm trying to do: my app makes a folder on the user's device called School Binder, and in that folder is another folder called Note Backups. This is where I back up the database. The directory is
Environment.getExternalStorageDirectory() + "/School Binder/Note Backups/Notes.db"
Google Drive takes this file and uploads it to the hidden app folder. Now I want to take this notes.db file stored in that app folder on Google Drive and download it back to this directory on the phone.
Environment.getExternalStorageDirectory() + "/School Binder/Note Backups/Notes.db"
How do I do this? Thanks. Here is my code for uploading the database to Drive; this works correctly:
// Define And Instantiate Variable DriveContents driveContents//
DriveContents driveContents = result.getStatus().isSuccess() ? result.getDriveContents() : null;
// Gets The Data for The File//
if (driveContents != null) try {
// Define And Instantiate Variable OutputStream outputStream//
OutputStream outputStream = driveContents.getOutputStream();
// Start Writing Data To File//
if (outputStream != null) try {
// Define And Instantiate Variable InputStream inputStream//
InputStream inputStream = new FileInputStream(dbFile);
// Define And Instantiate Variable Byte buffer//
byte[] buffer = new byte[5000];
// Define Variable Int data//
int data;
// Run Code While data Is Bigger Then Zero//
while ((data = inputStream.read(buffer, 0, buffer.length)) > 0) {
// Write To outputStream//
outputStream.write(buffer, 0, data);
// Flush outputStream//
outputStream.flush();
}
} finally {
// Close outputStream//
outputStream.close();
}
} catch (Exception e) {e.printStackTrace(); Toast.makeText(getApplicationContext(), "Failed To Upload: No Backup File Found", Toast.LENGTH_LONG).show(); return;}
How do I change this to make it work for downloading data from Google Drive to a file?
As described in Lifecycle of a Drive file, the Drive Android API lets your app access files even if the device is offline. To support offline cases, the API implements a sync engine, which runs in the background to upstream and downstream changes as network access becomes available and to resolve conflicts. If the file is not yet synced to the local context but the user wants to open it, the API performs the initial download request automatically.
To download a file, you make an authorized HTTP GET request to the file's resource URL and include the query parameter alt=media. However, please note that downloading the file requires the user to have at least read access.
Sample HTTP Request:
GET https://www.googleapis.com/drive/v3/files/0B9jNhSvVjoIVM3dKcGRKRmVIOVU?alt=media
Authorization: Bearer ya29.AHESVbXTUv5mHMo3RYfmS1YJonjzzdTOFZwvyOAUVhrs
For the coding part, this SO post might be of help too.
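For illustration, a minimal sketch of that request from Java, assuming you already have the file ID and a valid OAuth access token (both placeholders here), you run it off the main thread, and error handling is left to your own code; it simply streams the response body to the local backup path:
String fileId = "YOUR_FILE_ID";           // placeholder: from your own Drive query
String accessToken = "YOUR_ACCESS_TOKEN"; // placeholder: from your own auth flow
URL url = new URL("https://www.googleapis.com/drive/v3/files/" + fileId + "?alt=media");
HttpURLConnection conn = (HttpURLConnection) url.openConnection();
conn.setRequestProperty("Authorization", "Bearer " + accessToken);
File target = new File(Environment.getExternalStorageDirectory()
        + "/School Binder/Note Backups/Notes.db");
try (InputStream in = conn.getInputStream();
     OutputStream out = new FileOutputStream(target)) {
    byte[] buffer = new byte[8192];
    int read;
    // Copy the response body straight into the local backup file.
    while ((read = in.read(buffer)) != -1) {
        out.write(buffer, 0, read);
    }
} finally {
    conn.disconnect();
}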
I figured it out. This is my code to re-download a database back to the phone:
//<editor-fold desc="Create Drive Db File On Device">
// Log That The File Was Opened//
Log.d("TAG", "File contents opened");
// Define And Instantiate Variable DriveContents driveContents//
DriveContents driveContents = result.getStatus().isSuccess() ? result.getDriveContents() : null;
// Gets The Data for The File//
if (driveContents != null) try {
// Define And Instantiate Variable OutputStream outputStream//
OutputStream outputStream = new FileOutputStream(dbFile);
// Define And Instantiate Variable InputStream inputStream//
InputStream inputStream = driveContents.getInputStream();
// Define And Instantiate Variable Byte buffer//
byte[] buffer = new byte[5000];
// Define Variable Int data//
int data;
// Run Code While data Is Bigger Then Zero//
while ((data = inputStream.read(buffer, 0, buffer.length)) > 0) {
// Write To outputStream//
outputStream.write(buffer, 0, data);
// Flush outputStream//
outputStream.flush();
}
// Close outputStream//
outputStream.close();
// Discard Drive Contents//
driveContents.discard(googleApiClient);
} catch (Exception e) {e.printStackTrace(); Toast.makeText(getApplicationContext(), "File Failed To Download", Toast.LENGTH_LONG).show(); }
//</editor-fold>
Here is a complete class to upload an internal database to Google Drive, download it, and delete it from Google Drive.
You only need to call the functions asynchronously and show the user a progress bar.
The DownloadFromGoogleDrive function saves the database to the app's internal database folder with the name "database2".
Hope it's helpful.
import java.io.BufferedInputStream;
import java.io.BufferedOutputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import com.google.android.gms.common.ConnectionResult;
import com.google.android.gms.common.api.GoogleApiClient;
import com.google.android.gms.common.api.GoogleApiClient.ConnectionCallbacks;
import com.google.android.gms.common.api.GoogleApiClient.OnConnectionFailedListener;
import com.google.android.gms.common.api.ResultCallback;
import com.google.android.gms.common.api.Status;
import com.google.android.gms.drive.Drive;
import com.google.android.gms.drive.DriveApi;
import com.google.android.gms.drive.DriveApi.MetadataBufferResult;
import com.google.android.gms.drive.DriveFile.DownloadProgressListener;
import com.google.android.gms.drive.DriveId;
import com.google.android.gms.drive.DriveResource;
import com.google.android.gms.drive.Metadata;
import com.google.android.gms.drive.DriveApi.DriveContentsResult;
import com.google.android.gms.drive.DriveContents;
import com.google.android.gms.drive.DriveFile;
import com.google.android.gms.drive.DriveFolder.DriveFileResult;
import com.google.android.gms.drive.MetadataChangeSet;
import com.google.android.gms.drive.query.Filters;
import com.google.android.gms.drive.query.Query;
import com.google.android.gms.drive.query.SearchableField;
import android.app.Activity;
import android.content.Intent;
import android.content.IntentSender.SendIntentException;
import android.os.Bundle;
import android.util.Log;
import android.webkit.MimeTypeMap;
import android.widget.Toast;
public class BackupDatabaseActivity extends Activity implements ConnectionCallbacks, OnConnectionFailedListener {
private static final String TAG = "BackupDatabaseActivity";
private GoogleApiClient api;
private boolean mResolvingError = false;
private static final int DIALOG_ERROR_CODE =100;
private static final String DATABASE_NAME = "database";
private static final String GOOGLE_DRIVE_FILE_NAME = "database_backup";
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
// Create the Drive API instance
api = new GoogleApiClient.Builder(this).addApi(Drive.API).addScope(Drive.SCOPE_FILE).
addConnectionCallbacks(this).addOnConnectionFailedListener(this).build();
}
final private ResultCallback<DriveApi.DriveContentsResult> contentsCallback = new ResultCallback<DriveApi.DriveContentsResult>() {
@Override
public void onResult(DriveApi.DriveContentsResult result) {
if (!result.getStatus().isSuccess()) {
Log.v(TAG, "Error while trying to create new file contents");
return;
}
CreateFileOnGoogleDrive(result);
//OR DownloadFromGoogleDrive(result);
//OR DeleteFromGoogleDrive(result);
}
};
final private ResultCallback<DriveFileResult> fileCallback = new ResultCallback<DriveFileResult>() {
@Override
public void onResult(DriveFileResult result) {
if (!result.getStatus().isSuccess()) {
Log.v(TAG, "Error while trying to create the file");
return;
}
Log.v(TAG, "File created: "+result.getDriveFile().getDriveId());
}
};
/**
* Create a file in root folder using MetadataChangeSet object.
* @param result
*/
public void CreateFileOnGoogleDrive(DriveContentsResult result){
final DriveContents driveContents = result.getDriveContents();
// Perform I/O off the UI thread.
new Thread() {
@Override
public void run() {
try {
FileInputStream is = new FileInputStream(getDbPath());
BufferedInputStream in = new BufferedInputStream(is);
byte[] buffer = new byte[8 * 1024];
BufferedOutputStream out = new BufferedOutputStream(driveContents.getOutputStream());
int n = 0;
while( ( n = in.read(buffer) ) > 0 ) {
out.write(buffer, 0, n);
}
out.flush();
out.close();
in.close();
} catch (FileNotFoundException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
String mimeType = MimeTypeMap.getSingleton().getExtensionFromMimeType("db");
MetadataChangeSet changeSet = new MetadataChangeSet.Builder()
.setTitle(GOOGLE_DRIVE_FILE_NAME) // Google Drive File name
.setMimeType(mimeType)
.setStarred(true).build();
// create a file in root folder
Drive.DriveApi.getRootFolder(api)
.createFile(api, changeSet, driveContents)
.setResultCallback(fileCallback);
}
}.start();
}
/**
* Download File from Google Drive
* @param result
*/
public void DownloadFromGoogleDrive(DriveContentsResult result){
final DriveContents driveContents = result.getStatus().isSuccess() ? result.getDriveContents() : null;
if(driveContents!=null){
Query query = new Query.Builder().addFilter(Filters.eq(SearchableField.TITLE, GOOGLE_DRIVE_FILE_NAME)).build();
Drive.DriveApi.query(api, query).setResultCallback(new ResultCallback<MetadataBufferResult>() {
@Override
public void onResult(MetadataBufferResult result) {
try{
DriveId driveId = result.getMetadataBuffer().get(0).getDriveId();
DriveFile driveFile = driveId.asDriveFile();
//mProgressBar.setProgress(0);
DownloadProgressListener listener = new DownloadProgressListener() {
@Override
public void onProgress(long bytesDownloaded, long bytesExpected) {
// Update progress dialog with the latest progress.
int progress = (int)(bytesDownloaded*100/bytesExpected);
Log.d(TAG, String.format("Loading progress: %d percent", progress));
// mProgressBar.setProgress(progress);
}
};
driveFile.open(api, DriveFile.MODE_READ_ONLY, listener).setResultCallback(driveContentsCallback);
}catch(Exception e){
Toast.makeText(getApplicationContext(), "File Failed To Download", Toast.LENGTH_LONG).show();
}
}
});
}else{
Toast.makeText(getApplicationContext(), "File Failed To Download", Toast.LENGTH_LONG).show();
}
}
private ResultCallback<DriveContentsResult> driveContentsCallback =
new ResultCallback<DriveContentsResult>() {
@Override
public void onResult(DriveContentsResult result) {
if (!result.getStatus().isSuccess()) {
Log.d(TAG, "Error while opening the file contents");
return;
}
Log.d(TAG, "Downloaded");
DriveContents dc = result.getDriveContents();
try {
InputStream inputStream = dc.getInputStream();
OutputStream outputStream = new FileOutputStream(getDbPath()+"2");
byte[] buffer = new byte[8 * 1024];
//BufferedOutputStream out = new BufferedOutputStream(dc.getOutputStream());
int n = 0;
while( ( n = inputStream.read(buffer) ) > 0 ) {
outputStream.write(buffer, 0, n);
}
outputStream.flush();
outputStream .close();
//inputStream.close();
dc.discard(api);
} catch (FileNotFoundException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
};
/**
* Delete File from Google Drive
* @param result
*/
public void DeleteFromGoogleDrive(DriveContentsResult result){
Query query = new Query.Builder()
.addFilter(Filters.eq(SearchableField.TITLE, GOOGLE_DRIVE_FILE_NAME))
.build();
Drive.DriveApi.query(api, query)
.setResultCallback(new ResultCallback<MetadataBufferResult>() {
@Override
public void onResult(MetadataBufferResult result) {
try{
Metadata metadata = result.getMetadataBuffer().get(0);
/*String a = metadata.getTitle();
String b = metadata.getDescription();
long c = metadata.getFileSize();*/
DriveResource driveResource = metadata.getDriveId().asDriveResource();
if (metadata.isTrashable()) {
if (metadata.isTrashed()) {
driveResource.untrash(api).setResultCallback(trashStatusCallback);
} else {
driveResource.trash(api).setResultCallback(trashStatusCallback);
}
} else {
Log.d(TAG, "Error trying delete");
}
}catch(Exception e){
Log.d(TAG, "Error: metadata doesn't exist");
}
}
});
}
/**
* Callback when call to trash or untrash is complete.
*/
private final ResultCallback<Status> trashStatusCallback =
new ResultCallback<Status>() {
@Override
public void onResult(Status status) {
if (!status.isSuccess()) {
Log.e(TAG, "Error trying delete: " + status.getStatusMessage());
return;
}else{
Log.e(TAG, "Deleted: " + status.getStatusMessage());
}
}
};
private File getDbPath() {
return this.getDatabasePath(DATABASE_NAME);
}
@Override
public void onConnectionSuspended(int cause) {
// TODO Auto-generated method stub
Log.v(TAG, "Connection suspended");
}
@Override
public void onStart() {
super.onStart();
if(!mResolvingError) {
api.connect(); // Connect the client to Google Drive
}
}
@Override
public void onStop() {
super.onStop();
api.disconnect(); // Disconnect the client from Google Drive
}
@Override
public void onConnectionFailed(ConnectionResult result) {
Log.v(TAG, "Connection failed");
if(mResolvingError) { // If already in resolution state, just return.
return;
} else if(result.hasResolution()) { // Error can be resolved by starting an intent with user interaction
mResolvingError = true;
try {
result.startResolutionForResult(this, DIALOG_ERROR_CODE);
} catch (SendIntentException e) {
e.printStackTrace();
}
} else { // Error cannot be resolved. Display Error Dialog stating the reason if possible.
Toast.makeText(this, "Error: Connection failed", Toast.LENGTH_SHORT).show();
}
}
@Override
public void onConnected(Bundle connectionHint) {
Log.v(TAG, "Connected successfully");
/* Connection to Google Drive established. Now request for Contents instance, which can be used to provide file contents.
The callback is registered for the same. */
Drive.DriveApi.newDriveContents(api).setResultCallback(contentsCallback);
}
@Override
public void onActivityResult(int requestCode, int resultCode, Intent data) {
if(requestCode == DIALOG_ERROR_CODE) {
mResolvingError = false;
if(resultCode == RESULT_OK) { // Error was resolved, now connect to the client if not done so.
if(!api.isConnecting() && !api.isConnected()) {
api.connect();
}
}
}
}
}
I would like to write a WCF service where I use the Microsoft.Speech.Recognition library to build a speech-to-text service. Here is my service code:
public class Rozpoznawacz : IRozpoznawacz
{
public void AudioToText(Stream audioStr)
{
SpeechRecognitionEngine _sre = new SpeechRecognitionEngine(new System.Globalization.CultureInfo("pl-PL"));
// Create a simple grammar that recognizes the words
Choices words = new Choices();
// Add the words to be recognised
words.Add("red");
words.Add("green");
words.Add("blue");
words.Add("yellow");
words.Add("orange");
words.Add("Dzień dobry");
words.Add("Chrząszcz");
words.Add("Brzmi");
words.Add("w");
words.Add("trzcinie");
words.Add("Wystaw fakturę");
words.Add("Stefan Burczymucha");
GrammarBuilder gb = new GrammarBuilder();
gb.Culture = new System.Globalization.CultureInfo("pl-PL");
gb.Append(words);
// Create the actual Grammar instance, and then load it into the speech recognizer.
Grammar g = new Grammar(gb);
_sre.LoadGrammar(g);
// Register a handler for the SpeechRecognized event.
_sre.SpeechRecognized +=
new EventHandler<SpeechRecognizedEventArgs>(sre_SpeechRecognized);
//_sre.SetInputToDefaultAudioDevice();
_sre.SetInputToWaveStream(audioStr);
_sre.RecognizeAsync(RecognizeMode.Multiple);
}
void sre_SpeechRecognized(object sender, SpeechRecognizedEventArgs e)
{
string rozpoznanie = "";
rozpoznanie += e.Result.Text;
using (StreamWriter outfile = new StreamWriter(@"C:\Test.txt"))
{
outfile.Write(rozpoznanie);
}
}
public string Test(string query)
{
return string.Format("Przyjęto: {0}", query);
}
}
Next, I tried to write an Android app with service references. I tried to record voice and send it to the web service host, but I don't know how I should send the audio file. Here is my non-working code in the Android app:
using System;
using Android.App;
using Android.Content;
using Android.Runtime;
using Android.Views;
using Android.Widget;
using Android.OS;
using Android.Media;
using System.IO;
namespace RozpoznawanieMowyZdalne.Adroid
{
[Activity(Label = "RozpoznawanieMowyZdalne.Adroid", MainLauncher = true, Icon = "@drawable/icon")]
public class MainActivity : Activity
{
int count = 1;
MediaRecorder recorder;
MediaPlayer player;
Button btnStart;
Button btnStop;
string path = "/sdcard/test.3gpp";
private RozpoznawaczService.Rozpoznawacz client;
private TextView aLabel;
byte[] audioByte;
protected override void OnCreate(Bundle bundle)
{
base.OnCreate(bundle);
// Set our view from the "main" layout resource
SetContentView(Resource.Layout.Main);
// Get our button from the layout resource,
// and attach an event to it
btnStart = FindViewById<Button>(Resource.Id.btnStart);
btnStop = FindViewById<Button>(Resource.Id.btnStop);
btnStart.Click += delegate
{
client.TestAsync("Android");
btnStop.Enabled = !btnStop.Enabled;
btnStart.Enabled = !btnStart.Enabled;
recorder.SetAudioSource(AudioSource.VoiceRecognition);
recorder.SetOutputFormat(OutputFormat.ThreeGpp);
recorder.SetAudioEncoder(AudioEncoder.AmrNb);
recorder.SetOutputFile(path);
recorder.Prepare();
recorder.Start();
Toast.MakeText(this, "Rozpoczęto nagrywanie", ToastLength.Long).Show();
};
btnStop.Click += delegate
{
btnStop.Enabled = !btnStop.Enabled;
recorder.Stop();
Toast.MakeText(this, "Zakończono nagrywanie", ToastLength.Long).Show();
recorder.Reset();
player.SetDataSource(path);
player.Prepare();
player.Start();
File.WriteAllBytes(path, audioByte);
client.AudioToTextAsync(audioByte);
};
InitializeServiceClient();
}
protected override void OnResume()
{
base.OnResume();
recorder = new MediaRecorder();
player = new MediaPlayer();
player.Completion += (sender, e) =>
{
player.Reset();
btnStart.Enabled = !btnStart.Enabled;
};
}
protected override void OnPause()
{
base.OnPause();
player.Release();
recorder.Release();
player.Dispose();
recorder.Dispose();
player = null;
recorder = null;
}
private void InitializeServiceClient()
{
client = new RozpoznawaczService.Rozpoznawacz();
client.TestCompleted += client_TestCompleted;
client.AudioToTextCompleted += client_AudioToTextCompleted;
aLabel = FindViewById<TextView>(Resource.Id.textViewTest);
}
void client_AudioToTextCompleted(object sender, System.ComponentModel.AsyncCompletedEventArgs e)
{
string msg = null;
if (e.Error != null)
{
msg = e.Error.Message;
}
else if (e.Cancelled)
{
msg = "Request was cancelled.";
}
else
{
//msg = e.Result;
}
RunOnUiThread(() => aLabel.Text = "Wyslane");
}
void client_TestCompleted(object sender, RozpoznawaczService.TestCompletedEventArgs e)
{
string msg = null;
if (e.Error != null)
{
msg = e.Error.Message;
}
else if (e.Cancelled)
{
msg = "Request was cancelled.";
}
else
{
msg = e.Result;
}
RunOnUiThread(() => aLabel.Text = msg);
}
}
}
How can I send my audio file to my webService?
PS: File.WriteAllBytes(path, audioByte); - it doesn't work in the Android app...
I like the Android SoundPool class for its simplicity, and it works well with the standard audio files I am using in my app. Now I want to make it possible for the user to specify certain sounds by choosing audio files on the SD card. Unfortunately, I run into limitations of SoundPool; when the sound file is too big I get an
AudioFlinger could not create track. status: -12
response. It seems I have to switch to MediaPlayer, but before getting into the complexity of MediaPlayer again I wanted to ask if there is an audio library available for Android which
has the simplicity of SoundPool for playing various sounds
doesn't have the limitations of SoundPool regarding the size of the files.
Thank you very much.
martin
For now I came up with a very simple AudioPool class which plays audio added to it sequentially using the MediaPlayer class. This implementation is certainly not mature yet; I just thought I'd share it as it at least gives some idea of how this can be approached easily. If you see any problems with this class, please let us know.
Usage:
AudioPool ap = new AudioPool();
File root = Environment.getExternalStorageDirectory() ;
int id1 = ap.addAudio(root + "/gong1.mp3");
int id2 = ap.addAudio(root + "/gong2.mp3");
int id3 = ap.addAudio(root + "/gong3.mp3");
ap.playAudio(id1);
ap.playAudio(id3);
ap.playAudio(id3);
ap.playAudio(id2);
which will play gong1 -> gong3 -> gong3 -> gong2 in sequence. As this is basically what I need, I'll leave it here ...
import java.util.HashMap;
import java.util.LinkedList;
import java.util.Map;
import android.media.MediaPlayer;
import android.media.MediaPlayer.OnCompletionListener;
import android.util.Log;
public class AudioPool {
static String TAG = "AudioPool";
MediaPlayer mPlayer;
int mAudioCounter;
int mCurrentId;
HashMap<Integer, String> mAudioMap;
LinkedList<Integer> mAudioQueue;
public AudioPool() {
mAudioMap = new HashMap<Integer, String>();
mAudioQueue = new LinkedList<Integer>();
mAudioCounter = 0;
}
public int addAudio(String path) {
Log.d(TAG, "adding audio " + path + " to the pool");
if (mAudioMap.containsValue(path)) {
return getAudioKey(path);
}
mAudioCounter++;
mAudioMap.put(mAudioCounter, path);
return mAudioCounter;
}
public boolean playAudio(int id) {
if (mAudioMap.containsKey(id) == false) {
return false;
}
if (mPlayer == null) {
setupPlayer();
}
if (mPlayer.isPlaying() == false) {
return prepareAndPlayAudioNow(id);
} else {
Log.d(TAG, "adding audio " + id + " to the audio queue");
mAudioQueue.add(id);
}
return true;
}
public Integer[] getAudioIds() {
return (Integer[]) mAudioMap.keySet().toArray(
new Integer[mAudioMap.keySet().size()]);
}
public void releaseAudioPlayer() {
if (mPlayer != null) {
mPlayer.release();
mPlayer = null;
}
}
private boolean prepareAndPlayAudioNow(int id) {
mCurrentId = id;
try {
Log.d(TAG, "playing audio " + id + " now");
mPlayer.reset();
mPlayer.setDataSource(mAudioMap.get(id));
mPlayer.prepare();
mPlayer.start();
return true;
} catch (Exception e) {
Log.d(TAG, "problems playing audio " + e.getMessage());
return false;
}
}
private boolean playAudioAgainNow() {
try {
mPlayer.seekTo(0);
mPlayer.start();
return true;
} catch (Exception e) {
Log.d(TAG, "problems playing audio");
return false;
}
}
private void setupPlayer() {
mPlayer = new MediaPlayer();
mPlayer.setOnCompletionListener(new OnCompletionListener() {
@Override
public void onCompletion(MediaPlayer mp) {
audioDone();
}
});
}
private void audioDone() {
if (mAudioQueue.size() > 0) {
Log.d(TAG, mAudioQueue.size() + " audios in queue");
int nextId = mAudioQueue.removeFirst();
if (mCurrentId == nextId) {
playAudioAgainNow();
} else {
prepareAndPlayAudioNow(nextId);
}
} else {
releaseAudioPlayer();
}
}
private int getAudioKey(String path) {
for (Map.Entry<Integer, String> map : mAudioMap.entrySet()) {
if (map.getValue().compareTo(path) == 0) {
return map.getKey();
}
}
return -1;
}
}
Thanks to dorjeduck for the solution, but his class is based on MediaPlayer, which has huge latency.
What does that mean? It means that when you call these:
mPlayer.prepare();
mPlayer.start();
the delay before you actually hear the sound is very noticeable. For example, when you need to play one track and immediately play another, you will hear a delay even on high-end hardware.
The solution is to load all the bytes into memory before playing, and use AudioTrack to play those sound bytes.
I have written SoundPoolCompat, which uses AudioTrack under the hood. You can pass a custom bufferSize, and all data within that buffer will be loaded into memory and played with low latency, like SoundPool does. Any data that exceeds that bufferSize will be loaded on demand (which adds latency, similar to MediaPlayer). The API is very similar to SoundPool's, and it adds a feature to load sounds from a Uri (for example, Google Drive). There is also a playOnce method; all resources will be unloaded after the file is played.
implementation 'com.olekdia:sound-pool:3.0.2'
https://gitlab.com/olekdia/common/libraries/sound-pool
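If you prefer to roll this yourself instead of pulling in the library, the core idea is AudioTrack in static mode: load or decode the PCM bytes up front, hand them to the track once, and just call play(). A minimal sketch, assuming you already have raw 16-bit mono PCM at 44.1 kHz in memory (the decoding step and the loadPcmBytes helper are placeholders):
byte[] pcmData = loadPcmBytes(); // hypothetical helper that returns decoded 16-bit PCM
AudioTrack track = new AudioTrack(
        AudioManager.STREAM_MUSIC,
        44100,                          // sample rate of the decoded data
        AudioFormat.CHANNEL_OUT_MONO,
        AudioFormat.ENCODING_PCM_16BIT,
        pcmData.length,                 // static mode: the buffer holds the whole clip
        AudioTrack.MODE_STATIC);
track.write(pcmData, 0, pcmData.length); // upload the samples once
track.play();                            // near-instant start, SoundPool-like latency
// ... later, when the sound is no longer needed:
track.release();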
EDIT:
Android 2.2 MediaPlayer is working fine with one SHOUTcast URL but not with the other one
I need to play audio files from external URLs (a SHOUTcast stream). Currently the audio files are downloaded incrementally and are played as soon as we get enough audio in the phone's local temporary storage. I am using the StreamingMediaPlayer class.
Check this piece of code:
private MediaPlayer createMediaPlayer(File mediaFile)
throws IOException {
MediaPlayer mPlayer = new MediaPlayer();
//example of mediaFile =/data/data/package/cache/playingMedia0.dat
FileInputStream fis = new FileInputStream(mediaFile);
mPlayer.setDataSource(fis.getFD());
mPlayer.prepare();
return mPlayer;
}
Current status:
1- It works fine from Android 1.6 to 2.1 but not in the higher versions like Android 2.2.
2- The "mPlayer.setDataSource(fis.getFD())" is the line which throws the error.
3- The error is "Unable to to create media player"
Other solutions tried:
I tried the alternate solution below, but nothing has worked so far.
Android 2.2 MediaPlayer is working fine with one SHOUTcast URL but not with the other one
What am I looking for?
My goal is to have a piece of code which can work on Android 2.1 and higher.
This issue is also discussed here:
1- Inconsistent 2.2 Media Player Behavior
2- android code for streaming shoutcast stream breaks in 2.2
3- This issue is also discussed in a lot of questions on this site, but I found the answer nowhere.
4- markmail.org
LogCat trace:
Unable to to create media player
Error copying buffered conent.
java.lang.NullPointerException
com.ms.iradio.StreamingMediaPlayer.startMediaPlayer(StreamingMediaPlayer.java:251)
com.ms.iradio.StreamingMediaPlayer.access$2(StreamingMediaPlayer.java:221)
com.ms.iradio.StreamingMediaPlayer$2.run(StreamingMediaPlayer.java:204)
android.os.Handler.handleCallback(Handler.java:587)
android.os.Handler.dispatchMessage(Handler.java:92)
android.os.Looper.loop(Looper.java:123)
android.app.ActivityThread.main(ActivityThread.java:3683)
java.lang.reflect.Method.invokeNative(Native Method)
java.lang.reflect.Method.invoke(Method.java:507)
com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:839)
com.android.internal.os.ZygoteInit.main(ZygoteInit.java:597)
dalvik.system.NativeStart.main(Native Method)
The problem is that content type "audio/aacp" streaming is not supported directly. Some decoding libraries can be used to play "aacp"; please see the solution below:
Freeware Advanced Audio (AAC) Decoder for Android
How to use this library?
Consider legal issues while using it.
[T]he project http://code.google.com/p/aacplayer-android/ is licensed under GPL, so you can create commercial apps on top of it, but you need to fulfill the GPL - mainly it means you have to publish your code as well. If you use the second project http://code.google.com/p/aacdecoder-android/, then you do not need to publish your code (the library is licensed under LGPL).
The StreamingMediaPlayer class is using a double-buffering technique to get around limitations in pre-1.2 releases of Android. All production releases of Android OS have included a MediaPlayer that supports streaming media(1). I would recommend doing that rather than using this double-buffering technique to get around the problem.
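In other words, something along these lines is usually enough on a streaming-capable MediaPlayer (the URL is a placeholder):
MediaPlayer player = new MediaPlayer();
player.setAudioStreamType(AudioManager.STREAM_MUSIC);
player.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
    @Override
    public void onPrepared(MediaPlayer mp) {
        mp.start(); // starts once enough data has been buffered
    }
});
try {
    player.setDataSource("http://example.com/stream"); // placeholder stream URL
    player.prepareAsync(); // buffer in the background instead of blocking the UI thread
} catch (IOException e) {
    e.printStackTrace();
}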
Android OS 2.2 replaced the old media player code with the Stagefright player, which is probably behaving differently in this case.
The line numbers in your stack trace don't map to the file you link to, so I assume there's a different version that you're actually using. I'm going to guess that that NullPointerException is being reported by MediaPlayer but neither the FileInputStream nor the returned FileDescriptor can be null.
(1) Prior to version 2.2 the media player wouldn't recognize ShoutCast streams with an "ICY/1.1" version header in the response. By creating a proxy that replaces this with "HTTP/1.1" you can resolve that. See the StreamProxy class here for an example.
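A rough sketch of that proxy idea (the class name and structure are illustrative, not the linked StreamProxy): listen on a local port, forward the request upstream, rewrite the status line, and pipe the remaining bytes through unchanged.
import java.io.BufferedInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.ServerSocket;
import java.net.Socket;
public class IcyRewriteProxy implements Runnable {
    private final String remoteHost;
    private final int remotePort;
    private final String remotePath;
    private final ServerSocket serverSocket;
    public IcyRewriteProxy(String host, int port, String path) throws IOException {
        remoteHost = host;
        remotePort = port;
        remotePath = path;
        serverSocket = new ServerSocket(0); // any free local port
    }
    // Point MediaPlayer at this URL instead of the real stream.
    public String getLocalUrl() {
        return "http://127.0.0.1:" + serverSocket.getLocalPort() + "/";
    }
    @Override
    public void run() {
        try (Socket client = serverSocket.accept();
             Socket upstream = new Socket(remoteHost, remotePort)) {
            // The client's own request is ignored for brevity; send a plain GET upstream.
            OutputStream upstreamOut = upstream.getOutputStream();
            upstreamOut.write(("GET " + remotePath + " HTTP/1.0\r\nHost: " + remoteHost + "\r\n\r\n")
                    .getBytes("ISO-8859-1"));
            upstreamOut.flush();
            InputStream upstreamIn = new BufferedInputStream(upstream.getInputStream());
            OutputStream clientOut = client.getOutputStream();
            // Read the status line and rewrite "ICY" to "HTTP/1.1".
            StringBuilder statusLine = new StringBuilder();
            int b;
            while ((b = upstreamIn.read()) != -1 && b != '\n') {
                statusLine.append((char) b);
            }
            clientOut.write((statusLine.toString().replaceFirst("^ICY", "HTTP/1.1") + "\n")
                    .getBytes("ISO-8859-1"));
            // Pipe the remaining headers and audio data through unchanged.
            byte[] buffer = new byte[8192];
            int read;
            while ((read = upstreamIn.read(buffer)) != -1) {
                clientOut.write(buffer, 0, read);
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}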
I am using this code, and it runs on 2.2 and higher versions, streaming while downloading.
import java.io.BufferedInputStream;
import java.io.BufferedOutputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.net.URL;
import java.net.URLConnection;
import android.content.Context;
import android.media.MediaPlayer;
import android.os.Environment;
import android.os.Handler;
import android.util.Log;
import android.view.View;
import android.widget.Button;
import android.widget.ImageButton;
import android.widget.ProgressBar;
import android.widget.TextView;
public class StreamingMediaPlayer {
private static final int INTIAL_KB_BUFFER = 96*10;//assume 96kbps*10secs/8bits per byte
private TextView textStreamed;
private ImageButton playButton;
private ProgressBar progressBar;
ProgressBar pb;
int audiofiletime=0;
private long mediaLengthInSeconds;
private int totalKbRead = 0;
int totalsize=0;
int numread;
int totalBytesRead = 0;
private final Handler handler = new Handler();
private MediaPlayer mediaPlayer;
private File downloadingMediaFile;
private boolean isInterrupted;
private Context context;
private int counter = 0;
public StreamingMediaPlayer(Context context,TextView textStreamed, ImageButton playButton, Button streamButton,ProgressBar progressBar,ProgressBar pb)
{
this.context = context;
this.textStreamed = textStreamed;
this.playButton = playButton;
this.progressBar = progressBar;
this.pb=pb;
}
/**
* Progressively download the media to a temporary location and update the MediaPlayer as new content becomes available.
*/
public void startStreaming(final String mediaUrl) throws IOException {
//this.mediaLengthInSeconds = 100;
Runnable r = new Runnable() {
public void run() {
try {
downloadAudioIncrement(mediaUrl);
} catch (IOException e) {
Log.e(getClass().getName(), "Unable to initialize the MediaPlayer for fileUrl=" + mediaUrl, e);
return;
}
}
};
new Thread(r).start();
}
/**
* Download the url stream to a temporary location and then call the setDataSource
* for that local file
*/
@SuppressWarnings({ "resource", "unused" })
public void downloadAudioIncrement(String mediaUrl) throws IOException {
URLConnection cn = new URL(mediaUrl).openConnection();
cn.connect();
InputStream stream = cn.getInputStream();
if (stream == null) {
Log.e(getClass().getName(), "Unable to create InputStream for mediaUrl:" + mediaUrl);
}
///////////////////save sdcard///////////////
File direct = new File(Environment.getExternalStorageDirectory()+"/punya");
if(!direct.exists()) {
if(direct.mkdir()); //directory is created;
}
String[] files=mediaUrl.split("/");
String fileName=files[files.length-1];
fileName = fileName.replace(".m4a", ".rdo");
//create a new file, to save the downloaded file
File file = new File(direct,fileName);
@SuppressWarnings("resource")
FileOutputStream fileOutput = new FileOutputStream(file);
///////////////////end/////////////////
totalsize=cn.getContentLength();
//mediaLengthInKb = 10000;
downloadingMediaFile = new File(context.getCacheDir(),fileName);
if (downloadingMediaFile.exists()) {
downloadingMediaFile.delete();
}
FileOutputStream out = new FileOutputStream(downloadingMediaFile);
byte buf[] = new byte[16384];
int incrementalBytesRead = 0;
do {
numread = stream.read(buf);
if (numread <= 0)
break;
out.write(buf, 0, numread);
fileOutput.write(buf, 0, numread);
totalBytesRead += numread;
incrementalBytesRead += numread;
totalKbRead = totalBytesRead/1000;
// pb.setMax(100);
// pb.setProgress(totalKbRead);
testMediaBuffer();
fireDataLoadUpdate();
} while (validateNotInterrupted());
stream.close();
if (validateNotInterrupted()) {
fireDataFullyLoaded();
}
}
private boolean validateNotInterrupted() {
if (isInterrupted) {
if (mediaPlayer != null) {
mediaPlayer.pause();
//mediaPlayer.release();
}
return false;
} else {
return true;
}
}
/**
* Test whether we need to transfer buffered data to the MediaPlayer.
* Interacting with MediaPlayer on non-main UI thread can causes crashes to so perform this using a Handler.
*/
private void testMediaBuffer() {
Runnable updater = new Runnable() {
public void run() {
if (mediaPlayer == null) {
// Only create the MediaPlayer once we have the minimum buffered data
if ( totalKbRead >= INTIAL_KB_BUFFER) {
try {
startMediaPlayer();
} catch (Exception e) {
Log.e(getClass().getName(), "Error copying buffered conent.", e);
}
}
} else if ( mediaPlayer.getDuration() - mediaPlayer.getCurrentPosition() <= 1000 ){
// NOTE: The media player has stopped at the end so transfer any existing buffered data
// We test for < 1second of data because the media player can stop when there is still
// a few milliseconds of data left to play
transferBufferToMediaPlayer();
}
}
};
handler.post(updater);
}
private void startMediaPlayer() {
try {
//File bufferedFile = new File(context.getCacheDir(),"playingMedia" + (counter++) + ".m4a");
//moveFile(downloadingMediaFile,bufferedFile);
// Log.e(getClass().getName(),"Buffered File path: " + bufferedFile.getAbsolutePath());
// Log.e(getClass().getName(),"Buffered File length: " + bufferedFile.length()+"");
mediaPlayer = createMediaPlayer(downloadingMediaFile);
//mediaPlayer.start();
startPlayProgressUpdater();
//playButton.setEnabled(true);
playButton.setVisibility(View.VISIBLE);
} catch (IOException e) {
Log.e(getClass().getName(), "Error initializing the MediaPlayer.", e);
return;
}
}
private MediaPlayer createMediaPlayer(File mediaFile)
throws IOException {
MediaPlayer mPlayer = new MediaPlayer();
mPlayer.setOnErrorListener(
new MediaPlayer.OnErrorListener() {
public boolean onError(MediaPlayer mp, int what, int extra) {
Log.e(getClass().getName(), "Error in MediaPlayer: (" + what +") with extra (" +extra +")" );
return false;
}
});
FileInputStream fis = new FileInputStream(mediaFile);
mPlayer.setDataSource(fis.getFD());
mPlayer.prepare();
return mPlayer;
}
/**
* Transfer buffered data to the MediaPlayer.
* NOTE: Interacting with a MediaPlayer on a non-main UI thread can cause thread-lock and crashes so
* this method should always be called using a Handler.
*/
private void transferBufferToMediaPlayer() {
try {
boolean wasPlaying = mediaPlayer.isPlaying();
int curPosition = mediaPlayer.getCurrentPosition();
File oldBufferedFile = new File(context.getCacheDir(),"playingMedia" + counter + ".m4a");
File bufferedFile = new File(context.getCacheDir(),"playingMedia" + (counter++) + ".m4a");
bufferedFile.deleteOnExit();
moveFile(downloadingMediaFile,bufferedFile);
//mediaPlayer.pause();
mediaPlayer.release();
mediaPlayer = createMediaPlayer(bufferedFile);
mediaPlayer.seekTo(curPosition);
boolean atEndOfFile = mediaPlayer.getDuration() - mediaPlayer.getCurrentPosition() <= 1000;
if (wasPlaying || atEndOfFile){
mediaPlayer.start();
}
oldBufferedFile.delete();
}catch (Exception e) {
Log.e(getClass().getName(), "Error updating to newly loaded content.", e);
}
}
private void fireDataLoadUpdate() {
Runnable updater = new Runnable() {
public void run() {
//float loadProgress = ((float)totalBytesRead/(float)mediaLengthInKb);
//float per = ((float)numread/mediaLengthInKb) * 100;
float per = ((float)totalBytesRead/totalsize) * 100;
textStreamed.setText((totalKbRead + " Kb (" + (int)per + "%)"));
progressBar.setSecondaryProgress((int)(per));
pb.setSecondaryProgress((int)(per));
}
};
handler.post(updater);
}
private void fireDataFullyLoaded() {
Runnable updater = new Runnable() {
public void run() {
transferBufferToMediaPlayer();
downloadingMediaFile.delete();
textStreamed.setText(("Download completed" ));
}
};
handler.post(updater);
}
public MediaPlayer getMediaPlayer() {
return mediaPlayer;
}
public void startPlayProgressUpdater() {
audiofiletime =mediaPlayer.getDuration();
float progress = (((float)mediaPlayer.getCurrentPosition()/ audiofiletime) * 100);
progressBar.setProgress((int)(progress));
//pb.setProgress((int)(progress*100));
if (mediaPlayer.isPlaying()) {
Runnable notification = new Runnable() {
public void run() {
startPlayProgressUpdater();
}
};
handler.postDelayed(notification,1000);
}
}
public void interrupt() {
playButton.setEnabled(false);
isInterrupted = true;
validateNotInterrupted();
}
/**
* Move the file in oldLocation to newLocation.
*/
public void moveFile(File oldLocation, File newLocation)
throws IOException {
if ( oldLocation.exists( )) {
BufferedInputStream reader = new BufferedInputStream( new FileInputStream(oldLocation) );
BufferedOutputStream writer = new BufferedOutputStream( new FileOutputStream(newLocation, false));
try {
byte[] buff = new byte[5461];
int numChars;
while ( (numChars = reader.read( buff, 0, buff.length ) ) != -1) {
writer.write( buff, 0, numChars );
}
} catch( IOException ex ) {
throw new IOException("IOException when transferring " + oldLocation.getPath() + " to " + newLocation.getPath());
} finally {
try {
if ( reader != null ){
writer.close();
reader.close();
}
} catch( IOException ex ){
Log.e(getClass().getName(),"Error closing files when transferring " + oldLocation.getPath() + " to " + newLocation.getPath() );
}
}
} else {
throw new IOException("Old location does not exist when transferring " + oldLocation.getPath() + " to " + newLocation.getPath() );
}
}
}
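A minimal usage sketch from an Activity (view IDs and the stream URL are placeholders; the Button parameter is not used by the class, so null is passed here):
TextView streamedText = (TextView) findViewById(R.id.text_streamed);     // assumed IDs
ImageButton playButton = (ImageButton) findViewById(R.id.btn_play);
ProgressBar progressBar = (ProgressBar) findViewById(R.id.progress_main);
ProgressBar pb = (ProgressBar) findViewById(R.id.progress_secondary);
StreamingMediaPlayer streamingPlayer =
        new StreamingMediaPlayer(this, streamedText, playButton, null, progressBar, pb);
try {
    streamingPlayer.startStreaming("http://example.com/audio.m4a"); // placeholder URL
} catch (IOException e) {
    e.printStackTrace();
}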