I realized, after going through a post in the IBM developer forums, that the Android SDK reads bytes from the mic recording and writes them to the WebSocket. I am now trying to read bytes from an audio file in storage and write them to the WebSocket. How should I do this? So far I have:
public class AudioCaptureThread extends Thread{
private static final String TAG = "AudioCaptureThread";
private boolean mStop = false;
private boolean mStopped = false;
private int mSamplingRate = -1;
private IAudioConsumer mIAudioConsumer = null;
// the thread receives high priority because it needs to do real time audio capture
// THREAD_PRIORITY_URGENT_AUDIO = "Standard priority of the most important audio threads"
public AudioCaptureThread(int iSamplingRate, IAudioConsumer IAudioConsumer) {
android.os.Process.setThreadPriority(android.os.Process.THREAD_PRIORITY_URGENT_AUDIO);
mSamplingRate = iSamplingRate;
mIAudioConsumer = IAudioConsumer;
}
// once the thread is started it runs nonstop until it is stopped from the outside
@Override
public void run() {
File path = Activity.getContext.getExternalFilesDir(null);
File file = new File (path, "whatstheweatherlike.wav");
int length = (int) file.length();
ByteArrayOutputStream bos = new ByteArrayOutputStream();
byte[] b = new byte[length];
FileInputStream in = null;
try {
in = new FileInputStream(file);
} catch (FileNotFoundException e) {
e.printStackTrace();
}
try {
for (int readNum; (readNum = in.read(b)) != -1;) {
bos.write(b, 0, readNum);
}
} catch (IOException e) {
e.printStackTrace();
}
byte[] bytes = bos.toByteArray();
mIAudioConsumer.consume(bytes);
}
However, Activity.getContext is not recognized. I can convert the file to bytes in MainActivity, but how do I then write it to the WebSocket? Am I on the right track, or is this not the right way? If it is, how do I solve this problem?
Any help is appreciated!
Activity.getContext is not recognized because there's no reference to Activity, since it's just a Thread. You would have to pass in the Activity, although it would likely make more sense just to pass in the Context if you need it.
You've got the right idea that you can create a FileInputStream and use that. You might like to use our MicrophoneCaptureThread as a reference. It'd be a very similar situation, except you'd be using your FileInputStream instead of reading from the microphone. You can check it out (and an example project that uses it) here: https://github.com/watson-developer-cloud/android-sdk/blob/master/library/src/main/java/com/ibm/watson/developer_cloud/android/library/audio/MicrophoneCaptureThread.java
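As a sketch of that refactor (plain Java, with Android classes omitted so it stands alone; the class and interface names below are placeholders, not SDK names): rather than reaching for an Activity inside the thread, pass in everything the thread needs through its constructor, for example the already-resolved File:

```java
import java.io.ByteArrayOutputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;

// Stand-in for the SDK's IAudioConsumer
interface AudioConsumer {
    void consume(byte[] data);
}

class FileCaptureThread extends Thread {
    private final File mFile;
    private final AudioConsumer mConsumer;

    // The caller resolves the file location (e.g. from a Context via
    // context.getExternalFilesDir(null)) and hands the File in directly.
    FileCaptureThread(File file, AudioConsumer consumer) {
        mFile = file;
        mConsumer = consumer;
    }

    @Override
    public void run() {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        byte[] buf = new byte[4096];
        try (FileInputStream in = new FileInputStream(mFile)) {
            int n;
            while ((n = in.read(buf)) != -1) {
                bos.write(buf, 0, n);
            }
        } catch (IOException e) {
            e.printStackTrace();
            return;
        }
        // Hand the whole file to the consumer, which can forward it to the websocket
        mConsumer.consume(bos.toByteArray());
    }
}
```

The same shape works if you would rather pass a Context and resolve the file inside the thread; the point is that the dependency arrives via the constructor instead of a static lookup.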
First off, thanks in advance for any help you can provide.
My issue is, hopefully, one that can be resolved. I have an app that, basically, allows a user to input data and then sends that data via email as an attachment. What I would like to do is: if the user is connected to their network via WiFi, instead of sending the file via email, transfer the file to a network share. I have been searching for an answer for quite a while but unfortunately have found no way of doing this.
So I guess my real question is if this is even possible, and if so, how to go about doing this.
You would need to copy the file accordingly, as noted below, and in your instance I would presume the dest File would be set up as such...
new File("\\\\server\\path\\to\\file.txt")
class FileUtils {
    public static boolean copyFile(File source, File dest) {
        BufferedInputStream bis = null;
        BufferedOutputStream bos = null;
        try {
            bis = new BufferedInputStream(new FileInputStream(source));
            bos = new BufferedOutputStream(new FileOutputStream(dest, false));
            byte[] buf = new byte[1024];
            int len;
            // write only the bytes actually read in this pass
            while ((len = bis.read(buf)) != -1) {
                bos.write(buf, 0, len);
            }
        } catch (IOException e) {
            return false;
        } finally {
            try {
                if (bis != null) bis.close();
                if (bos != null) bos.close();
            } catch (IOException e) {
                return false;
            }
        }
        return true;
    }

    // WARNING! Inefficient if source and dest are on the same filesystem!
    public static boolean moveFile(File source, File dest) {
        return copyFile(source, dest) && source.delete();
    }

    // Returns true if the sdcard is mounted rw
    public static boolean isSDMounted() {
        return Environment.getExternalStorageState().equals(Environment.MEDIA_MOUNTED);
    }
}
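A compact, testable restatement of the copy loop in try-with-resources form (local paths only; note that plain java.io.File generally cannot open a Windows/SMB share from Android, for which a library such as jCIFS is typically used):

```java
import java.io.BufferedInputStream;
import java.io.BufferedOutputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;

class FileCopy {
    // Same idea as FileUtils.copyFile above, but streams are closed automatically.
    static boolean copyFile(File source, File dest) {
        try (BufferedInputStream bis = new BufferedInputStream(new FileInputStream(source));
             BufferedOutputStream bos = new BufferedOutputStream(new FileOutputStream(dest, false))) {
            byte[] buf = new byte[1024];
            int len;
            while ((len = bis.read(buf)) != -1) {
                bos.write(buf, 0, len);  // write only the bytes actually read
            }
            return true;
        } catch (IOException e) {
            return false;
        }
    }
}
```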
I am 100% sure the problem here is with the actual image. However, I hope that the solution is some attribute of the image that will help others in the future.
The image:
The photo in question: http://soundwave.robotsidekick.com/mlsphotos.jpg
I have tried loading this image in several ways. I have downloaded it and tried loading it in an ImageView:
final ImageView v = (ImageView) findViewById(R.id.image);
v.setImageResource(R.drawable.photo);
v.invalidate();
I have tried loading it from a url:
final String[] params = new String[] {
"",
};
(new AsyncTask<String, Bitmap, Bitmap>()
{
@Override
protected Bitmap doInBackground(final String... params)
{
Bitmap ret = null;
for (final String url : params)
{
try
{
Log.e(TAG, url);
ret = BitmapFactory.decodeStream((new URL(url)).openStream());
publishProgress(ret);
}
catch (final MalformedURLException e)
{
Log.e(TAG, "Malformed URL", e);
}
catch (final IOException e)
{
Log.e(TAG, "IO Exception", e);
}
}
return ret;
}
@Override
protected void onProgressUpdate(final Bitmap... values)
{
super.onProgressUpdate(values);
for (final Bitmap result : values)
{
if (result != null)
{
final ImageView v = (ImageView) MainActivity.this.findViewById(R.id.image);
v.setImageBitmap(result);
v.invalidate();
}
}
}
}).execute(params);
I have also tried loading the image in a WebView like this:
final WebView webview = (WebView) findViewById(R.id.webview);
webview.loadData("<html><body><img src=\"" + url + "\"></body></html>", "text/html", "utf-8");
webview.invalidate();
I have also tried loading the image in Browser (the app) and that does not work.
None of those work. HOWEVER, if I load the URL into Chrome on Android it works great (not in Browser), and if I load the image in my desktop browser (Chrome, Firefox, etc.) it loads fine. I have checked that the MIME type matches the extension, and I am just at a loss.
EDIT
There is a workaround for images coming from an InputStream where the bitmap processing runs out of data on the stream before the stream completes. The workaround is documented in this question and this bug.
However, this is a corrupt image whose data ends prematurely. I know that means I am already down a broken track, but I am hoping for better error handling than Android passing me back null and me losing. iOS, Chrome (on device and desktop), and most other places seem to have much better error handling. Is there something I can do on Android to handle corrupt JPGs?
EDIT 2
There has to be a solution here, because Chrome on my device handles this situation elegantly. However, the closest I have come to fixing this is the following code:
final InputStream is = (new URL(url)).openStream();
final ByteArrayOutputStream bos = new ByteArrayOutputStream();
final int size = 1024;
int len = -1;
byte[] buf = new byte[size];
while ((len = is.read(buf, 0, size)) != -1)
{
bos.write(buf, 0, len);
}
buf = bos.toByteArray();
// buf is now filled with the corrupted bytes of the image
ret = BitmapFactory.decodeByteArray(buf, 0, buf.length);
// ret is null because it was a corrupt jpg
With that code I can check whether there are bytes but the image wasn't decoded. Then at least I can tell I have a corrupt image (or not an image at all) and can report something slightly more useful to the user (like: hey, I have an image here with 16K, but I sure don't know what to do with it).
Anyone know how Chrome manages to decode as much of the image as they can before they hit the corruption?
I opened the file in Photoshop CS6 and it said that the file may be damaged, possibly truncated or incomplete, but that it could still be opened. If I save it in Photoshop without making any changes, it then works in Android. I'm afraid I don't know exactly what's wrong with the image, though.
Here is the important bit about JPGs from Wikipedia, and here's a question that ultimately led me to the solution.
I just appended the two closing JPEG end-of-image bytes to the stream, in order to convince the decoder that the stream is done with image data. This method is flawed because JPGs can have JPGs inside them, meaning that appending one set of end-of-image bytes doesn't guarantee that we closed all the images.
In both solutions below I assume is is an input stream for a JPG image. Also these two constants are defined:
private static final int JPEG_EOI_1 = 0xFF;
private static final int JPEG_EOI_2 = 0xD9;
In this method we read all the bytes into memory and then try to decode them:
final ByteArrayOutputStream bos = new ByteArrayOutputStream();
final int size = 1024;
int len = -1;
final byte[] buf = new byte[size];
try
{
while ((len = is.read(buf, 0, size)) != -1)
{
bos.write(buf, 0, len);
}
bos.write(JPEG_EOI_1);
bos.write(JPEG_EOI_2);
final byte[] bytes = bos.toByteArray();
return BitmapFactory.decodeByteArray(bytes, 0, bytes.length);
}
catch (final IOException ex)
{
return null;
}
catch (final Exception ex)
{
return null;
}
This method creates a stream wrapper that makes sure the last two bytes are JPG end of image bytes:
return BitmapFactory.decodeStream(new JpegClosedInputStream(is));
// And here's the stream wrapper
class JpegClosedInputStream extends InputStream
{
private final InputStream inputStream;
private int bytesPastEnd;
private JpegClosedInputStream(final InputStream iInputStream)
{
inputStream = iInputStream;
bytesPastEnd = 0;
}
@Override
public int read() throws IOException
{
    int buffer = inputStream.read();
    if (buffer == -1)
    {
        if (bytesPastEnd == 0)
        {
            ++bytesPastEnd;
            buffer = JPEG_EOI_1;
        }
        else if (bytesPastEnd == 1)
        {
            ++bytesPastEnd;
            buffer = JPEG_EOI_2;
        }
        // after both EOI bytes have been supplied, keep returning -1
    }
    return buffer;
}
}
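A quick sanity check of the wrapper idea, restated compactly here so the snippet is self-contained (the class name is illustrative): feed it a truncated stream and confirm the missing FF D9 marker appears at the end:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

class JpegTailStream extends InputStream {
    private static final int JPEG_EOI_1 = 0xFF;
    private static final int JPEG_EOI_2 = 0xD9;
    private final InputStream in;
    private int bytesPastEnd = 0;  // how many EOI bytes we have emitted so far

    JpegTailStream(InputStream in) { this.in = in; }

    @Override
    public int read() throws IOException {
        int b = in.read();
        if (b != -1) return b;
        if (bytesPastEnd == 0) { bytesPastEnd++; return JPEG_EOI_1; }
        if (bytesPastEnd == 1) { bytesPastEnd++; return JPEG_EOI_2; }
        return -1;  // both EOI bytes delivered: now report a real end of stream
    }
}
```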
Call setContentView() and show it from the XML, like this:
//photo.xml
<ImageView
    android:id="@+id/picture"
    android:layout_width="250dp"
    android:layout_height="250dp"
    android:layout_gravity="center"
    android:src="@drawable/photo" />
//Photo.java
setContentView(R.layout.photo);
I started developing an Android app that has to interact with MMS attachments; in particular, to get attachments such as text, bitmaps, audio, video, etc. and store them on the phone in a specific folder.
So I started reading some books and some posts on the web, but it isn't a very common topic, and I didn't find an official way to do what I want to do.
I found a fairly good article here on Stack Overflow: How to Read MMS Data in Android?... It works very well for me, but there are 2 problems:
The article shows how to get MMS data by querying the "hidden" SMS-MMS content provider, and as far as I know, Google doesn't guarantee that the current structure will be kept in every future Android release.
The article only explains how to get text data and bitmap data from an MMS... what about video/audio? I tried to get a video/audio stream from an InputStream as the example did with bitmaps, unfortunately with no luck...
I'm very disappointed about the absence of an official tutorial or how-to on this topic, because SMS and MMS management is a very common need in mobile development.
I hope someone can help me....
Thanks in advance!!
I found a fairly simple way to read video/audio data from an MMS, so I decided to publish this part of my class that provides MMS attachments, for all users who need it.
private static final int RAW_DATA_BLOCK_SIZE = 16384; //Set the block size used to write a ByteArrayOutputStream to byte[]
public static final int ERROR_IO_EXCEPTION = 1;
public static final int ERROR_FILE_NOT_FOUND = 2;
public static byte[] LoadRaw(Context context, Uri uri, int[] error) {
    InputStream inputStream = null;
    byte[] ret = new byte[0];
    // Open an InputStream from the specified URI
    try {
        inputStream = context.getContentResolver().openInputStream(uri);
        // Try to read from the InputStream
        if (inputStream != null)
            ret = InputStreamToByteArray(inputStream);
    }
    catch (FileNotFoundException e1) {
        error[0] = ERROR_FILE_NOT_FOUND;
    }
    catch (IOException e) {
        error[0] = ERROR_IO_EXCEPTION;
    }
    finally {
        if (inputStream != null) {
            try {
                inputStream.close();
            }
            catch (IOException e) {
                // Problem closing the stream.
                // The returned data does not change.
                error[0] = ERROR_IO_EXCEPTION;
            }
        }
    }
    return ret;
}

// Create a byte array from an open InputStream, reading blocks of RAW_DATA_BLOCK_SIZE bytes
private static byte[] InputStreamToByteArray(InputStream inputStream) throws IOException {
    ByteArrayOutputStream buffer = new ByteArrayOutputStream();
    int nRead;
    byte[] data = new byte[RAW_DATA_BLOCK_SIZE];
    while ((nRead = inputStream.read(data, 0, data.length)) != -1) {
        buffer.write(data, 0, nRead);
    }
    buffer.flush();
    return buffer.toByteArray();
}
In this way you can extract raw data such as audio/video/images from an MMS by passing:
the context where you need to use this function
the URI of the MMS part that contains the data you want to extract (for example "content://mms/part/2")
the error holder that receives an eventual error code from the procedure (note that Java passes primitives by value, so a plain int parameter cannot return a value to the caller; a one-element int[] or a small wrapper class is needed).
Once you have your byte[], you can create an empty File and then use a FileOutputStream to write the byte[] into it. If the file path\extension is correct and your app has the right permissions, you'll be able to store your data.
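That last step, writing the byte[] out, can be sketched like this (plain Java; the destination path here is a temp file for illustration, where on Android you would build it from a Context):

```java
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;

class AttachmentSaver {
    // Write the raw attachment bytes to dest, overwriting any existing content.
    static void save(byte[] data, File dest) throws IOException {
        try (FileOutputStream fos = new FileOutputStream(dest)) {
            fos.write(data);
        }
    }
}
```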
PS: This procedure has been tested a few times and it worked, but I don't rule out that there may be some unhandled exception cases that produce error states. IMHO it can be improved too...
I am currently working on a live-streaming project, and I have succeeded in playing live video. Now my next task is to record the video that is playing in the VideoView.
I have searched and found ways of capturing video with a surface (camera), but here with VideoView I do not have any surface.
Any help is appreciated.
You can see this link. In short your server has to support downloading. If it does, you can try the following code:
private final int TIMEOUT_CONNECTION = 5000; // 5 sec
private final int TIMEOUT_SOCKET = 30000; // 30 sec
private final int BUFFER_SIZE = 1024 * 5; // 5 KB
BufferedInputStream in = null;
FileOutputStream out = null;
try {
    URL url = new URL("http://....");
    // Open a connection to that URL.
    URLConnection ucon = url.openConnection();
    ucon.setConnectTimeout(TIMEOUT_CONNECTION);
    ucon.setReadTimeout(TIMEOUT_SOCKET);
    // Define InputStreams to read from the URLConnection,
    // using a 5 KB download buffer.
    InputStream is = ucon.getInputStream();
    in = new BufferedInputStream(is, BUFFER_SIZE);
    out = new FileOutputStream(file);
    byte[] buff = new byte[BUFFER_SIZE];
    int len;
    while ((len = in.read(buff)) != -1)
    {
        out.write(buff, 0, len);
    }
} catch (IOException ioe) {
    // Handle the error
} finally {
    if (in != null) {
        try {
            in.close();
        } catch (Exception e) {
            // Nothing you can do
        }
    }
    if (out != null) {
        try {
            out.flush();
            out.close();
        } catch (Exception e) {
            // Nothing you can do
        }
    }
}
If the server doesn't support downloading, there is nothing you can do.
You can use platform-tools and record video using:
adb shell screenrecord --verbose /sdcard/demo.mp4
Replace demo.mp4 with whatever file name you want.
The recording is saved on your phone, and it is limited to 3 minutes by default.
Check out the options of screenrecord.
To pull the file to your computer, use the following command (or use the Android Device Monitor):
adb pull /sdcard/demo.mp4
I have used this to record demos of apps, and I have even played YouTube and had it record that.
It does not capture audio, so that may be a major problem.
But it is included in the SDK, and it records whatever the screen shows while it is recording.
I need some input on my code.
Basically, I have a method to load music in class A:
public void onListItemClick(ListView parent, View v, int position, long id){
musicIndex = cursor.getColumnIndexOrThrow(MediaStore.Audio.Media.DATA);
cursor.moveToPosition(position);
filePath = cursor.getString(musicIndex);
fileName = new File(filePath).getName();
playMusic();//Play the selected music
}
public void playMusic(){
if(mPlayer.isPlaying()){
mPlayer.reset();
}
try{
mPlayer.setDataSource(filePath);
mPlayer.prepare();
mPlayer.start();
BeatDetection beatDetect = new BeatDetection();
beatDetect.init();
}catch (Exception e){
}
}
That method calls the init() method in class B:
public void init() throws Exception{
energy = 0;
variance = 0;
constant = 0;
isBeat = false;
sensitivity = 0;
dBuffer = new float[sampleRate / bufferSize];
eBuffer = new float[sampleRate / bufferSize];
timer = System.currentTimeMillis();
MusicLoad msc = new MusicLoad();
totalMs = 0;
seeking = true;
//msc.printText();
decode(msc.fileName, 25, 40);
}
In that method, it initializes everything and calls the decode() method:
public void decode(String path, int startMs, int maxMs)
throws IOException, javazoom.jl.decoder.DecoderException {
debug();
File in = new File(path);
InputStream inStream = new BufferedInputStream(new FileInputStream(in), 8 * 1024);
ByteArrayOutputStream outStream = new ByteArrayOutputStream(1024);
try {
Bitstream bitstream = new Bitstream(inStream);
Decoder decoder = new Decoder();
boolean done = false;
while (! done) {
Header frameHeader = bitstream.readFrame();
if (frameHeader == null) {
done = true;
} else {
totalMs += frameHeader.ms_per_frame();
if (totalMs >= startMs) {
seeking = false;
}
if (! seeking) {
SampleBuffer output = (SampleBuffer) decoder.decodeFrame(frameHeader, bitstream);
if (output.getSampleFrequency() != 44100 || output.getChannelCount() != 2) {
throw new javazoom.jl.decoder.DecoderException("mono or non-44100 MP3 not supported", null);
}
short[] pcm = output.getBuffer();
for (short s : pcm) {
outStream.write(s & 0xff);
outStream.write((s >> 8 ) & 0xff);
}
}
if (totalMs >= (startMs + maxMs)) {
done = true;
}
}
bitstream.closeFrame();
}
byte[] abAudioData = outStream.toByteArray();
calculation(abAudioData);
} catch (BitstreamException e) {
throw new IOException("Bitstream error: " + e);
} catch (DecoderException e) {
Log.w("Decoder error", e);
throw new javazoom.jl.decoder.DecoderException("Error",e);
} finally {
inStream.close();
}
}
Don't mind reading all the code. As you can see, I put debug() at the beginning to check whether the method is called. At this point, debug() is properly called. However, if I put debug() after the line File in = new File(path);, it is not called anymore. It seems like the code stops running at that point.
The end result is that I can load and play the song without any problem. However, decode() is not called and there is no error whatsoever. I'm stuck at pinpointing the problem, so if there's any input, please help me.
EDIT: After tracing the path variable, it turns out to be null, so the error is a NullPointerException. It seems the fileName variable from class A is not passed to class B. Any suggestions?
If you are using Eclipse with ADT, it's very easy to debug your Android apps: just add a breakpoint (probably on the new File(...) line) and see what happens.
My guess is that File in = new File(path); is probably throwing an exception in your decode method; that exception bubbles up first to init() and then to playMusic(), where it is caught by the try/catch block. Your catch block is empty, so you are not seeing anything. Try debugging as I said, or add some logging in the catch block.
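The effect of the empty catch can be shown with a tiny plain-Java sketch (the names here are illustrative, not the poster's actual classes): the exception thrown inside decode never surfaces, so the caller sees nothing at all:

```java
class SilentCatchDemo {
    static String result = "decode never ran";

    // Mirrors the structure of playMusic(): everything after the failing call
    // is skipped, and the empty catch swallows the exception silently.
    static void playMusic(String filePath) {
        try {
            decode(filePath);
            result = "decoded";
        } catch (Exception e) {
            // empty catch: no log, no rethrow -- the failure is invisible
        }
    }

    static void decode(String path) {
        if (path == null) throw new NullPointerException("path is null");
        result = "decoded " + path;
    }
}
```

Logging in the catch block (for example Log.e(TAG, "playMusic failed", e) on Android) would make the NullPointerException visible immediately.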
This is just something to look at, but from the doc page
http://developer.android.com/reference/java/io/File.html#File%28java.lang.String%29
"The actual file referenced by a File may or may not exist. It may also, despite the name File, be a directory or other non-regular file."
If you had the path wrong, it may be trying to create the file and you may not have the correct permission to do so. Perhaps: WRITE_EXTERNAL_STORAGE.
I know this post is old, but I just wanted to show how to get the file path to read/write files for others that come across this post as I have:
String filePath = myContext.getFilesDir().getPath().toString() + "/sysout.log";
File file = new File(filePath);
These two lines build the path to a file named "sysout.log" in the folder /data/data/com.app.name/files/ (the file itself is created, or overwritten if it exists, when you open an output stream to it); myContext is just the current Context. Using this technique avoids problems with defining your own path name. Hope this helps someone.
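One detail worth checking: constructing the File does no disk I/O; it is opening the FileOutputStream that actually creates (or truncates) the file. A quick plain-Java check of that behavior, using a temp directory in place of getFilesDir():

```java
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.nio.file.Files;

class FileCreationDemo {
    // Returns true if the file only came into existence when the stream was opened.
    static boolean createdByStream(File dir) throws IOException {
        File f = new File(dir, "sysout.log");   // no disk I/O happens here
        boolean existedBefore = f.exists();
        try (FileOutputStream fos = new FileOutputStream(f)) {
            fos.write("hello".getBytes());      // opening the stream creates/overwrites the file
        }
        return !existedBefore && f.exists();
    }
}
```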