How to decode MIDI data on Android - android

I'm writing an Android application. A MIDI piano keyboard is connected physically by a cable to an Android device. I have been following the official Android MIDI documentation here https://developer.android.com/reference/android/media/midi/package-summary, but I am stuck on decoding the raw MIDI data which I am receiving.
@RequiresApi(api = Build.VERSION_CODES.M)
class MidiFramer extends MidiReceiver {
    public void onSend(byte[] data, int offset,
                       int count, long timestamp) throws IOException {
        // parse MIDI or whatever
        // How to convert data to something readable? Below doesn't make any sense.
        Log.v(LOG_TAG, "onSend strData:" + data + " length:" + data.length);
        StringBuffer sb = new StringBuffer();
        for (int i = 0; i < data.length; i++) {
            String hex = new String(data, StandardCharsets.UTF_8);
            sb.append(hex);
        }
        Log.v(LOG_TAG, "onSend sb:" + sb.toString());
    }
}
Essentially, from the raw MIDI data which is being received, I want to know which note is being played (e.g. D4 / C#5) on the physical piano keyboard. Any help would be appreciated.
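For reference, a MIDI Note On/Off message is three bytes: a status byte (0x90 or 0x80 plus the channel in the low nibble), the note number (0-127, middle C = 60) and the velocity. Below is a minimal, hedged sketch of how the bytes delivered to onSend could be turned into note names; it assumes a single complete three-byte message starting at offset and is only an illustration, not a full MIDI parser (onSend may deliver several messages, or partial ones, in one call).

// Sketch only: assumes one complete 3-byte channel-voice message at `offset`.
private static final String[] NOTE_NAMES =
        {"C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"};

void decodeMessage(byte[] data, int offset) {
    int status = data[offset] & 0xFF;
    int command = status & 0xF0;  // upper nibble: message type
    int channel = status & 0x0F;  // lower nibble: channel 0-15
    if (command == 0x90 || command == 0x80) {
        int note = data[offset + 1] & 0x7F;
        int velocity = data[offset + 2] & 0x7F;
        // MIDI note 60 is middle C, named C4 in the convention used here.
        String name = NOTE_NAMES[note % 12] + (note / 12 - 1);
        boolean noteOn = command == 0x90 && velocity > 0; // Note On with velocity 0 counts as Note Off
        Log.v(LOG_TAG, (noteOn ? "Note on: " : "Note off: ") + name
                + " velocity=" + velocity + " channel=" + channel);
    }
}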

Related

Android 6.0+: No Sound Using the New MIDI API

I am using the new MIDI API in order to play some MIDI notes. However, I am unable to hear any sound, nor is any exception thrown. The code is as follows:
// initialising the MidiReceiver
private MidiReceiver midiReceiver;

midiReceiver = new MidiReceiver() {
    @Override
    public void onSend(byte[] msg, int offset,
                       int count, long timestamp) throws IOException {
    }
};

/* Then in my loop containing note_on or note_off events */
byte[] buffer = new byte[32];
int numBytes = 0;
int channel = 2; // MIDI channels 1-16 are encoded as 0-15.
// NOTE_STATUS is either 0x90 or 0x80
buffer[numBytes++] = (byte) (NOTE_STATUS + (channel - 1));
buffer[numBytes++] = (byte) noteValue; // the required MIDI pitch
buffer[numBytes++] = (byte) 127; // max velocity
int offset = 0;
midiReceiver.send(buffer, offset, numBytes);
What am I doing wrong here? I think it must be because the onSend method is empty. How do I use it in order to make my app play back the note(s) within the Android device?
I couldn't find any indication in the documentation that the new MIDI API actually lets you synthesize audio, so it seems you need to generate the sound yourself.
This library might be useful.
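To illustrate the "generate the sound yourself" part, here is a rough, hedged sketch that plays a short sine tone for a received note using AudioTrack (the note-to-frequency formula is standard; the constructor and everything else here are illustrative and not the answer's own code):

// Illustrative only: synthesizes a short sine tone for a given MIDI note number.
void playNote(int midiNote, int durationMs) {
    int sampleRate = 44100;
    double freq = 440.0 * Math.pow(2.0, (midiNote - 69) / 12.0); // A4 = note 69 = 440 Hz
    int numSamples = sampleRate * durationMs / 1000;
    short[] samples = new short[numSamples];
    for (int i = 0; i < numSamples; i++) {
        samples[i] = (short) (Math.sin(2 * Math.PI * freq * i / sampleRate) * Short.MAX_VALUE * 0.5);
    }
    AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRate,
            AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT,
            numSamples * 2, AudioTrack.MODE_STATIC);
    track.write(samples, 0, numSamples);
    track.play();
}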

Android: Read audio data from file uri

I want to analyse an audio file (MP3 in particular) which the user can select, and determine what notes are played, when they're played, and at what frequency.
I already have some working code for my computer, but I want to be able to use this on my phone as well.
In order to do this however, I need access to the bytes of the audio file. On my PC I could just open a stream and use AudioFormat to decode it and then read() the bytes frame by frame.
Looking at the Android Developer Forums I can only find classes and examples for playing a file (without access to the bytes) or recording to a file (I want to read from a file).
I'm pretty confident that I can set up a file chooser, but once I have the Uri from that, I don't know how to get a stream or the bytes.
Any help would be much appreciated :)
Edit: Is a similar solution to this possible? Android - Read a File
I don't know if I could decode the audio file that way or if there would be any problems with the Android API...
So I solved it in the following way:
Get an InputStream with
final InputStream inputStream = getContentResolver().openInputStream(selectedUri);
Then pass it to this function and decode it using classes from JLayer:
private synchronized void decode(InputStream in)
        throws BitstreamException, DecoderException {
    ArrayList<Short> output = new ArrayList<>(1024);
    Bitstream bitstream = new Bitstream(in);
    Decoder decoder = new Decoder();

    float total_ms = 0f;
    float nextNotify = -1f;

    boolean done = false;
    while (!done) {
        Header frameHeader = bitstream.readFrame();
        if (total_ms > nextNotify) {
            mListener.OnDecodeUpdate((int) total_ms);
            nextNotify += 500f;
        }
        if (frameHeader == null) {
            done = true;
        } else {
            total_ms += frameHeader.ms_per_frame();
            SampleBuffer buffer = (SampleBuffer) decoder.decodeFrame(frameHeader, bitstream); // CPU intense
            if (buffer.getSampleFrequency() != 44100 || buffer.getChannelCount() != 2) {
                throw new DecoderException("mono or non-44100 MP3 not supported", null);
            }
            short[] pcm = buffer.getBuffer();
            for (int i = 0; i < pcm.length - 1; i += 2) {
                short l = pcm[i];
                short r = pcm[i + 1];
                short mono = (short) ((l + r) / 2f);
                output.add(mono); // RAM intense
            }
        }
        bitstream.closeFrame();
    }
    bitstream.close();
    mListener.OnDecodeComplete(output);
}
The full project (in case you want to look up the particulars) can be found here:
https://github.com/S7uXN37/MusicInterpreterStudio/

How to play InputStream of an audio file that's not within a url or storage?

Background
I've succeeded in uploading an audio file (3gp) to Google Drive.
Now I want to be able to play the file within the app.
The Google Drive API only allows getting an input stream of the file that's stored there.
The problem
None of MediaPlayer's data-source options are available in my case, since all I have is an InputStream:
http://developer.android.com/reference/android/media/MediaPlayer.html#setDataSource(java.io.FileDescriptor)
I know I can save the file from Google Drive to the cache and play it, but I want to avoid storage handling and play the file on the fly.
What I've tried
I searched for this issue and only found that it might be possible using AudioTrack (here). It might also be possible using new Jelly Bean features (shown here, found from here), but I'm not sure, as it's quite low level.
Sadly, using AudioTrack I got wrong sounds being played (noise).
I've also noticed that MediaPlayer has an option to set the data source to a MediaDataSource (here), but not only am I not sure how to use it, it also requires API 23 and above.
Of course, I tried using a URL given by Google Drive, but it is only meant for other purposes and doesn't point directly to the audio file, so it can't be used with MediaPlayer.
The question
Given an InputStream, is it possible to use AudioTrack or something else to play a 3gp audio file?
Is there maybe a support library solution for this?
If your minSdkVersion is 23 or higher, you can use setDataSource(MediaDataSource) and supply your own subclass of the abstract MediaDataSource class.
For older devices, you should be able to use a pipe created from ParcelFileDescriptor. You would have a thread that writes data to your end of the pipe, and pass the FileDescriptor (from getFileDescriptor()) for the player's end to setDataSource(FileDescriptor).
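A rough sketch of the pipe idea described above, under stated assumptions: the InputStream and the copy loop are placeholders, and whether MediaPlayer accepts a non-seekable pipe for a given container format varies by device, so treat this as an outline rather than a guaranteed solution.

// Hedged sketch of the ParcelFileDescriptor pipe approach; not production code.
void playFromStream(final InputStream in) throws IOException {
    ParcelFileDescriptor[] pipe = ParcelFileDescriptor.createPipe();
    ParcelFileDescriptor readSide = pipe[0];
    final ParcelFileDescriptor writeSide = pipe[1];

    // Pump the InputStream into the write end of the pipe on a background thread.
    new Thread(() -> {
        try (OutputStream out = new ParcelFileDescriptor.AutoCloseOutputStream(writeSide)) {
            byte[] buf = new byte[8192];
            int n;
            while ((n = in.read(buf)) != -1) {
                out.write(buf, 0, n);
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
    }).start();

    MediaPlayer player = new MediaPlayer();
    player.setDataSource(readSide.getFileDescriptor()); // the player reads from the other end
    player.setOnPreparedListener(MediaPlayer::start);
    player.prepareAsync();
}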
The simplest MediaDataSource implementation example:
import android.media.MediaDataSource;
import android.os.Build;
import android.support.annotation.RequiresApi;

import java.io.IOException;
import java.io.InputStream;

@RequiresApi(api = Build.VERSION_CODES.M)
public class InputStreamMediaDataSource extends MediaDataSource {

    private InputStream is;
    private long streamLength = -1, lastReadEndPosition;

    public InputStreamMediaDataSource(InputStream is, long streamLength) {
        this.is = is;
        this.streamLength = streamLength;
        if (streamLength <= 0) {
            try {
                // A correct value from InputStream#available() is not guaranteed by every InputStream implementation!
                this.streamLength = is.available();
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }

    @Override
    public synchronized void close() throws IOException {
        is.close();
    }

    @Override
    public synchronized int readAt(long position, byte[] buffer, int offset, int size) throws IOException {
        if (position >= streamLength)
            return -1;
        if (position + size > streamLength)
            size -= (position + size) - streamLength;
        if (position < lastReadEndPosition) {
            // The player seeked backwards: reopen the stream from the beginning.
            is.close();
            lastReadEndPosition = 0;
            is = getNewCopyOfInputStreamSomeHow(); // new FileInputStream(mediaFile) for example.
        }
        long skipped = is.skip(position - lastReadEndPosition);
        if (skipped == position - lastReadEndPosition) {
            int bytesRead = is.read(buffer, offset, size);
            lastReadEndPosition = position + bytesRead;
            return bytesRead;
        } else {
            return -1;
        }
    }

    @Override
    public synchronized long getSize() throws IOException {
        return streamLength;
    }
}
To use it with API >= 23 you have to provide a streamLength value, and if (when) MediaPlayer seeks backwards, i.e. position < lastReadEndPosition, you have to know how to create a new copy of the InputStream.
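As an illustration only (the original answer deliberately leaves this open), getNewCopyOfInputStreamSomeHow() could look like this for a plain local file; for a Google Drive stream you would re-issue the download request instead:

// Hypothetical helper for the class above; the file path is a placeholder.
private InputStream getNewCopyOfInputStreamSomeHow() throws IOException {
    return new FileInputStream("/path/to/media/file.3gp");
}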
Usage example:
You have to create an Activity, initialize the MediaPlayer (there are a lot of examples of file playback) and place the following code instead of the old player.setDataSource("/path/to/media/file.3gp"):
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.M) {
    // Just an example! If you have a real file in phone storage, you don't need to wrap it
    // in an InputStream to play it in MediaPlayer.
    File file = new File("/path/to/media/file.3gp");
    player.setDataSource(new InputStreamMediaDataSource(new FileInputStream(file), file.length()));
} else {
    player.setDataSource(this, mediaUri);
}
If your file is a com.google.api.services.drive.model.File object on Google Drive and you have a com.google.api.services.drive.Drive instance named drive, you can get the stream with:
InputStream is = drive.getRequestFactory().buildGetRequest(new GenericUrl(file.getDownloadUrl())).execute().getContent();
In the case of Build.VERSION.SDK_INT < Build.VERSION_CODES.M I had to set up an HTTP server on the localhost of the Android device (by means of NanoHTTPD) and transfer the byte stream through this server to MediaPlayer by URI - player.setDataSource(this, mediaUri).
For anyone interested in using a MediaDataSource implementation, I've created one that reads ahead and caches buffers of data. It works with any InputStream; I mainly created it for reading networked files via JCIFS SmbFiles.
You can find it at https://github.com/SteveGreatApe/BufferedMediaDataSource
If you are asking about a MediaPlayer that plays from an audio file path, maybe this code can help.
File directory = Environment.getExternalStorageDirectory();
File file = new File(directory + "/AudioRecorder");
String AudioSavePathInDevice = file.getAbsolutePath() + "/" + "sample.wav";

mediaPlayer = new MediaPlayer();
try {
    mediaPlayer.setDataSource(AudioSavePathInDevice);
    mediaPlayer.prepare();
} catch (IOException e) {
    e.printStackTrace();
}
mediaPlayer.start();

How to post a large video to a server in Android?

I am trying to post a large video (nearly 1 GB).
I am using FTP to send the video to a server, but the upload stops after a while and the video on the server ends up broken; I am able to upload a smaller video, though.
I've also used HTTP to send the video to the server as a Base64 encoded string, but there is an out-of-memory exception while encoding.
I've tried to upload the video as a file, but without success. What is the best way to upload a large video to a server?
Use HTTP POST and send the content as a form-based file upload (MIME type: multipart/form-data). This is the standard mechanism on the web for sending forms and/or uploading files.
Use HTTP chunked transfer mode, so that the size doesn't need to be known beforehand and you can stream any file in small parts. You still have to write some code on the server (e.g. in PHP) to accept the file and do what is needed.
Use HttpURLConnection to initiate the connection, then use my attached class to send the data. It creates the proper headers, etc.; you use it as an OutputStream to write your raw data, then call close, and you're done. You can override its onHandleResult to handle the resulting error code.
public class FormDataWriter extends FilterOutputStream {

    private final HttpURLConnection con;
    private byte[] headerBytes, footerBytes;

    /**
     * @param formName name of form in which data are sent
     * @param fileName
     * @param fileSize size of file, or -1 to use chunked encoding
     */
    FormDataWriter(HttpURLConnection con, String formName, String fileName, long fileSize) throws IOException {
        super(null);
        this.con = con;
        con.setDoOutput(true);
        String boundary = generateBoundary();
        con.setRequestProperty(HTTP.CONTENT_TYPE, "multipart/form-data; charset=UTF-8; boundary=" + boundary);
        {
            StringBuilder sb = new StringBuilder();
            writePartHeader(boundary, formName, fileName == null ? null : "filename=\"" + fileName + "\"",
                    "application/octet-stream", sb);
            headerBytes = sb.toString().getBytes("UTF-8");
            sb = new StringBuilder();
            sb.append("\r\n");
            sb.append("--" + boundary + "--\r\n");
            footerBytes = sb.toString().getBytes();
        }
        if (fileSize != -1) {
            fileSize += headerBytes.length + footerBytes.length;
            con.setFixedLengthStreamingMode((int) fileSize);
        } else {
            con.setChunkedStreamingMode(0x4000);
        }
        out = con.getOutputStream();
    }

    private String generateBoundary() {
        StringBuilder sb = new StringBuilder();
        Random rand = new Random();
        int count = rand.nextInt(11) + 30;
        int N = 10 + 26 + 26;
        for (int i = 0; i < count; i++) {
            int r = rand.nextInt(N);
            sb.append((char) (r < 10 ? '0' + r : r < 36 ? 'a' + r - 10 : 'A' + r - 36));
        }
        return sb.toString();
    }

    private void writePartHeader(String boundary, String name, String extraContentDispositions, String contentType, StringBuilder sb) {
        sb.append("--" + boundary + "\r\n");
        sb.append("Content-Disposition: form-data; charset=UTF-8; name=\"" + name + "\"");
        if (extraContentDispositions != null)
            sb.append("; ").append(extraContentDispositions);
        sb.append("\r\n");
        if (contentType != null)
            sb.append("Content-Type: " + contentType + "\r\n");
        sb.append("\r\n");
    }

    @Override
    public void write(byte[] buffer, int offset, int length) throws IOException {
        if (headerBytes != null) {
            out.write(headerBytes);
            headerBytes = null;
        }
        out.write(buffer, offset, length);
    }

    @Override
    public void close() throws IOException {
        flush();
        if (footerBytes != null) {
            out.write(footerBytes);
            footerBytes = null;
        }
        super.close();
        int code = con.getResponseCode();
        onHandleResult(code);
    }

    protected void onHandleResult(int code) throws IOException {
        if (code != 200 && code != 201)
            throw new IOException("Upload error code: " + code);
    }
}
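For completeness, a hedged usage sketch for the class above (the URL and form field name are placeholders, not part of the original answer; the constructor is package-private, so this assumes the caller lives in the same package):

// Illustrative only: stream a file through FormDataWriter to a placeholder endpoint.
void uploadVideo(File video) throws IOException {
    HttpURLConnection con = (HttpURLConnection) new URL("https://example.com/upload").openConnection();
    con.setRequestMethod("POST");
    try (FileInputStream in = new FileInputStream(video);
         FormDataWriter writer = new FormDataWriter(con, "file", video.getName(), video.length())) {
        byte[] buf = new byte[0x4000];
        int n;
        while ((n = in.read(buf)) != -1) {
            writer.write(buf, 0, n); // the multipart header is written lazily on the first write
        }
    } // close() appends the multipart footer and checks the HTTP response code
}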
I guess it failed because of a timeout caused by the large size.
Since a small video uploads successfully, my suggestion is:
Split the big file into several small parts.
Upload them one by one, or several together, depending on the network conditions.
Join all the parts on the server (after all of them have uploaded successfully).
Because each part is small, re-uploading a failed part will be easy.
Just a theory.
This site may help.
Added 08.01.2013
It has been a while; I don't know if you still need this. Anyway, I wrote some simple code implementing the theory above, mainly out of interest.
Split the big file into several small parts: read the big file chunk by chunk.
// fc is a FileChannel opened on the big file; parts is a queue (put/take) of pending parts.
ByteBuffer bb = ByteBuffer.allocate(partSize);
int bytesRead = fc.read(bb);
if (bytesRead == -1) {
    break;
}
byte[] bytes = bb.array();
parts.put(new Part(createFileName(fileName, i), bytes));
Upload one by one or several together based on the condition of network.
Part part = parts.take();
if (part == Part.NULL) {
    parts.add(Part.NULL); // notify others to stop.
    break;
} else {
    uploader.upload(part);
}
Join all the parts on the server (after all of them have uploaded successfully).
Because it is done via HTTP, the server side can be in any language, such as Java, PHP, Python, etc. Here is a Java example.
...
try (FileOutputStream dest = new FileOutputStream(destFile, true)) {
    FileChannel dc = dest.getChannel(); // the final big file.
    for (long i = 0; i < count; i++) {
        File partFile = new File(destFileName + "." + i); // every small part.
        if (!partFile.exists()) {
            break;
        }
        try (FileInputStream part = new FileInputStream(partFile)) {
            FileChannel pc = part.getChannel();
            pc.transferTo(0, pc.size(), dc); // combine.
        }
        partFile.delete();
    }
    statusCode = OK; // set ok at last.
} catch (Exception e) {
    log.error("combine failed.", e);
}
I put all the code on GitHub and made an Android example too.
Please have a look if you still need it.
private HttpsURLConnection conn = null; // initialize via (HttpsURLConnection) url.openConnection() before calling the setters below

conn.setDoInput(true);
conn.setDoOutput(true);
conn.setUseCaches(false);
conn.setChunkedStreamingMode(1024);

Android/PHP - Encryption and Decryption

I'm struggling with code from this page: http://www.androidsnippets.com/encrypt-decrypt-between-android-and-php
I want to send data from a server to an Android application and vice versa, but it should be sent as an encrypted string. I manage to encrypt and decrypt the string in PHP, but on Android the application crashes with the following error message when decrypting:
java.lang.Exception: [decrypt] unable to parse ' as integer.
This error occurs here, in the for-loop:
public static byte[] hexToBytes(String str) {
    if (str == null) {
        return null;
    } else if (str.length() < 2) {
        return null;
    } else {
        int len = str.length() / 2;
        byte[] buffer = new byte[len];
        for (int i = 0; i < len; i++) {
            buffer[i] = (byte) Integer.parseInt(str.substring(i * 2, i * 2 + 2), 16);
        }
        System.out.println("Buffer: " + buffer);
        return buffer;
    }
}
This is by the way the string that should be decrypted: f46d86e65fe31ed46920b20255dd8ea6
You're talking about encrypting and decrypting, but you're showing code which simply converts between numeric bytes (such as 0x4F) and hex strings (such as "4F") -- which may be relevant to how you transfer the data (if you cannot transfer binary), but it is completely unrelated to encryption/decryption.
Since the Android code you have contains only a single Integer parse, have you examined the input you're giving it? str.substring(i*2, i*2+2) apparently contains data other than [0-9a-fA-F] when the exception occurs. You should start by examining the string you've received and comparing it to what you sent, to make sure they agree and only contain hexadecimal characters.
Edit -- passing the string "f46d86e65fe31ed46920b20255dd8ea6" through your hexToBytes() function works flawlessly. Your input is probably not what you think it is.
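As an illustrative sanity check (not part of the original answer), you could log the received string before decoding and confirm it really is pure hex; the log tag and the response variable below are placeholders:

// Hypothetical check: confirm the server response is pure hex before calling hexToBytes().
String received = response.trim(); // `response` stands in for whatever string came back from PHP
Log.d("HexCheck", "length=" + received.length() + " value=[" + received + "]");
if (received.isEmpty() || !received.matches("[0-9a-fA-F]+")) {
    Log.e("HexCheck", "Not pure hex - check for whitespace, quotes or HTML in the response");
}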
