I developed an Android app with libpd ([adc~] -> [*~ 0.5] -> [dac~]). The app works fine: I hear the mic input in my earpiece.
My questions are:
How can I capture the data from [adc~] into a buffer array?
I want to send this buffer over the network to another device and feed it into [dac~] there.
How can I load the buffer array into [dac~]?
This has to happen in (near) real time, so writing to disk with [writesf~] and reading back with [readsf~] won't do.
Well, a buffer in Pd is called a [table].
The first thing you need to do is instantiate a named table with a given size.
E.g. the following will create a table named "foo" of 44100 samples length (1 sec if you are running at 44.1 kHz):
[table foo 44100]
You can write signals into that table with [tabwrite~] (which will start writing whenever it receives a [bang()
[adc~ 1]
|
| [bang(
| /
|/
[tabwrite~ foo]
And to read a signal from a table, use... [tabread~], or [tabplay~], or [tabread4~], or [tabosc~], or...
[bang(
|
[tabplay~ foo]
|
[dac~]
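Since you are using libpd, you can shuttle the table contents between Pd and your Java code. Below is a minimal Java sketch, assuming the libpd bindings' PdBase.readArray()/writeArray() (which copy between a named Pd table and a float[]) and a [table foo 44100] in the patch on both devices; the network transport and a [receive play] wired to bang [tabplay~ foo] are my assumptions, not part of your patch:

import org.puredata.core.PdBase;

public class TableShuttle {
    private static final int SIZE = 44100;
    private final float[] buffer = new float[SIZE];

    // Sender: after [tabwrite~ foo] has finished, pull the samples out.
    public float[] captureFromTable() {
        PdBase.readArray(buffer, 0, "foo", 0, SIZE); // table -> float[]
        return buffer; // serialize and send over the network (not shown)
    }

    // Receiver: push the received samples back into the table, then bang
    // [tabplay~ foo] so they come out of [dac~].
    public void loadIntoTable(float[] received) {
        PdBase.writeArray("foo", 0, received, 0, received.length); // float[] -> table
        PdBase.sendBang("play"); // assumes [receive play] -> [tabplay~ foo] in the patch
    }
}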
I'm trying to return a byte array from an Android native service (C++) to an Android application using AIDL. I build the byte array from two files and try to split the final array on the client side using the length of the byte array of one file. E.g.: resultFinal = length of privKey + privKey + pubKey
#include <cstdint>
#include <fstream>
#include <iterator>
#include <vector>

std::ifstream _privKey("/etc/myPrivkey", std::ios::in | std::ios::binary);
std::vector<uint8_t> _privKeyContents((std::istreambuf_iterator<char>(_privKey)), std::istreambuf_iterator<char>());
std::ifstream _pubKey("/etc/myPubkey", std::ios::in | std::ios::binary);
std::vector<uint8_t> _pubKeyContents((std::istreambuf_iterator<char>(_pubKey)), std::istreambuf_iterator<char>());
std::vector<uint8_t> resultFinal;
uint8_t keysize = _privKeyContents.size(); // verified: keysize is 161
// Inserting at begin() in this order yields [keysize, privKey..., pubKey...]
resultFinal.insert(resultFinal.begin(), _pubKeyContents.begin(), _pubKeyContents.end());
resultFinal.insert(resultFinal.begin(), _privKeyContents.begin(), _privKeyContents.end());
resultFinal.insert(resultFinal.begin(), keysize);
I assume that on the client side the first element of the byte array will be the size of _privKeyContents, and that using that value I can split the byte array in two. I was expecting the first element of the byte array to be 161, but instead I'm getting -95.
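(Note on that -95: Java's byte type is signed, so 161, which is 0xA1, doesn't fit in a signed byte and reads back as 161 - 256 = -95. A minimal, self-contained Java illustration; masking with & 0xFF recovers the unsigned value:

public class ByteDemo {
    public static void main(String[] args) {
        byte first = (byte) 161;          // the length byte as the client sees it
        System.out.println(first);        // prints -95 (sign extension)
        System.out.println(first & 0xFF); // prints 161 -- usable as the split length
    }
})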
Can someone help me identify the issue, or is my approach wrong? Please let me know if any other input is needed from my end.
Thanks in advance.
PS: I don't have much experience with C++.
I'm having trouble with Android NFC transactions against a chip's FIFO buffer. There are two sides: an Android app (A) and another device (B). B powers the chip and writes data into the chip's FIFO buffer from C code. The FIFO does not retain data across a power loss, and once the chip has sent out all its data, the FIFO is cleared.
The flow is: A moves close to the chip and sends an APDU command; the chip receives the command and raises a signal. B detects that signal, grabs the command, takes the command's first byte (fb), and writes [fb + data + 9000] into the chip's FIFO. Finally, the chip itself handles sending the data back to A; we have no control over how it does that.
The problem: when [fb + data + 9000] is at most 15 bytes (meaning data is only 12 bytes), A receives [fb + data + 9000] from the chip. But when [fb + data + 9000] is more than 15 bytes, A throws a TagLostException.
The chip uses the ISO 14443-4 protocol.
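(For reference, this is roughly how A obtains the IsoDep handle before the transceive code below; a standard sketch, assuming the tag arrives via an NFC discovery intent in an Activity:)

// Sketch: pull the Tag out of the NFC discovery intent and wrap it in
// IsoDep for ISO 14443-4 APDU exchange.
@Override
protected void onNewIntent(Intent intent) {
    super.onNewIntent(intent);
    Tag tag = intent.getParcelableExtra(NfcAdapter.EXTRA_TAG);
    IsoDep isoDep = IsoDep.get(tag); // null if the tag doesn't support ISO-DEP
    // ... hand isoDep to the transceive code below
}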
The command was posted as a screenshot (not preserved here). The transceive code:
try {
    isoDep.close();
    isoDep.connect();
} catch (IOException e) {
    errorfound = true;
    onMessageReceived.onError(e);
}
if (!errorfound) {
    if (isoDep.isConnected()) {
        try {
            isoDep.setTimeout(1200);
            response = isoDep.transceive(newtest1_apdu);
            // The status word (SW1 SW2) is in the last two bytes of the response
            int status = ((0xff & response[response.length - 2]) << 8) | (0xff & response[response.length - 1]);
            if (status != 0x9000) {
                log.error("retrieve data, read failure");
            } else {
                log.info("retrieve data, result=" + numeralParse.toReversedHex(response));
            }
            onMessageReceived.onMessage(response);
        } catch (TagLostException e) {
            log.info("catch tag lost exception, tag isConnected=" + isoDep.isConnected());
            onMessageReceived.onError(e);
        } catch (IOException e) {
            log.info("catch IOException, isoDep isConnected=" + isoDep.isConnected());
            onMessageReceived.onError(e);
        }
    } else {
        log.error("isoDep not connected");
    }
}
Android app (A) has tried a variety of commands, including one format posted as a screenshot (not preserved here).
The other side (B) only reads the first byte of the command and writes [fb + data + 9000] to the chip's FIFO. It isn't a timeout issue: besides setTimeout(1200), we also tried setTimeout(5000) and no timeout at all. Also, A and B did not assign any specific meaning to the APDU commands. Also, with a different APDU command, A works fine reading a public transportation card (that may read from a block area, whereas here we work with the FIFO, so the access paths differ). Also, the chip configuration is the factory default. Also, when tested with another card reader, the chip sends its data out successfully.
I searched Google, Bing, Baidu, the Android issue tracker, Stack Overflow and so on, but couldn't find an answer. This problem has really bothered us. Apologies for my poor English. Please help, and thank you very much.
(The chip is an FM11NC08.)
New progress: we found that if we give up on APDU commands, when A sends 1 byte, A can receive at most 16 bytes; when A sends 2 bytes, at most 15 bytes; and when A sends 15 bytes, at most 2 bytes (note that in every case the command plus the response add up to 17 bytes). The chip's FIFO has 32 bytes of space. After B receives A's data, B clears the FIFO and then writes its data into it.
Thanks in advance.
Today B changed the chip's communication rate (from 1M to 2M) and part of the code, and now A works fine with the chip! So we found the communication rate has an impact on NFC communication. If you run into the same trouble with NFC communication, you might try our approach.
Thanks to everyone who considered this problem during the unsolved days.
I have a PCM data file that I know is valid. I can play it, edit it into pieces, etc., and it will always play, as will the individual pieces.
But when I try to translate it from bytes into shorts with
bytes[i] | (bytes[i+1] << 8)
the result doesn't look right. The file is 16-bit, single channel, 44100 Hz. I don't see anything that looks like a waveform when I plot it.
As a test I recorded mostly silence with one very loud sound in the middle. Still, the chart I made from my input looked like every other chart from these attempts. Am I somehow doing this wrong? Or misunderstanding what I'm reading/attempting?
All I'm looking to do is detect a very low threshold to find a word gap.
Thanks
My psychic powers suggest this is a big-endian vs little-endian thing.
If the source file stores samples in big-endian, this is likely what you want:
(bytes[i] << 8) | (bytes[i+1])
For what it's worth, WAV files are little-endian.
Other possibilities include:
I don't see your code, but maybe you are only incrementing i by 1 instead of 2 on every loop iteration (a common mistake I've made in my own code).
Signed types or casting. Be explicit about how you do the bit operations with respect to signed vs. unsigned. I'm not sure whether "bytes" is an array of "unsigned char" or "char", nor whether "char" defaults to signed or unsigned on your platform. This might be better:
unsigned char b1 = (unsigned char)(bytes[i]);
unsigned char b2 = (unsigned char)(bytes[i+1]);
short sample = (short)((b1 << 8) | (b2));
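Since the goal is word-gap detection and this looks like Android, here is a minimal Java sketch of the little-endian case (the byte order WAV/PCM normally uses). Everything in it is illustrative: bytes is assumed to hold the raw PCM, and THRESHOLD is a made-up value. Note that Java's byte is signed, so the low byte must be masked with & 0xFF, and i steps by 2:

public class GapScan {
    static final int THRESHOLD = 500; // hypothetical amplitude floor

    public static void scan(byte[] bytes) {
        for (int i = 0; i + 1 < bytes.length; i += 2) { // 2 bytes per sample
            // Mask the low byte so sign extension can't clobber the high byte.
            short sample = (short) ((bytes[i] & 0xFF) | (bytes[i + 1] << 8));
            boolean quiet = Math.abs(sample) < THRESHOLD;
            // ... accumulate runs of quiet samples to locate word gaps
        }
    }
}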
I'm using the Android NDK MediaCodec API to decode an MP4 stream of varying resolutions.
I am using the built-in functionality to render to a surface, calling AMediaCodec_releaseOutputBuffer with the render flag set to true.
I have found that every time the resolution changes, several frames of the old resolution are output on a surface the size of the new resolution. It might look something like this for a step up in resolution:
+------------------+-------------+
| frame of old res | |
| displayed too | |
| small | |
+------------------+ |
| |
| size of new resolution |
+--------------------------------+
After a few frames, the video looks normal again. I suspect it's something to do with the surface changing in size when an input buffer is queued with a new resolution, rather than when the first output buffer of that resolution is dequeued.
Has anyone encountered this issue before?
Thanks for your help.
I found the following in the MediaCodec documentation:
For some video formats it is also possible to change the picture size mid-stream. To do this for H.264, the new Sequence Parameter Set (SPS) and Picture Parameter Set (PPS) values must be packaged together with an Instantaneous Decoder Refresh (IDR) frame in a single buffer, which then can be enqueued as a regular input buffer. The client will receive an INFO_OUTPUT_FORMAT_CHANGED return value from dequeueOutputBuffer() or onOutputBufferAvailable() just after the picture-size change takes place and before any frames with the new size have been returned.
Whenever I receive a new SPS and PPS I make sure not to pass them to the decoder straight away, but rather to prepend them to the next I-Frame.
This solved the problem - resolution changes are seamless :)
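In case it helps anyone, a minimal sketch of that buffering idea. The question uses the NDK, but the same logic is shorter to show with the Java MediaCodec API; the NAL classification flags (isParamSet, isIdr) and the feed() entry point are hypothetical names for whatever your demuxer provides:

import java.nio.ByteBuffer;
import android.media.MediaCodec;

// Sketch: hold back SPS/PPS and prepend them to the next IDR frame so the
// decoder sees [SPS+PPS+IDR] in a single input buffer.
public class ParamSetStitcher {
    private byte[] pendingParamSets; // SPS/PPS waiting for their IDR

    public void feed(MediaCodec codec, byte[] nalUnit, boolean isParamSet,
                     boolean isIdr, long ptsUs) {
        if (isParamSet) {
            pendingParamSets = (pendingParamSets == null)
                    ? nalUnit : concat(pendingParamSets, nalUnit);
            return; // don't queue parameter sets on their own
        }
        byte[] payload = nalUnit;
        if (isIdr && pendingParamSets != null) {
            payload = concat(pendingParamSets, nalUnit);
            pendingParamSets = null;
        }
        int index = codec.dequeueInputBuffer(10_000); // 10 ms timeout
        if (index >= 0) {
            ByteBuffer in = codec.getInputBuffer(index);
            in.put(payload);
            codec.queueInputBuffer(index, 0, payload.length, ptsUs, 0);
        }
    }

    private static byte[] concat(byte[] a, byte[] b) {
        byte[] out = new byte[a.length + b.length];
        System.arraycopy(a, 0, out, 0, a.length);
        System.arraycopy(b, 0, out, a.length, b.length);
        return out;
    }
}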
Does this particular decoder indicate support for MediaCodecInfo.CodecCapabilities.FEATURE_AdaptivePlayback?
I haven't encountered this issue myself (since I haven't tried decoding streams that change resolution), but as far as I know, this feature flag was introduced to signal proper handling for this use case. If you see this behaviour on a decoder with this feature flag, I would consider it a bug (or I haven't understood the actual implications of this feature flag properly).
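For what it's worth, a small sketch of how one might check that flag for an H.264 decoder (standard MediaCodecList/MediaCodecInfo calls, API 21+):

import android.media.MediaCodecInfo;
import android.media.MediaCodecList;

// Sketch: does the first H.264 decoder advertise adaptive playback?
static boolean avcDecoderSupportsAdaptive() {
    for (MediaCodecInfo info :
            new MediaCodecList(MediaCodecList.REGULAR_CODECS).getCodecInfos()) {
        if (info.isEncoder()) continue;
        for (String type : info.getSupportedTypes()) {
            if (type.equalsIgnoreCase("video/avc")) {
                return info.getCapabilitiesForType(type).isFeatureSupported(
                        MediaCodecInfo.CodecCapabilities.FEATURE_AdaptivePlayback);
            }
        }
    }
    return false;
}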
Hello.
I'm developing an application that transfers data over Bluetooth (with a flight recorder device). When I'm receiving a lot of data (3000-40000 lines of text, depending on the file size), my application seems to stop receiving. I receive the data with InputStream.read(buffer). For example: I send a command to the flight recorder, it starts sending me a file (line by line), on my phone I receive 120 lines and then the app gets stuck.
Interestingly, on my HTC Desire the app only gets stuck sometimes; on the Samsung Galaxy S the application gets stuck every single time I try to receive more than 50 lines.
The code is based on the BluetoothChat example. This is the part of the code where I'm listening on the BluetoothSocket:
byte[] buffer = new byte[1024];
int bytes = 0;
while (true) {
    bytes = mmInStream.read(buffer);
    readMessage = new String(buffer, 0, bytes);
    Log.e("read", readMessage);
    String read2 = readMessage;
    // Searching for the end of line to count the lines
    // (the star marks the start of the checksum)
    int currentHits = read2.replaceAll("[^*]", "").length();
    nmbrOfTransferedFligts += currentHits;
    // ... parsing and saving the received data
I must say that I am running this in a while(true) loop, in a Thread that is implemented in an Android Service. The app seems to get stuck at "bytes = mmInStream.read(buffer);".
I have tried doing this with a BufferedReader, but with no success.
Thanks.
The app seems to get stuck at "bytes = mmInStream.read(buffer);"
But that is normal behavior: InputStream.read(byte[]) blocks when there is no more data available.
This suggests to me that the problem is on the other end or in the communication between the devices. Is it possible that you have a communication problem (which plays out a bit differently on the Galaxy vs. the Desire) that is preventing more data from being received?
Also, I would suggest that you wrap a try/catch around the read statement to be sure that you catch any possible IOExceptions. Though I guess you would have seen them in logcat if that were happening.
Speaking of logcat, I would suggest that you look at the logcat statements that Android itself is generating. I find that it generates a lot for Bluetooth, and this might help you figure out whether there really is any more data to be read().
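To illustrate those last two suggestions together, a minimal sketch of the read loop with the try/catch in place and an explicit end-of-stream check (mmInStream as in your code; the line assembly is only hinted at, since read() can return partial lines):

byte[] buffer = new byte[1024];
StringBuilder pending = new StringBuilder();
try {
    while (true) {
        int bytes = mmInStream.read(buffer); // blocks until data arrives or EOF
        if (bytes == -1) {
            Log.e("read", "stream closed by the remote device");
            break;
        }
        pending.append(new String(buffer, 0, bytes));
        // ... pull complete lines (up to the '*' checksum marker) out of
        // 'pending' and parse them; keep any partial line for the next read
    }
} catch (IOException e) {
    Log.e("read", "read failed", e);
}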